David King – THATCamp British Library Labs
http://britishlibrarylabs2015.thatcamp.org
13 February 2015, British Library Conference Centre

Getting experienced
http://britishlibrarylabs2015.thatcamp.org/2015/02/05/getting-experienced/
Thu, 05 Feb 2015 14:20:56 +0000

Digitised literature gives us the opportunity to explore how people experienced the past. But if extracting tangible things from texts, such as names and places, can be tricky, consider the challenge of extracting something intangible, such as an experience.

If the author has been kind to us, we can set our computer to look for keywords in the text such as ‘read’ and ‘listen’. We can use those words as cues to locate the description of an experience. For example, an officer in the Western Front trenches might describe the solace he finds when he reads Jane Austen; but what if his diary simply states that he grabbed Austen from his pack for some solace? How can we program a computer to extract that as an experience from the text, regardless of how the sentiment is expressed?
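The keyword-cue approach described above can be sketched in a few lines of Python. This is a minimal illustration, not a proposed tool: the cue-word list, the naive sentence splitting, and the sample diary text are all assumptions invented for the example. As the note in the code shows, it also reproduces the limitation discussed above: it misses the sentence where the experience is implied rather than stated.

```python
import re

# Hypothetical cue words that often signal a reading or listening experience.
CUE_WORDS = {"read", "reads", "reading", "listen", "listens", "listening", "heard"}

def candidate_experiences(text):
    """Return sentences containing a cue word, as candidates for close reading."""
    # Naive sentence split on ., ! or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    candidates = []
    for sentence in sentences:
        words = {w.lower() for w in re.findall(r"[A-Za-z']+", sentence)}
        if words & CUE_WORDS:
            candidates.append(sentence)
    return candidates

# Invented sample text in the spirit of the trench-diary example.
diary = ("The shelling eased at dusk. "
         "I read Jane Austen for an hour and found some solace. "
         "Later I grabbed Austen from my pack again.")
print(candidate_experiences(diary))
# The cue match finds the second sentence but misses the third, where the
# experience is implied ("grabbed Austen ... again") rather than stated.
```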

In this session, we would like to explore the challenge of extracting experience, in whatever form, from digital texts. Can we afford to have our valuable humanists wading through reams of data before they can get to grips with the real purpose of their study, or can we get the computer to tackle the issue of finding candidate experiences that merit close reading?

While our own immediate interests lie in historic literature, we believe the underlying challenge is equally applicable to extracting experiences from other digital sources up to and including contemporary social media.

The primary aim of the session is to discuss the challenge and, from that, to establish some general principles for automating the identification of ‘experience’ in digital texts. We can initiate the discussion by drawing on our own work with reading experience (www.open.ac.uk/Arts/reading) and listening experience (www.open.ac.uk/Arts/LED).

If we make good progress with the discussion, and if time permits, we can extend the session to try out some of the ideas and any tools that may already exist. We will bring along some of our data and a suitably powerful laptop to start the work off, but we would love to see more and varied data, and any existing tools, to help move this challenge forward.
