An introduction to digital tools involves reading about and experimenting with text mining, mapping, and network visualization. These tools invite the researcher to study texts differently than as discrete works closely, or singly, associated with their authors. Since the “literary turn” in humanities scholarship of the late twentieth century, arguably beginning with the work of Edward Said and Jacques Derrida, to study a text has meant a close focus on its content for meanings, intended and unintended. Texts were placed in context, but that contextualization was inferential, or conjectural, based on the scholar’s deep reading of other important texts contemporaneous with or preceding the work under scrutiny.
In breaking with its methodological predecessor, digital scholarship offers both perils and promise. Text mining with Voyant reveals hidden emphases and word patterns in texts. Mapping with kepler.gl visualizes texts’ physical locations over time. Network visualization with Palladio reveals patterns of relationships in large sets of textual data. Together, these three platforms beckon the researcher to think about texts as interdependent, not independent, humanistic productions, as Scott Weingart argued in his 2011 blog post “Demystifying Networks.”
For each of these tools we used the same data set, the WPA slave narratives, to uncover different types of connections within the interviews of former slaves. Voyant enabled text mining, which reveals word frequency and word contextualization. I studied derivatives of the word “master,” and the frequency of the words “slavery” and “freedom,” across different former slave states. This allowed me to make some conjectures about whether upper and lower southern states differed in how explicitly former slaves discussed their passage from property to person, and to note variances of pronunciation – while not accounting for possible over-editing by WPA interviewers or editors.
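Voyant does this kind of counting through its web interface, but the underlying idea is simple enough to sketch. The snippet below counts “master” and a few dialect spellings in two short invented excerpts; the excerpts, the list of variant spellings, and the state labels are all illustrative assumptions, not data from the actual corpus.

```python
import re
from collections import Counter

# A few dialect spellings of "master" that appear in WPA transcriptions;
# this list is illustrative, not exhaustive.
MASTER_PATTERN = re.compile(r"\b(master|marster|marse|massa)\b", re.IGNORECASE)

def count_master_terms(text):
    """Count occurrences of 'master' and its dialect variants in a text."""
    return Counter(match.lower() for match in MASTER_PATTERN.findall(text))

# Hypothetical excerpts standing in for two state volumes of the corpus.
alabama = "My massa was good to us, but old marster sold my brother."
virginia = "The master kept us working until freedom come."

print(count_master_terms(alabama))   # variant spellings dominate
print(count_master_terms(virginia))  # the standard spelling appears
```

Comparing such tallies across state volumes is, in miniature, the kind of upper-South versus lower-South comparison described above, with the same caveat that spellings may reflect the interviewer’s editing as much as the speaker’s pronunciation.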
Through kepler.gl, I was able to map where the slave narrative interviews occurred and to place those locations in relation to where the interview subjects, the former slaves, had been enslaved. I found that a surprising number of former slaves interviewed in Alabama had been enslaved elsewhere. This raised questions, of course, about why they had migrated and what effects migration had on their memory, particularly in light, again, of its refraction through the transcription skills and decisions of the WPA interviewers.
(In retrospect, a nice way to integrate Voyant and kepler.gl would be to map usages of derivatives of the word “master” among Alabama interview subjects, and to correlate those usages with their places of enslavement.)
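That integration would come down to preparing a table kepler.gl can ingest: the tool detects CSV columns named latitude and longitude on upload. The sketch below builds such a table from two hypothetical rows; the place names, coordinates, and term counts are placeholders, not values from the actual project.

```python
import csv
import io

# Hypothetical rows pairing an interview location with a place of
# enslavement, carrying a "master"-variant count from the text-mining
# step. Coordinates are approximate city centers, for illustration only.
rows = [
    {"name": "Interview: Mobile, AL", "latitude": 30.6954,
     "longitude": -88.0399, "master_terms": 4},
    {"name": "Enslaved: Richmond, VA", "latitude": 37.5407,
     "longitude": -77.4360, "master_terms": 4},
]

# kepler.gl auto-detects columns named "latitude"/"longitude" in a CSV.
buffer = io.StringIO()
writer = csv.DictWriter(
    buffer, fieldnames=["name", "latitude", "longitude", "master_terms"])
writer.writeheader()
writer.writerows(rows)
csv_text = buffer.getvalue()
print(csv_text)
```

Uploading a file like this would let the map color or size points by the term counts, linking the Voyant findings to geography.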
Finally, through Palladio, I created various networks of relationships between the former slaves interviewed in Alabama for the WPA slave narratives collection and data about their pasts: gender, age, and the persons interviewing them. That I used Palladio to study the oral testimony of people – black people born as slaves, practically the opposite of those for whom it was originally designed, highbrow Europeans and Anglo-Americans engaged in the discourse of the Enlightenment – testifies to the breadth of its accessibility.
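Underneath its visualizations, Palladio builds networks from simple two-column tables of related entities. The sketch below groups hypothetical interviewees by interviewer, the same bipartite structure one would upload to Palladio; all the names are placeholders, not people from the collection.

```python
from collections import defaultdict

# Hypothetical interviewee-interviewer pairs; a two-column table like
# this is the kind of input Palladio renders as a bipartite network.
pairs = [
    ("Interviewee A", "Interviewer X"),
    ("Interviewee B", "Interviewer X"),
    ("Interviewee C", "Interviewer Y"),
]

# Group interviewees by interviewer to see each interviewer's "reach" --
# the pattern a network graph makes visible at a glance.
network = defaultdict(list)
for interviewee, interviewer in pairs:
    network[interviewer].append(interviewee)

for interviewer, subjects in network.items():
    print(interviewer, "->", subjects)
```

Swapping the second column for gender, age bracket, or place of enslavement yields the other networks described above from the same data.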
Aside from my bafflement over how to write the code needed to make my kepler.gl map display properly in my blog, I found Palladio a bit more intimidating than the other two platforms, if only in interpreting the relationships among the data sets it displayed. That is a reminder of the peril of the digital turn in humanities scholarship. All three of these technologies should be seen as tools, not outcomes, of the new humanist scholar’s inquiry into what texts mean: the power of a visual display of a network of texts, whether of Enlightenment intellectuals or humble Americans who survived slavery, is limited by what the user can discern from the display. On the other hand, the interactivity these platforms offer is inherently democratic, allowing and encouraging users to manipulate and interpret their own data for their own research questions.