The ever-increasing sophistication of digital technology can be both a blessing and a curse for humanities scholars. It is a blessing, most obviously, because digital archives allow for the preservation and mass distribution of ancient texts, cumbersome works of art, and the like on an unprecedented scale. The curse is that this leaves an overwhelming quantity of information to sift through when one attempts to understand a given aspect of human culture; even with large teams of researchers, traditional techniques simply may not suffice. To that end, a new generation of scholars is developing the tools of cultural analytics, a set of computational and visualization methods capable of making sense of massive flows of data. This is a relatively new field, pioneered by Lev Manovich in 2005 and explicitly recognized two years later. The team behind the Software Studies Initiative sees it not as a continuation of traditional humanities concepts but as an opportunity to question them with “big data.” A successful cultural analytics project has several staples. The first is that it elucidates an interesting or novel argument and, just as important, presents it clearly to the audience. The second is that the project requires little technical knowledge to understand. Lastly, there is something to be said for aesthetics. Below I will provide two examples of cultural analytics projects to show the wrong and right ways to approach this field.
Our first case study is Voyant Tools, a site that attempts to give its audience insights into classic works of literature. I experimented with the site’s page on Jane Austen’s Sense and Sensibility. One’s first impression is rather intimidating: the site brings up an overly technical display of graphs, word frequencies, and the novel itself in one massive scrolling panel. To its credit, the site gives its audience powerful tools for locating and measuring the frequency of specific words, and it does so quickly and easily. Still, there is no coherent message. While the site attempts to distill its algorithms into a list of the most frequent words, these end up being proper nouns, prepositions, and the like. One must search on one’s own in the hope of stumbling upon meaningful patterns (I have found none yet), and one is unlikely to bother, since the site never explains why the relative frequency of words should matter.
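The frequency analysis described above is simple enough to sketch in a few lines of Python. This is not Voyant’s actual implementation, and the stopword list here is a small hand-picked sample purely for illustration, but it shows how filtering out function words (the step whose absence makes Voyant’s top-word lists so uninformative) changes what rises to the top:

```python
import re
from collections import Counter

# A hand-picked sample of function words to filter out.
# This is an illustrative assumption, not Voyant Tools' actual stopword list.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "her", "she",
             "was", "it", "that", "no", "over", "his", "could"}

def top_words(text, n=5, stopwords=STOPWORDS):
    """Count word frequencies in text, skipping common function words."""
    words = re.findall(r"[a-z']+", text.lower())
    filtered = [w for w in words if w not in stopwords]
    return Counter(filtered).most_common(n)

# A short made-up sample in the spirit of the novel, for demonstration only.
sample = ("Elinor saw that it was his hand, and she could doubt no longer. "
          "Marianne saw the letter and wept over it.")
print(top_words(sample, n=3))
```

Without the stopword filter, words like “and,” “the,” and “it” dominate the count, which is exactly the problem with Voyant’s unfiltered top-word display.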
A more worthy example can be found in “Digging in to Global News,” one of the many projects conducted by the Software Studies Initiative. This project analyzed the objects visible on screen in each of the 113 Weekly Addresses that President Obama had delivered through April of 2011. If one were not already intrigued, the site reminds its audience that the media of any U.S. president is historically significant. It references the scholarly work on the implications of Obama’s speeches across a variety of topics, from race relations to gender and class studies. While this site is not as interactive as Voyant Tools, that is arguably a good thing: the work has already been done for the audience. Pictorial montages of the speeches are interspersed with text and graphs to deliver information concisely. A simple scroll through the team’s findings clearly reveals the material culture associated with the presidency: flags, lamps, and bouquets are the most common objects, while mirrors and rugs seldom appear. The team also presents the topics addressed in each speech, showing how the economy is paramount while matters of civil rights are conspicuously muted. The project is concise, well presented, and informative all at the same time.