New Insights, Old Baggage

Successful engagement with data is often described as ‘unleashing’ it or ‘unlocking’ it. The implication is that power or utility lies latent in data, just waiting for someone or something to let it emerge. Increasingly, that something is AI or machine learning: software built precisely to see things that we humans cannot.

Recently, Google announced that it was using its cloud and machine learning tools to help the New York Times find new insights locked in its archive of some five to seven million physical images. It’s a project that chimes with Google’s mission ‘to organise the world’s information and make it universally accessible and useful.’

[Image: NYT-Pen]
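Concretely, the technology layer in a project like this is image annotation at scale. Below is a minimal sketch, assuming the google-cloud-vision Python client; the Times’ actual pipeline is not public at this level of detail, so the file name and the particular mix of label and text detection are illustrative, not a description of their system.

```python
# A sketch of the kind of annotation pipeline described, assuming the
# google-cloud-vision Python client. The file name is hypothetical; the
# Times' real pipeline is not public at this level of detail.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("archive_photo_001.jpg", "rb") as f:  # a hypothetical scanned print
    image = vision.Image(content=f.read())

# Label detection guesses what the photograph depicts.
labels = client.label_detection(image=image).label_annotations

# Text detection can recover captions or handwritten notes on the back of a
# print, the sort of latent metadata such announcements emphasise.
ocr = client.document_text_detection(image=image).full_text_annotation

for label in labels:
    print(f"{label.description}: {label.score:.2f}")
print(ocr.text or "no legible text found")
```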

The story so far is one of technology and improved search capabilities. But how exactly does the promise of new stories get fulfilled? The focus on technology obscures the continued, deep need for human labour and imagination to tell new stories: the time it takes to scour images, digital or not, and make sense of them.

People are not only the interpreters of images; traditionally, they have also been their filters. What are the ramifications of allowing software to be both the classifier and the filter?
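To make the stakes of that question concrete, here is a hypothetical sketch; the function, labels, scores, and threshold are all invented for illustration, not drawn from any real system. The point is that the classifier’s labels and an arbitrary confidence cutoff jointly decide which images anyone ever sees.

```python
# Hypothetical: software as classifier and filter. Whatever the model
# mislabels, or labels with low confidence, silently vanishes from view.
MIN_CONFIDENCE = 0.75  # choosing this number is itself an editorial judgement

def machine_filter(annotations, wanted_label):
    """Return ids of photos tagged with wanted_label above the cutoff.

    annotations maps photo_id -> [(label, score), ...] from some classifier.
    No human reviews what falls below the line.
    """
    return [
        photo_id
        for photo_id, labels in annotations.items()
        if any(lbl == wanted_label and score >= MIN_CONFIDENCE
               for lbl, score in labels)
    ]

# Toy data: two photos of the same event, only one confidently labelled.
archive = {
    "photo_001": [("protest", 0.91), ("crowd", 0.88)],
    "photo_002": [("crowd", 0.80), ("protest", 0.42)],  # dropped by the cutoff
}
print(machine_filter(archive, "protest"))  # -> ['photo_001']
```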

This is a question asked by Safiya Noble in Algorithms of Oppression. Noble argues that search engines (specifically Google) reinforce racist and negative stereotypes, for example by showing searchers what “professor style” looks like.

[Image: Google image search results for “professor style”]

What each search represents is Google’s algorithmic conceptualizations of a variety of people and ideas. Whether looking for autosuggestions or “answers” to various questions or looking for notions about what is beautiful or what a professor may look like (which does not account for people who look like me who are part of the professoriate — so much for “personalization”), Google’s dominant narratives reflect the kinds of hegemonic frameworks and notions that are often resisted by women and people of color.

(Safiya Noble, ‘Google Has a Striking History of Bias Against Black Girls’)

Work like Noble’s refuses to let technology pass as transparent, value-neutral, or the natural centre of attention. It critically tempers the claimed ease of unlocking data and generating new stories with the unavoidable hard human work of understanding tools, data, and their baggage.