
Over the summer I was asked by WIRED UK to write a short piece of commentary based on my talk from The Next Web earlier in the year about data, machines and storytelling. That piece, for the Idea Lab section, just went live, and was featured in the October print edition. I've spoken on the topic from different angles before.

In the piece, "Will anyone care when a robot wins the Man Booker prize?" I talk about the gradually growing base of content in the world around us, both digital and physical media, that is generated completely or in part by code. From tweets to financial releases to news stories to novels, artwork and comics, a small but important amount of media is slipping into our clickstreams that isn't made by people. And because the growth is dispersed and gradual, relative to overall content production, we aren't noticing it. Could you distinguish a social media message posted by a bot from one created by a person? That distinction is becoming harder to make. And as content providers rely more and more on highly detailed analytics to determine what content we value, this data will inform more non-human content production.

We're also becoming accustomed to the glitchiness of mass-produced content, and more attuned to and forgiving of the quirks and textures of machine-driven communication, as Alexis Lloyd has pointed out. Having conversations with algorithms is a step toward not just tuning but consuming what they produce. Given the pace of development in natural language processing, machine learning, and data analytics, producing content acceptable and entertaining enough for mass consumption is not as hard a task as it may have looked a few years ago.

Beyond this, I'd wager we will grow more interested in a machinic point of view, and become more curious to understand how code interprets the world and reads it back to us. Without going too far into object-oriented ontology, how machines see the world, and play it back to us, tells us a lot about the quality of the instructions we've given them, instructions grounded in our own understanding of ourselves and of what society values. Will they only tell stories that reflect the experiences of their creators, a frequently cited fault with many of the algorithms creeping into our lives, or will they see something we don't yet perceive, particularly as they roam the world autonomously? Time will tell, but as we approach the latter, things will get interesting.
