Image: Flickr / Kyle McDonald

Last year about this time I gave a talk in the Netherlands about the problems of what I called "flatpack futures," or one-size-fits-all visions of the future that attempt to offer a holistic solution to human needs, but do so in a fashion that requires little or no agency from the actors who might inhabit such a future. I intentionally used the term "flatpack" to describe them as an end product, something we are asked to consume, like a takeaway meal or low-cost home goods, with little or no customization or variation. They are "flat" in part because they largely omit or mask the more complicated contexts and implications that they nonetheless embody.

After all, an Ikea chair, to take a literal flatpack product, expresses or connects to little or nothing about the context in which it is produced or into which it is sold. It's a multi-purpose aspirational widget: it offers an acceptable image, partially completes a picture for you, summoning forth a vague sense of Europa, a clean, minimal, efficient apartment somewhere in a cool climate, with you in it. But it also largely masks its own supply chain, environmental costs, design history, and so on. It just is. So too with a flatpack future: clean, beautiful, and context-free, frozen in the moment of frictionless forever.

Is a simplified version of a future ever acceptable then? Yes, but it's helpful to have a frame in which to describe it. For that, a metaphor from the world of technology is useful: lossy. If you've imported music for use with a digital music player, you've probably come across this term. High-quality digital music (or video, or other forms of high-fidelity data) typically comes in huge files. To be as true to reality as is digitally possible, these files contain many more bits of data. But moving these packages of simulated reality from one device to the next means consuming a lot of bandwidth and storage. Lossy compression technology was developed as a means of cutting expendable bits of data out of a file in order to shrink the overall file size and make it easier to transmit and store.
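
For readers who want the idea in concrete terms, here is a minimal sketch of the principle behind lossy compression: reducing the precision of each sample, so the data gets smaller while staying close enough to the original for a human to fill in the rest. The function name and values are illustrative, not from any real codec.

```python
def quantize(samples, bits):
    """Reduce each sample's precision to the given number of bits.

    Some detail is discarded permanently (hence "lossy"), but the
    result stays within a small, predictable error of the original.
    """
    levels = 2 ** bits
    return [round(s * (levels - 1)) / (levels - 1) for s in samples]

# A tiny "signal" stored at high precision, then squeezed to 4 bits
# per sample instead of, say, 16.
signal = [0.0, 0.1234, 0.5678, 0.9, 1.0]
compressed = quantize(signal, 4)

# The per-sample error is bounded: the listener's brain fills in the gap.
errors = [abs(a - b) for a, b in zip(signal, compressed)]
assert max(errors) < 1 / 2 ** 4
```

Real codecs like MP3 are far more sophisticated, discarding detail the ear is least likely to miss, but the trade is the same: throw away bits, keep the gist.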

Lossy compression works in part because of a unique property of the human brain. Even if our ears don't hear the full spectrum of sound, or our eyes don't see every pixel, we are largely able to fill in the blanks. We don't need to know the whole story, or have it painted for us—doing so either overloads us or limits our ability to situate the story in ways we can make sense of.

So, while flatpack futures attempt to deliver a whole world, system or universe embedded in one short vignette, lossy futures—be they artifacts, simple scenarios, wireframes of speculation, rich prompts, brief vignettes or some other material object—give us the scaffolding and ask or allow us to determine the details ourselves. In doing so, they transmit the critical data, the minimum viable future, and give us the opportunity to fill in the gaps we think are important to understanding, or have a dialogue around what these gaps may mean. 

The irony here is that flatpack futures are often high fidelity productions, complex, if flawed, narratives. They are beautiful renderings, but submerge engineering, social, business model, ethical or spiritual problems in favor of presenting a glossy face. Lossy futures are lo-fi, and intentionally omit detail as a feature, not a bug. 

Occasionally after a workshop or other activity I'll get feedback from other practitioners saying "Hey, you missed a step," or "You didn't deliberate on this long enough." That's fair if a particular methodology is being misrepresented, or important facts or steps are being intentionally cloaked. But I find, more and more, that one of the most effective ways to engage a broad audience is to work with these lossy futures, focusing on the core idea, future, or tension, and allowing the detail to be rendered as necessary. Like rapid prototyping in design, lossy futures help ideas move faster, let more hands touch them, and allow a wider range of implications to be explored. It's not a perfect metaphor, but imperfect seems to work.
