This post was originally featured on Medium and Gizmodo.
Technology Review ran an article this past week about a new app called PlaceAvoider that is being developed by computer scientists working at Indiana University. In a nutshell, PlaceAvoider allows the user of a first-person lifelogging device to essentially blacklist locations from visual storage and sharing by matching the visual signature of a location against images of locations the user has designated as off-limits, such as a bedroom, bathroom or boardroom. The app flags such images for manual review before they are passed on to associated apps, such as photo storage or sharing. The idea, say PlaceAvoider’s creators, is to both help device owners protect their own privacy and thwart visual malware, such as trojans that may be looking for precisely this material.
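To make the idea concrete, here is a minimal sketch of the kind of scene-matching PlaceAvoider describes: compare a new image's visual signature against signatures of blacklisted rooms, and flag near-matches for manual review. This toy version uses a simple average-hash over grayscale pixels; the real system relies on far more robust local-feature classifiers, and every name below is hypothetical.

```python
def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) to a bit tuple:
    1 where a pixel is brighter than the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def flag_for_review(image, blacklist_hashes, threshold=3):
    """Return True if the image's signature is close to any blacklisted one,
    meaning it should be held back for manual review before sharing."""
    h = average_hash(image)
    return any(hamming(h, b) <= threshold for b in blacklist_hashes)

# Hypothetical usage: the user has blacklisted their bedroom. A new shot of
# the bedroom is flagged; an unrelated street scene passes through.
bedroom = [[200, 190, 30, 20], [210, 185, 25, 15],
           [205, 195, 35, 10], [198, 188, 28, 22]]
street  = [[10, 240, 10, 240], [240, 10, 240, 10],
           [10, 240, 10, 240], [240, 10, 240, 10]]
blacklist = [average_hash(bedroom)]

print(flag_for_review(bedroom, blacklist))  # True
print(flag_for_review(street, blacklist))   # False
```

The interesting design point is the threshold: set it loose and the filter quarantines too much of your life for review; set it tight and a slightly different camera angle on the same room slips through.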
This got me thinking. The street, so the quote goes, finds its own uses for things. What if we are the malware in a different scenario? In other words, such apps, once they exist, can and will be used at some point as a form of “diminished reality” app, where the user is not the administrator.
We know that applications such as Google Maps, Google Earth and StreetView already acquiesce to regulations that require the obscuration of government installations, private companies’ facilities in some cases, some brands, and private citizens’ faces and number plates—even as they work hard to decipher items like house numbers. In other words, technology is used to differentiate what we can and cannot see, depending on the legal or ethical (or otherwise) standards of a particular place.
For the most part, Google Maps, Google Earth and StreetView are forms of augmented reality: they digitally render reality with forms of markup, of contextual data, which add to our perception of places. The exceptions are blur, pixelation and, it could be argued, the accidental presentation of various kinds of render ghosts—people and things only partly captured or partly presented, artifacts of digital accident or persistent memory. In those cases, some kind of determination has been made that there are things present in reality that we can’t or shouldn’t see.
Given this creeping designation of digital artifacts of the real world as protected, viewable only to those of particular value or privilege, it seems reasonable to expect that at some point in the near future devices such as Glass, or other heads-up displays, such as digital windscreens, will contain apps that prevent us from viewing, getting data on, and/or capturing images of people and places in public. We will, in effect, be wearing a form of subjective map on our faces, or driving through selectively rendered environments. We will be using increasingly sophisticated forms of diminished reality technology.
Black sites for the digital world, both opt-in and enforced.
Such applications could be voluntary—preset based on your needs or selectively filtered. Are there people or locations you don’t want to see? The homeless, protesters, or an ex-boyfriend? Consider them gone. Don’t want to notice the cupcake bakery, or that chip shop on the corner? Not to worry. Sick of the billboards on the highway? Blocked.
Or they might be imposed upon you. Not of sufficient wealth, ambition or sophistication to look upon the rich as they stroll by, or watch the upwardly mobile board their special buses to work? Gaze redirected. Didn’t pay to enjoy the vista that a new museum or waterfront development presents? Speak “yes” to donate and unlock this beautiful view. Not at a sufficient pay grade to look into offices on the window side of the 15th floor? It’s all a haze. Have a record of illegal peeping? Those kids on the playground are now pixels.
All of these could be achieved with technical approaches similar to those employed in PlaceAvoider now, or those used in current retail facial recognition software.
In an increasingly digitally mediated world, the map is the terrain. The map, its content, and its parameters of use constitute an instruction set. Functional constraints set by the location and content of a place can become a type of physical DRM, with your mediated view locked out where you lack adequate “rights.” You may find yourself one day having to say yes to an End User License Agreement (EULA) as you walk outside your own home, or boot the self-driving car to go to work. You may subscribe (or be subscribed) to seeing as a service. Good luck out there.