The urban AR lab was an experimental workspace at Stadttfinden Festival, which took place in September 2017 at Glasfabrik, an old factory building on the periphery of Leipzig.

As a team of four (with Jörn Röder and Lennert Raesch) we developed sensor kits that sent repeated measurements such as carbon dioxide, air humidity, temperature, and ground moisture to smartphones mounted in Virtual Reality (VR) headsets. By altering the mobile device’s camera image, we visualized the invisible and unvisited phenomena of growing cities. We invited Stadttfinden visitors into our open laboratory and took them with us on what we called AR walks, to re-see and reimagine the urban habitat in performative and experimental ways.

people on a walk wearing AR headsets

research questions

  1. If citizens can audiovisually perceive invisible or unvisited urban phenomena such as pollution and electrosmog levels, do they feel more affected by them?


  2. A growing city is as much a place of fast development as of unnoticed change. Values like carbon dioxide, average noise levels, and ground moisture are indicators of urban transformation, so making them accessible can help citizens choose how and where they want to live.
Although cities like Leipzig often publish some rough carbon dioxide measurements on their websites, there is no detailed grid or map that citizens could consult to learn more about their neighborhood, let alone an individual street or property.
In crisis situations, citizens demand shared data that is independent of governments and companies. Right after the earthquake that caused the Fukushima disaster in 2011, scientists and activists founded a group called Safecast and developed a cheap DIY device to measure radiation near the nuclear power plant and upload the data to an open map on the internet.
  3. We decided to set up bundles of sensors, each managed by a small controller called Particle Electron that sends its measurements to an online cloud over the GSM network, using its own mobile phone number. After soldering the sensors to the Electrons, we wrote the firmware and uploaded it to them via USB cable. In the browser we watched the first values coming in.

soldering sensors to the Electrons
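The firmware itself ran on the Electrons, but the data flow can be sketched in a few lines of Python: each kit bundles one round of readings into a single compact event, much like the payload one would hand to a cloud publish call. Kit and field names here are illustrative, not our actual schema.

```python
import json

def make_payload(kit_name, readings):
    """Bundle one round of sensor readings into a compact JSON event.

    `readings` maps a measurement name (e.g. 'co2_ppm') to its value.
    Keeping the payload short matters on a metered GSM connection,
    so we drop all optional whitespace from the encoded string.
    """
    return json.dumps({"kit": kit_name, **readings}, separators=(",", ":"))

# A kit like 'Elegant Ferret' might publish something along these lines:
payload = make_payload("elegant-ferret", {"co2_ppm": 412, "temp_c": 21.5})
```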

  4. Next, we thought about possible visualizations of the incoming information in the AR application we were about to develop. Starting from the camera analysis technology we built in exp_04 - Item Vision and the performance optimizations for smartphones we achieved in exp_06 - Default Safari, we captured the camera images of our smartphones in the game engine Unity. Each sensor kit got a randomly generated name: ‘Elegant Ferret’, ‘Fancy Laser’, ‘Calm Doctor’ and the like. Depending on what they measured, we worked out a visual language of image modifications. For example, ‘Awesome Pizza’ measures and sends values for ultraviolet light, which our application translates into exaggerated colors on the device’s screen. At night, the low amount of ultraviolet light makes orange and red areas of the camera image stand out, while measurements at noon result in glowing greens and blues. Volume distorts the screen; high carbon dioxide creates a pulsing noise in some areas.

sensor kits

  5. To compare changes in the measurements over the festival’s ten days, we decided to record data in different areas of the city district. There were spots that were about to be, or had just been, redeveloped by the city or investors, as well as sites that had been abandoned decades ago and never touched since. During the AR walks we hid sensor kits inside the Glasfabrik, in front of an abandoned loading zone next to the main road, in a private garden, behind a bus stop, and near a public swimming pool. While walking, we had access to all the kits at all times, so we could jump between the input feeds and correlate them.

hiding our sensor bundles

try or catch

At the beginning of the AR walks we introduced visitors to the idea of making invisible data tangible. Most visitors had never used VR headsets before, so we made sure everyone adjusted their headset properly and agreed to stop using it if it made them feel ill. We took wool, stickers, and a mysterious suitcase with us.

We left our laboratory and headed into Glasfabrik’s main hall, where we had already placed sensor kit ‘Fancy Laser’ in a rainy, roofless corner right below an old chimney. It functioned as a sort of weather station, measuring air temperature and humidity, ultraviolet light, ground moisture, and carbon dioxide. The other artists’ and performers’ works at the festival were instantly modified when we put on the VR headsets. The temperature in particular − it was warm and sunny in the late afternoon − had distorted the colors of the factory: even the graffiti-covered walls had given up all their blues to become bright green areas. Outside, we connected to the next sensor kit and started our trip along some old industrial facilities.

impression 1

While walking, we wanted to connect the mousy but important serving area interfaces of internet providers in the nearby streets with red string, to expose the last mile of underground internet cable. Interestingly, there were none close to the Glasfabrik. Of course − since most buildings here were abandoned, why would anyone invest in modern internet infrastructure? After a few hundred meters, we found the first box at a crossing leading into a residential area that looked in better shape. We pinned the beginning of the red string to it using some urban AR lab stickers.

Continuing on our way, while the red string was carried, rolled, thrown, or kicked along the sidewalks, the distance between subsequent boxes became smaller and smaller. Some passing pedestrians and drivers were amused or bewildered by our activity. After two more turns we suddenly ended up in front of a big, gray building with a radio tower in the backyard, enclosed by a metal fence. The mailbox revealed a tiny, washed-out ‘Telekom’, the name of a major mobile phone provider.

the red string is kicked by AR walk participants

We also carried a suitcase that screamed out the names of people’s home networks. Many people leave their device’s WiFi connection activated when on the go, so the device in their pocket keeps calling for its home network, constantly trying to reconnect. The suitcase hears these cries and answers. We explained how operators of public spaces like shopping malls use this technique to map their customers’ movements and analyze their buying behaviour with Big Data algorithms. Of all the things we did on the AR walks, the suitcase was the most perplexing to passersby, probably because it was loud and invasive. Some shook their heads and called us crazy, while others got interested and wanted to know more.
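The mechanism behind the suitcase is the WiFi probe request: devices broadcast the names (SSIDs) of networks they know, and anyone listening can collect them. Actually capturing frames needs monitor-mode tooling, but what happens with the frames once captured can be sketched in plain Python; the MAC addresses and SSIDs below are made up for illustration.

```python
from collections import defaultdict

def networks_per_device(probes):
    """Group probe-request SSIDs by the device (MAC address) that sent them.

    `probes` is a list of (mac, ssid) pairs, as a sniffer might yield.
    The result shows how much a device reveals about its owner − home,
    workplace, favorite cafés − just by walking past with WiFi enabled.
    """
    seen = defaultdict(set)
    for mac, ssid in probes:
        if ssid:  # broadcast probes carry an empty SSID; skip those
            seen[mac].add(ssid)
    return dict(seen)
```

A tracker (or our suitcase) only needs to repeat this at several locations to turn the same MAC address reappearing into a movement profile.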

the screaming suitcase

Even though the images our headsets produced appeared weird, indecipherable, or fantastic to the walkers, they got a sense of when things were changing. Values like carbon dioxide and air temperature varied from street to street, resulting in varying visual outcomes.

impression 1

impression 2

impression 3


Urban transformation was not always happening where we assumed. We followed the internet distribution boxes through a park, expecting low carbon dioxide values. To our confusion, they were higher than in the residential area we had just left. Maybe it was due to the pub, the noisy bouncy castle, or the Telekom building with the big tower just next to the garden allotments around the corner? The AR walks only visualized the symptoms, not the causes.

Still, since it was so simple and straightforward − just strapping on a headset − we gladly adapted to our newly gained senses. Visual cues like the dark noise caused by carbon dioxide emissions were easy for the walkers to recognize, generating discussions and wild guesses about their source. At some point they got so excited about the distortions caused by the volume sensor that they all started to scream, to see how far they could push it together. Even after the walk, some of them stayed to discuss what else should be made visible.

Within this experiment, though on a small scale, making the invisible noticeable worked out surprisingly well. In the future, we plan to develop a sensor kit that is as affordable and painless to use as possible. We want to make customized and as yet unthinkable visions available to all citizens, encouraging them to make their own city a better place.