Motion Vision is a mobile app that captures its surroundings through the device's camera and highlights movement in real time. It can be used with or without an Augmented Reality (AR) headset.
How do non-human animals distinguish movement from stillness, and can humans learn from that?
Modern cities are often described as fast-moving. If we erase everything from the field of view that is not moving, how much of the city remains visible?
We went out and tested Motion Vision in the city of Kassel.
When standing still in the streets, we saw moving cars, trams, bikes, and people passing by. Looking up, we watched the movements of birds. Infrastructure disappeared, as did slow-moving clouds, leaves dancing in the wind, and people or animals standing still or moving slowly. When we moved or turned our heads, these static objects became visible for the duration of our movement.
When we ran the experiment with groups of people, those without headsets moved quickly in order to be seen by those with the augmented view. Some people told us they felt as if they were in a movie or a simulation.
We also looked at animation, the illusion of movement. The cicada in Disney's animation classic The Grasshopper and the Ants is almost constantly in motion; while it dances, only its left foot becomes invisible for a few frames.
Motion Vision changes the way one moves and sees movement. That alone is an interesting experience. Cats likely see their surroundings very differently than we imagine, but their special focus on movement becomes more relatable for humans using Motion Vision. The app was created with human agency in mind: to re-see and re-evaluate the city as a system constantly in motion.
As discussed above, the cameras of modern smartphones like the Galaxy S7 have, compared to the human eye, a very limited field of view and a low resolution. Their field of view is even further removed from a cat's ability to see. On the other hand, the camera sees better at night than the human eye, which makes Motion Vision as suitable for night as for day.
Phrases we use in this text, like the live image of a digital camera, suggest that a camera shows its surroundings as if it were a hole in the smartphone through which an application can see. In fact, a smartphone constantly generates this live image on a chip. The speed at which the camera app displays these successive generated images has probably been adopted from filmmaking and animation. Since the imagery of the app is visually linked to hand-drawn animation, people assumed that what they saw when using the app was curated and framed in the way an animated motion picture is. Yet they realized that this curated, framed view draws its imagery from their surroundings and that their movement influences what they see. This led to the idea that the way humans draw information from their surroundings without Motion Vision is also already curated and framed.
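The text does not describe Motion Vision's actual algorithm, but the effect it reports (everything static vanishing, only changed pixels surviving) can be illustrated with the simplest motion-highlighting technique, frame differencing between two successive camera images. The following is a minimal sketch in Python with NumPy; the function names and the threshold value are illustrative assumptions, not the app's implementation.

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=25):
    """Mark pixels that changed between two successive frames.

    Frames are 2-D uint8 grayscale arrays; a pixel counts as "moving"
    when its brightness changed by more than `threshold`. The names and
    the threshold are illustrative, not taken from Motion Vision.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def highlight_motion(curr_frame, mask):
    """Black out everything that is not moving: only changed pixels
    keep their brightness, as in the app's view of the city."""
    out = np.zeros_like(curr_frame)
    out[mask] = curr_frame[mask]
    return out

# Two synthetic 4x4 frames: a bright "object" moves one pixel to the right.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = np.zeros((4, 4), dtype=np.uint8)
prev[1, 1] = 200
curr[1, 2] = 200

mask = motion_mask(prev, curr)
print(mask.sum())                          # 2: where the object left and where it arrived
print(highlight_motion(curr, mask)[1, 2])  # 200: the moving object stays visible
```

Run on a live camera stream, such a filter would reproduce the experience described above: anything still standing, including the observer's static surroundings, simply drops out of the picture until the camera itself moves.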
Motion Vision can be understood as a first attempt to learn from non-human animals as highly specialized inhabitants of cities. Derrida acknowledges the cat's gaze, but still only mentions the movement of his own thoughts regarding this gaze. Another approach is to treat the absence of language not as something uninterpretable, but to regard seeing and perceiving as thought, or as equal to thought. By opening up to animal-inspired modes of perception, we hope to blur the historically constructed boundaries between humans and other animals. We suggest rethinking thought as a practice of seeing and perceiving, which leads to our approach of re-seeing sight.