Insect Vision is an attempt to simulate the view of the wasp Megaphragma mymaripenne, one of the smallest known insects. Its compound eyes consist of 29 facets, which allow it to see in all directions at once. Humans, by contrast, cannot observe more than about 100° vertically and 200° horizontally without turning.

We suggest that a ‘point of views’ (facets), instead of a singular point of view (binocular view), could be a helpful tool in the field of political activism. Using Virtual Reality (VR), we simulate a faceted view and mount it on a human’s head.

visitor interacting with the installation

research questions

  1. Is a fusion of the cognitive strategies of human, machine and insect possible with current VR technology?

  2. What happens if humans are exposed to multiple perspectives simultaneously?


  1. We set up a virtual space in the game engine Unity and used the HTC Vive headset to look around.
  1. Instead of one viewport, we instantiated several that collectively created a panoramic 360° view.
Multiple viewports are usually used for local multiplayer games: they split the screen into smaller areas, allowing each player to see the information important to them. This mechanism is also called Split Screen.
Technically, each viewport renders the virtual scene itself: instantiating multiple viewports is not like setting up multiple cameras that film the same live space, but like setting up multiple cameras that each create an identical space and film it. We set up eight viewports side by side in Unity. To our confusion, the viewports translated and rotated themselves unexpectedly whenever we moved physically with the HTC Vive on our heads. Imagine your eight eyes drifting away from your head as you move forward, effectively growing your head in all directions, and you will get an idea of the motion sickness this state of the experiment caused.
multiple viewports in Unity
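The split of a 360° panorama among eight side-by-side viewports can be sketched as simple arithmetic: each camera gets a fixed yaw offset and a normalized horizontal strip of the screen. This is a minimal Python sketch of that layout, not the project's actual Unity code; all names in it are hypothetical.

```python
# Hypothetical sketch: dividing a full 360° panorama among eight
# side-by-side viewports. In Unity this would correspond to setting
# each camera's rotation and its normalized viewport rectangle.

NUM_VIEWPORTS = 8
FOV_PER_VIEWPORT = 360 / NUM_VIEWPORTS  # each camera covers 45° of yaw

def viewport_layout(n=NUM_VIEWPORTS):
    """Return (yaw, x, width) per viewport: the camera's yaw offset in
    degrees and its normalized horizontal strip of the screen (0..1)."""
    layout = []
    for i in range(n):
        yaw = i * FOV_PER_VIEWPORT  # rotate each camera 45° further
        x = i / n                   # left edge of its screen strip
        width = 1 / n               # each strip is 1/8 of the screen
        layout.append((yaw, x, width))
    return layout

for yaw, x, width in viewport_layout():
    print(f"yaw={yaw:5.1f}°  strip x={x:.3f} width={width:.3f}")
```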
  1. To fix the viewports in place, we had to recalculate their correct transforms, overriding Unity’s automatic values, on every rendered frame, which is about 90 times a second. Clearly, the game engine was not designed to let a single user drive multiple virtual cameras.
viewports behaving weirdly when the user is in motion
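The per-frame correction amounts to re-anchoring every facet camera to the tracked head pose, so that no camera can drift away from the head. The following is a hypothetical Python sketch of that idea (the actual project used Unity); the function and variable names are assumptions for illustration.

```python
# Hypothetical sketch of the per-frame correction: every facet camera is
# re-anchored to the tracked head pose instead of keeping the values the
# engine writes automatically.

def corrected_pose(head_pos, head_yaw_deg, facet_yaw_deg):
    """All facet cameras share the head's position; only a fixed yaw
    offset distinguishes them. Returns (position, yaw in degrees)."""
    return head_pos, (head_yaw_deg + facet_yaw_deg) % 360

# Called once per rendered frame (~90 times per second on the HTC Vive):
head_pos, head_yaw = (1.2, 1.6, 0.4), 30.0  # tracked position (m), yaw (deg)
for facet in range(8):
    pos, yaw = corrected_pose(head_pos, head_yaw, facet * 45.0)
    # pos never leaves the head, so the "growing head" effect disappears
```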
  1. We created a new virtual environment using strong colors and a simple, blocky layout. The design of the space was supposed to make navigating with multiple viewports easier. For optimal performance, we baked all light and color information into a single texture that is wrapped over the 3D model.
baked texture
  1. A significant obstacle we encountered was that Unity would not let us implement more than 28 eyes: the graphics card allowed no more, and finding a way around this limitation would have been very difficult. Still, we managed to set up those 28 eyes, giving us 14 stereoscopic, three-dimensional facets.
the virtual testspace
the virtual testspace seen through hexagons
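The camera budget above reduces to a small piece of arithmetic; this Python sketch only restates the figures from the text, with hypothetical variable names.

```python
# Back-of-the-envelope sketch of the camera budget described above.
# The figures come from the text; the names are hypothetical.

MAX_CAMERAS = 28    # upper limit we hit with the graphics card in Unity
EYES_PER_FACET = 2  # stereoscopic rendering needs a left and a right eye

stereo_facets = MAX_CAMERAS // EYES_PER_FACET
print(stereo_facets)  # 14 three-dimensional facets
```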

try or catch

We invited others to try Insect Vision in our workspace. After staying in our simulation for about ten minutes, one of them took off the headset and told us she felt confused and could only focus on one facet at a time instead of using all of them concurrently. After ten more minutes, however, she managed to understand the composition of the virtual landscape. She said she enjoyed looking behind herself through many eyes without having to turn her torso and head.

Another test person used Insect Vision for about 30 minutes and later became aware of her own limited field of view: without the headset, her nose appeared as a dark spot in the middle of her face that had otherwise gone unnoticed. She also observed that her eyes are embedded in her skull, which causes a vignette in her vision.


After the tests, it became clear that the issues preventing us from answering our research questions were of technical as well as psychological nature.

Technical, in that neither the software (Unity) nor the hardware (graphics card) supported our multi-view vision. Certain fields of experimentation had not been anticipated by their developers. The only way to achieve anything was to work against the default settings.

Psychological, because it was hard for the testers and us to focus on multiple facets at the same time. Typically, humans see ‘either/or’, never ‘all’. We call the effort of looking at multiple spots simultaneously ‘unfocusing’: changing from a single angle of interest to a sea of perspectives in order to take in the whole picture.

When reaching a state of complete ‘unfocus’, the amount of visual input is so overwhelming that one is tempted to snap back into a single-focused view after a few moments. Insect Vision made us more aware of the limitations of human vision: the ability to see behind oneself without turning can be accomplished with technology; still, processing many perspectives at once is exhausting. A long-term study assessing ‘unfocusing’ and its influence on perception is a project for the future. At some point, humans might be able to adapt to an unfocused, multi-perspective vision. The Oxford English Dictionary reads on adaptation:

  1. The application of something to a particular end or purpose; the action of applying one thing to another or of bringing two things together so as to effect a change in the nature of the objects. […]

We hope that the concept of a ‘point of views’ is not limited to this experiment and can influence realms other than art.