Through the Looking Glass

Jonas Hielscher’s research focused on the visual aspects of augmented reality. Two demos were produced in the context of this residency, Last Pawn and Human Sandbox.

The point of departure for his investigation was Lewis Carroll's Through the Looking-Glass, and What Alice Found There. The work began with the idea of stepping through Carroll's mirror and discovering the world behind it. The story served as a visual and thematic entry point for researching how users can be visually misled or tricked within existing technological systems. The findings of this investigation then shaped the narrative and gameplay of the demos.


Last Pawn

Last Pawn compilation view

Last Pawn is a collection of visual experiments made during the residency. Moving around in the real world, the player can select various virtual windows by moving a pawn on a chessboard. The pawn controls a virtual character, and this character can open different windows. After opening a window, the player can walk over and step through it. Each window contains one of the studies into the visual aspects of augmented reality. The player can walk around in this world, interact with it, and finally leave it, returning to the initial situation, where the pawn can be used to choose another window. The real space the player moves through is darkened and carefully lit in places. Selecting, opening and entering the different windows influences the lighting in the real room, and these light changes guide and accompany the player through the various phases of the game.

Human Sandbox

Human Sandbox

The Human Sandbox is a further elaboration of the pawn-and-chessboard interaction model. In this demo, several pawns sit on a chessboard and control various virtual characters. The demo requires two players: one uses the pawns to control the virtual characters, adding them to the virtual world, deleting them, and moving them freely through it, while the other moves among these characters in mixed reality. Every virtual character has its own sound and reacts in its own way when the player comes near. The demo is an environment that makes it easy to play in mixed reality; it has no specific game goal.


Along with the two demos, a number of extensions that make the VGE platform more accessible were also developed during the residency. The Blender exporter has been expanded so that materials with multiple textures, real-time reflections and normal maps can now be exported from Blender. Multiple scenes can also be created in Blender and exported and loaded in VGE separately.


VGE was also linked to the TUIO communication protocol, a universal protocol used by, for example, the reacTIVision software. This makes it possible to use marker tracking and smart surfaces in our mixed reality system. Support was also added for controlling DMX panels from VGE, so that dimmers and other electrical devices can be driven by our software.
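VGE's actual TUIO binding is not documented here, but TUIO messages travel as OpenSoundControl (OSC) packets over UDP, so the wire format can be illustrated. The sketch below decodes a simplified TUIO "set" message for a tracked fiducial marker; `parse_osc_message` is a hypothetical helper written for illustration (real /tuio/2Dobj set messages also carry angle, velocity and acceleration arguments).

```python
import struct

def _read_padded_string(data, offset):
    """Read a NUL-terminated OSC string padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    return data[offset:end].decode("ascii"), (end + 4) & ~3

def parse_osc_message(data):
    """Decode one OSC message into its address pattern and argument list."""
    address, off = _read_padded_string(data, 0)
    tags, off = _read_padded_string(data, off)   # type-tag string, e.g. ",siff"
    args = []
    for tag in tags.lstrip(","):
        if tag == "i":                           # 32-bit big-endian integer
            args.append(struct.unpack_from(">i", data, off)[0])
            off += 4
        elif tag == "f":                         # 32-bit big-endian float
            args.append(struct.unpack_from(">f", data, off)[0])
            off += 4
        elif tag == "s":                         # padded string
            s, off = _read_padded_string(data, off)
            args.append(s)
    return address, args

# A hand-built, simplified "set" message: fiducial marker 7 at (0.5, 0.25).
packet = (b"/tuio/2Dobj\x00" + b",siff\x00\x00\x00" + b"set\x00"
          + struct.pack(">i", 7)
          + struct.pack(">f", 0.5) + struct.pack(">f", 0.25))
address, args = parse_osc_message(packet)
```

In a running system such packets arrive continuously over UDP (port 3333 by convention), and the receiver uses the session and symbol IDs to track markers as they appear, move and disappear.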
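VGE's DMX code is likewise not shown here, but the data involved is simple to sketch: a DMX512 universe is a start code byte (0 for dimmer data) followed by up to 512 eight-bit channel levels. The helper below, `make_dmx_frame`, is a hypothetical illustration of assembling such a frame, not VGE's implementation.

```python
def make_dmx_frame(levels):
    """Build one DMX512 frame: start code 0, then 512 channel bytes."""
    frame = bytearray(513)          # byte 0 is the start code, channels 1..512
    for channel, level in levels.items():
        if not 1 <= channel <= 512:
            raise ValueError("DMX channels are numbered 1-512")
        if not 0 <= level <= 255:
            raise ValueError("channel levels are 8-bit (0-255)")
        frame[channel] = level
    return bytes(frame)

# e.g. a dimmer on channel 1 at full intensity, channel 2 at roughly half
frame = make_dmx_frame({1: 255, 2: 128})
```

The frame would then be sent to the lighting rig through a DMX interface; refreshing it as the game state changes is what lets the software dim and brighten the room lights during play.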

