Aurora: Bringing the Northern Lights to London
In late 2020, Apple released the long-awaited iPhone 12 line, whose Pro models feature a lidar sensor that enriches depth scanning for better-quality photos and more compelling augmented reality. We at Animorph are encouraged by the arrival of lidar in such a mainstream series of handsets, as we have been working with depth-sensing mobile AR since 2015. Have we been on an ultra-secretive developer programme, locked in the depths of the new Apple Park? Thankfully not; it’s simply that lidar technology, even in smartphones, is not that new at all!
Lidar, which stands for ‘laser imaging, detection, and ranging’, is a type of Time-of-Flight (ToF) sensing that has been developing from its original military uses since the 1960s. A lidar unit fires laser pulses that bounce off objects and return to the source, measuring distance by timing the travel, or flight, of each pulse.
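The arithmetic behind a single ToF measurement is simple: the pulse covers the distance twice, so the range is half the round-trip time multiplied by the speed of light. A minimal sketch (the helper below is our own illustration, not any particular sensor’s API):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Range to a surface from a lidar pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total flight distance.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after about 13.3 nanoseconds indicates a
# surface roughly two metres away.
print(tof_distance(13.3e-9))  # ~1.99 m
```

At room scale the round trips last mere nanoseconds, which is why ToF cameras rely on fast, specialised sensor hardware rather than conventional camera chips.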
Released in early 2017, the Asus Zenfone AR delivered a time-of-flight (ToF) depth camera for real-time 3D perception of its surroundings, enabled by the REAL3 image sensor chip and Google’s Tango framework. Later that year, Animorph developed a wearable AR application for the Zenfone and Holokit as part of our installation at the Southbank Centre in London, the third of our experiments at the public art venue.
The first event took place in February 2016, when Animorph publicly presented its first research and development prototype on the Southbank Centre’s Roof Terrace. The mixed reality app traversed wearable AR and fully detached VR environments in which participants’ choices shaped their narrative experience.
How to Survive in 2050, Research and Development, February 2016
‘How to Survive in 2050’ was the fruit of the first research and development project with the Southbank Centre. Having recently embarked on developing XR experiences, we conceived the piece as looking far into multiple futures: ‘A journey through a dystopian scenario of the future as a part of a modular platform for various takes on where future might take us’.
During this 15-minute narrative-driven mixed reality (AR/VR) experience, participants traversed the space wearing a headset containing a smartphone. The phone’s camera view was overlaid with virtual imagery, and headphones with spatial sound completed the immersion. Sizable markers in the physical space pulled participants into different paths of the narrative, and a third-party laser tracking station allowed for six degrees of freedom within a defined area. At the time we only dreamt of having a laser scanner as part of our head-mounted display. The Southbank Centre’s curators expressed their interest in continuing the conversation and focusing our wide-ranging interests into a more contained idea. It was the first exploratory step towards what later became ‘Aurora’.
Aurora, Research and Development, November 2016
Over the following months the conversation with the Southbank Centre continued, culminating in a second R&D later that year. Animorph delivered another exploratory showcase in November, inspired by the aurora borealis, or Northern Lights.
The phenomenon of the aurora is frequently interpreted as a sign of the supernatural, something not of this planet. Physically, it is ‘a result of collisions between gaseous particles in the Earth’s atmosphere with charged particles released from the sun’s atmosphere’. The visual display and its metaphor shaped the next prototype we developed.
This installation established a platform for freely creating variations on the Northern Lights, using a hacked version of the Daydream headset and a controller embedded in a glove. The research and development for ‘Aurora’ suspended participants between AR and VR: they first perceived an aurora overlaid on their normal vision, then entered a fully immersive virtual reality environment once comfortably reclined in a chair.
The VR scene depicted the skies of the far North and a frozen landscape spreading to the horizon. Participants viewed this through the hacked mixed reality Daydream headset and interacted with it through a motion controller affixed to a glove. The application read hand movement and fed the data into an interface that generated individually directed celestial trails across the virtual sky. The interface mapped three aspects of hand movement onto three visual parameters of the aurora: spread, colour and density. The experience within the headset lasted for 3–5 minutes.
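To make the idea of such a mapping concrete, here is a minimal sketch in the spirit of that interface; the specific inputs, ranges and formulas are our own assumptions for illustration, not the original implementation:

```python
import math

def aurora_parameters(yaw: float, pitch: float, speed: float):
    """Map three aspects of hand movement onto aurora visuals.

    yaw, pitch: controller orientation in radians
    speed: magnitude of hand movement in metres per second

    Returns (spread, hue, density), each normalised to 0..1.
    The mappings below are illustrative assumptions, not the
    original Aurora implementation.
    """
    # Wider sweeps of the hand broaden the aurora band.
    spread = min(abs(yaw) / math.pi, 1.0)
    # Tilting the hand shifts the colour along a green-to-violet range.
    hue = (math.sin(pitch) + 1.0) / 2.0
    # Faster movement produces a denser curtain of light.
    density = min(speed / 2.0, 1.0)  # assume ~2 m/s saturates the effect
    return spread, hue, density

# A slow, level sweep: broad, greenish and sparse.
print(aurora_parameters(yaw=1.2, pitch=-0.3, speed=0.4))
```

Each frame, the three outputs would drive whatever renders the light trails, for instance a particle effect in the headset’s view.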
Mixed reality technology enabled participants not only to witness a custom aurora, but also to co-create it. The Aurora prototype offered a glimpse into the unknown, a collision of natural phenomena with artistic interpretation in an immersive medium. While excited about the potential of the Daydream headset, we really wanted participants to be able to walk comfortably around the physical space with a headset on and witness the digital overlaid on the physical. While refining the technological approach, we also embarked on a deeper investigation of the associations behind the Northern Lights.
The Sky is On Fire, October 2017
In the final development stage, Animorph set out to create a piece culturally rooted in the aurora’s origins on the polar fringes of the world. Animorph grew interested in the Sámi people while researching the histories of Europe’s northern regions: we realised that the Sámi are the only indigenous people still living on the continent, with an incredibly rich culture and an inspiring way of governing themselves. We were curious to connect with a Sámi artist, and the Southbank Centre was incredibly supportive in establishing a conversation and collaboration with the Sámi writer Sigbjørn Skåden. The process uncovered shared fascinations with myth, as well as heartfelt discussions on the history of the Sámi people engulfed by Western culture. Our exchange of ideas was essential in imagining how exactly people would encounter an aurora over the London skyline, and it informed Skåden’s original writing for the project.
The backbone of Animorph’s understanding of Sámi culture was the book “Sámi, People of the North” by Neil Kent of the Scott Polar Research Institute at the University of Cambridge. The author kindly agreed to let us use a few excerpts to narrate a found-footage edit of 1940s Sámi daily life and its environment. This was juxtaposed with a narrative message from Ellos Deatnu, a group standing for the self-determination of the Sámi people, which another Sámi poet, Niillas Holmberg, had shared with us.
The result of these collaborations was a video piece, forming an important grounding introduction to the augmented reality experience. Participants were invited to sit in front of the screen and tune in to the context before heading out onto the north-facing riverside terrace to dive into the poetic AR piece.
The final app we presented differed greatly from our earlier Aurora prototype. It used the then newly released Asus Zenfone AR with Tango and a customised version of the Holokit headset, which allowed participants to walk around freely while the imagery was consistently generated at the right position and angle.
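That consistency comes from six-degrees-of-freedom tracking: every frame, the device’s pose is used to re-express world-anchored content in the camera’s frame. A minimal sketch of the idea (our own illustration, not Tango’s API):

```python
import numpy as np

def world_to_camera(point_world, cam_rotation, cam_position):
    """Express a world-anchored point in the camera's frame.

    cam_rotation: 3x3 camera-to-world rotation from the tracker
    cam_position: camera position in world coordinates

    A 6DoF-tracked headset updates this pose every frame, so a
    virtual aurora anchored in world space appears to stay fixed
    over the skyline while the wearer walks around.
    """
    return cam_rotation.T @ (np.asarray(point_world) - np.asarray(cam_position))

# An aurora band anchored 50 m up and 200 m north of the origin
# (y up, z north in this illustrative coordinate convention).
anchor = [0.0, 50.0, 200.0]

# The same anchor seen from two head positions a step apart: its
# camera-space coordinates shift accordingly, so on screen it reads
# as a fixed feature of the world rather than a sticker on the lens.
print(world_to_camera(anchor, np.eye(3), [0.0, 1.6, 0.0]))
print(world_to_camera(anchor, np.eye(3), [1.0, 1.6, 0.5]))
```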
For simplicity of interaction, participants generated the Northern Lights over the London skyline just by moving their heads, rather than with the glove as before. While looking at historic London landmarks, they heard a poetic story whose delivery was inspired by yoiking, a traditional Sámi art form combining poetry, singing and meditation. Words subtly dissolved into the atmosphere, giving the impression of voices from an unseen realm.
The installation was presented in November 2017 as part of the London Literature Festival, themed ‘Nordic Matters and World on the Brink’. With the aid of cutting-edge immersive technology, it brought a poetic sensation of the Northern Lights to our metropolitan surroundings.
Augmented reality can enhance our perception of the real environment with interactive audiovisual stimuli, incorporating many forms of expression, including poetry. Wearable AR becomes significantly more engaging when it is anchored in accurate spatial measurements and a realistic perspective.
Since Aurora, we have developed a range of spatially aware cross-platform projects for mobile devices and AR headsets. As we continue to work and thrive in this vein, we remember the early research and development projects that allowed us to get a taste of this freedom and share it with the public. We are glad that Apple shares an affinity on this front!