Image: Editing out a traditional green screen in favour of a digitally monochromatic version
DaDa Holograms is a project to look at the potential use cases for augmented reality when it comes to audience access for live and digital theatre, specifically focusing on BSL interpretation. Read on to find out more from DaDa's Digital Producer, Joe Strickland.
Co-funded by The Space, this project examines three different use cases for this technology and draws conclusions about how effective each one is as a tool for audience access, as well as how simple and affordable each would be for artists and organisations to implement.
Phase 1 of the project looks at the effectiveness of a handheld, screen-based, augmented reality experience where an audience member can watch the BSL interpretation of a live production on their phone while also being able to see through the device to whatever performance is happening in front of them. The intention of this design was to provide a portable and affordable way of making immersive and promenade theatre productions more accessible to Deaf audiences.
In order to create this application we had to overcome two main design hurdles:
1. How to get the interpretation onto people's phones
2. How to allow the world to be seen through the phone
Hypothetically, we could have made an application that was pre-loaded onto a device given to each audience member before the start of the production, but this would have meant creating a more limited use case from the get-go, which we wanted to avoid in order for the R&D to be as useful a project as it could be. Having the application work on any device would be a much more useful outcome, so we set our sights on making that a reality.
This was harder than expected, but maybe not for the reasons you might think. Making the application itself was not difficult. We used Unity, a game development engine that can export apps in a number of different formats, to build our experience. We had a BSL interpreter record the interpretation for the test experience (a chapter from Chronic Insanity's digital promenade production, All The King's Men) against a green screen. This green screen was then digitally removed and replaced with a monochromatic digital green screen in post-production. A real-life green screen, unless lit perfectly, will still contain a range of different hues of green and, although a video editing application can key these out with ease, they might have posed a problem for our BSL application. By replacing the real-life green screen with a digital one made up of a single colour value of green, we avoided any issues with removing the green screen at a later stage.
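As a rough illustration of why the single colour value matters, here is a minimal C# sketch, not the project's actual code, of the kind of per-pixel keying the app performs at runtime. With one known green value an exact comparison is enough in principle, although video compression can reintroduce slight variation, so a small tolerance is still a sensible safeguard.

```csharp
using UnityEngine;

// Hypothetical sketch of runtime chroma keying against a single flat green.
// Assumes a CPU-readable Texture2D; a real app would do this in a shader.
public static class ChromaKey
{
    // The exact colour of the digital backdrop (placeholder value).
    static readonly Color32 keyGreen = new Color32(0, 255, 0, 255);

    // Small per-channel tolerance to absorb video compression artefacts.
    const int tolerance = 8;

    public static void MakeTransparent(Texture2D frame)
    {
        Color32[] pixels = frame.GetPixels32();
        for (int i = 0; i < pixels.Length; i++)
        {
            Color32 p = pixels[i];
            bool isBackdrop =
                Mathf.Abs(p.r - keyGreen.r) <= tolerance &&
                Mathf.Abs(p.g - keyGreen.g) <= tolerance &&
                Mathf.Abs(p.b - keyGreen.b) <= tolerance;
            if (isBackdrop)
                pixels[i].a = 0; // fully transparent where the backdrop was
        }
        frame.SetPixels32(pixels);
        frame.Apply();
    }
}
```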
This edited footage was then hosted online, and the link to it was pasted into our application and positioned to play in the lower right-hand corner of the screen. When our application applied its algorithm to remove the green screen, it now did so perfectly, leaving just the interpreter in the frame.
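As a sketch of how that hosting-and-positioning step can work in Unity, the engine's built-in VideoPlayer component will stream from a URL and render into a texture shown on a UI element. The URL and layout values below are placeholders rather than the production ones.

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Hypothetical sketch: stream the hosted interpretation video and show it
// on a RawImage anchored to the lower right-hand corner of the screen.
public class InterpreterVideo : MonoBehaviour
{
    [SerializeField] string videoUrl = "https://example.com/interpretation.mp4"; // placeholder
    [SerializeField] RawImage display;           // UI element that shows the video
    [SerializeField] RenderTexture videoTexture; // texture the player draws into

    void Start()
    {
        // Anchor the display to the bottom-right corner of the canvas.
        display.rectTransform.anchorMin = new Vector2(1f, 0f);
        display.rectTransform.anchorMax = new Vector2(1f, 0f);
        display.rectTransform.pivot = new Vector2(1f, 0f);
        display.rectTransform.anchoredPosition = Vector2.zero;

        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = videoUrl;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = videoTexture;
        display.texture = videoTexture;
        player.Play();
    }
}
```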
Then we wrote a bit of code to gain access to the back camera of the audience member's phone and positioned this camera feed behind the interpreter. This effectively turned the phone see-through: the audience could see their surroundings clearly, with the BSL interpreter overlaid on top.
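A minimal version of that camera pass-through might look like the following sketch, which assumes a full-screen RawImage drawn behind the interpreter layer; on the web and on mobile, camera access also needs the user's permission first.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: feed the rear camera to a full-screen RawImage
// rendered behind the interpreter, making the phone appear see-through.
public class PassThroughCamera : MonoBehaviour
{
    [SerializeField] RawImage background; // full-screen, drawn below the video layer

    IEnumerator Start()
    {
        // Ask the OS/browser for camera permission before opening the feed.
        yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);
        if (!Application.HasUserAuthorization(UserAuthorization.WebCam))
            yield break;

        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            if (!device.isFrontFacing) // pick the back-facing camera
            {
                var feed = new WebCamTexture(device.name);
                background.texture = feed;
                feed.Play();
                break;
            }
        }
    }
}
```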
Video: the application in action
The hard part was hosting it somewhere that every audience member could use it. We had decided to create a web app, as this should be usable by any audience member regardless of their choice of smartphone. However, Apple devices do not fully support the WebGL platform Unity uses for its web exports. There is no official reason for this, but the lack of support all but ensures that developers have to make apps for the App Store rather than for the open web, accessible from any device. This left our app functional on most Android devices, but incompatible with iPhones and iPads. We are currently working to make it available on these devices via the App Store, especially since iPhones make up about half of the smartphone market in the UK, and at least a quarter worldwide.
This phase of the R&D ended with a working prototype of the experience that allowed Deaf audiences to take part in the digital promenade performance. The original performance was entirely audio-based, asking audiences to listen to instructions through their headphones and go out into the world to perform tasks, which gave the piece a sense of thrill and secrecy. The prototype not only allows Deaf audiences to take part in this section of the experience but preserves that thrill and secrecy: because they appear simply to be looking at their phones, Deaf audience members receive the instructions in an equivalently covert way to the audio.
This goes to show that access isn't just about making the material or content of a production more accessible, but about building access into a production in a way that allows the same feeling to be felt, or the same experience to be had, by any audience member who wants it.
--
Want to find out more?
If anybody would like further information about the results of this R&D process, or would like to help try out the augmented reality experiences when they are ready for audiences, please get in touch with DaDa's digital producer by emailing digital@dadafest.co.uk.