Image: Illustrates the three different phases of DaDa Holograms
DaDa Holograms is a project looking at the potential use cases for augmented reality (AR) when it comes to audience access for live and digital theatre, specifically focusing on BSL interpretation. Read on to find out more from DaDa's Digital Producer, Joe Strickland:
Co-funded by The Space, this project examined three different use cases for this technology, drawing conclusions about the effectiveness of each as a tool for audience access, as well as how simple and affordable each might be for artists and organisations to implement.
Our use cases were:
- Immersive or promenade theatre performances
- On demand and digital theatre
- Live and in-person theatre
We approached these use cases in the following ways - clicking on each example will bring up its relevant blog post with more details:
- Phone-based AR interpretation for portable performances or exhibitions:
DaDa Holograms: Phase 1
- Desktop based AR for digitally inserting interpreters or Deaf performers into performances or the rooms of the audience:
DaDa Holograms: Phase 2
- Live AR projection of a Deaf performer into an in-person theatre production:
DaDa Holograms: Phase 3
A video showing the behind-the-scenes workings of these use cases can be seen below:
As a result of this project we have developed some guidelines for artists and organisations moving forward if they want to continue our work in experimenting with these technologies. We set out to research and develop the hologram BSL concept with affordability and simplicity in mind, meaning that the end results we’ve achieved should be widely replicable regardless of budget or technological skill.
For artists:
- Integrate access from the very beginning of your work. Avoid tacking it on at the last minute or in the final stages: audiences who need that access will be able to tell it has been tacked on, and they won’t have an experience equivalent to the rest of the audience’s.
- The hardware we used was easy to acquire and relatively cheap compared to the six-figure sums that recording volumetric, or 3D, video can cost. We used Azure Kinect cameras, which are currently in high demand and pricey, but you can get very similar results with a Kinect V2, which is much more affordable: easily less than £100 second hand. We also tested the Intel RealSense camera, but its quality was not high enough for our desired uses.
- The software we used is also affordable. Depthkit costs about £30 a month and takes the image data from any of the above cameras and turns it into a 3D video object without you having to do any coding or calibration yourself. We also used Holocap, which does something similar but only seems to work with the older Kinect V2 cameras. Holocap creates a cleaner boundary around the performer, but at a lower resolution than the Depthkit/Azure Kinect combination. We’d recommend either combination for making 3D video.
- However, you can go even simpler. Our first phase just used an interpreter in front of a green screen, and there’s no reason to go more complicated than this if your audience isn’t going to be moving around the 3D video object. If the technology of 3D video seems intimidating, then recording an interpreter or Deaf performer in front of a green screen and digitally placing them over, alongside, or within a performance can work just as well, and may even appear clearer and cleaner than some of the more technologically complex setups above.
- Audiences, including Deaf audiences, really appreciated that this near-future technology was being developed with access in mind from the get-go, as making something accessible normally comes as an afterthought to innovation. Take this goodwill and run with it; continue to develop ideas and applications for the BSL holograms in the full knowledge that there are audiences out there who will enjoy and appreciate them.
- Finally, treat the work DaDa has done here as the mining of raw material: material for you to turn into something bigger and better. This project went from an idea to three rough-and-ready prototypes, and we’re hopeful they will inspire artists and organisations to take our ideas and expand upon them in the future.
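The green-screen approach mentioned above comes down to a simple chroma-key composite: drop the green backdrop pixels and let the performance footage show through. As a rough illustrative sketch only (assuming NumPy and frames as RGB arrays; in practice tools like OBS, a video editor, or a game engine would do this for you):

```python
import numpy as np

def chroma_key_composite(interpreter, background, green_threshold=100):
    """Overlay a green-screen-recorded interpreter onto a background frame.

    Both frames are H x W x 3 uint8 RGB arrays of the same shape. Pixels
    where green clearly dominates red and blue are treated as backdrop
    and replaced by the corresponding background pixels.
    """
    r = interpreter[..., 0].astype(int)
    g = interpreter[..., 1].astype(int)
    b = interpreter[..., 2].astype(int)
    # A pixel counts as "green screen" when green exceeds the stronger
    # of red and blue by more than the threshold.
    is_backdrop = (g - np.maximum(r, b)) > green_threshold
    out = interpreter.copy()
    out[is_backdrop] = background[is_backdrop]
    return out
```

Applied frame by frame, this places the interpreter over, alongside, or within the performance footage; the threshold would need tuning against real lighting conditions.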
For organisations:
- Most runs of productions have only a limited number of accessible performances, and the time has come to expand the access you offer so that disabled, Deaf, and neurodivergent audiences don’t have artificially limited options for seeing the work you facilitate. This project shows the exciting and creative access that can be built into the core of a production; and if access is built into the core of the show, and is affordable and easy to use, then it can be present for any audience at any time.
- However, don’t lose sight of another key element of this project: employing Deaf people to interpret for Deaf audiences. Every phase of this project centred a Deaf interpreter or performer. There are certain constraints when working with a Deaf interpreter, namely that interpreting live speech can be tricky if not impossible depending on the individual. In spite of this, phase three of our project successfully created an affordable and easy-to-use AR-projected Deaf performance to present next to or within a live theatre show. This allows Deaf interpreters to take on a job that was previously kept from them, and with a live operator subtly adjusting the speed of the recorded interpretation, keeping a pre-recorded interpretation synchronised with a live performance poses no problems.
- Also, who says that the Deaf performer has to be the hologram? Have a live Deaf performer and a recorded spoken performance to project alongside them. Have the recorded Deaf performer play and interpret for multiple characters by changing costume or position on stage. Have multiple versions of the same Deaf performer appear across the stage at the same time. If this project has created a tool then feel free to encourage your artists to get creative with it and expand beyond the usual one interpreter in one location that we might have been constrained to in the past with an in-person interpreter.
- Deaf audiences fed back positively about the use of this technology for access. In spite of some of the rougher edges of the 3D video capture, the interpretation was still intelligible, and performers' hand movements, body language, and facial expressions were preserved clearly enough to communicate. If the clearest possible interpretation is what you want for your audiences, then a green-screen-captured regular video interpretation provides perfect clarity and can be used instead of a 3D video in most circumstances.
- Another idea that came up in our audience feedback was that international tours of work using this technology could have different recorded sign language interpretations for the different countries they might visit, given the variety of signed languages that exist in the English-speaking world, let alone further afield. Access isn’t just about ability; it’s also about resources, technological literacy, and geography. We’ve tried to keep the cost low and the user experience simple; feel free to overcome geography yourselves!
- One of the hardest parts of the project was building functioning apps for the prototypes to run in so that they could be tested remotely by audiences who weren’t physically present. We very much recommend coming up with a rough concept yourself and then having the finished app created by a professional app developer. Audiences have varying confidence when it comes to using new technology and you don’t want a clunky or confusing user experience to put people off using something that could be really helpful and enjoyable for them.
- Also, don’t use this project as an excuse to employ people less or to automate interpretation work. If you have the budget, or the capacity to raise the budget, to employ Deaf interpreters or performers for extended periods of time then you must absolutely do that. However, if you are legitimately strapped for cash but have an audience you want your work to be accessible to then this project might have some of the answers for you. Likewise, if you have a creative idea that can’t be reasonably achieved without using this tech, feel free to make it happen with our ideas. The very core of this project has been to develop practice about Deaf arts workers, but never without Deaf arts workers. Don’t ruin that.
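The live-operator speed adjustment described above can be modelled very simply: the operator nudges a playback rate up or down, and the player advances through the recording at that rate to decide which frame to show. A minimal sketch under our own assumptions (class and method names are illustrative, not from any specific playback software):

```python
class AdjustablePlayback:
    """A pre-recorded interpretation whose speed a live operator can nudge.

    The operator speeds playback up or slows it down so the recording
    stays in sync with the live show; the playhead advances by
    rate * elapsed time.
    """

    def __init__(self, fps=25.0):
        self.fps = fps
        self.rate = 1.0       # 1.0 = normal speed
        self.position = 0.0   # seconds into the recording

    def nudge(self, delta):
        # Operator control: small positive or negative rate adjustments,
        # clamped so the interpretation never runs absurdly fast or slow.
        self.rate = min(1.5, max(0.5, self.rate + delta))

    def tick(self, elapsed_seconds):
        # Advance the playhead and return the frame index to display.
        self.position += self.rate * elapsed_seconds
        return int(self.position * self.fps)
```

The point of the sketch is that the adjustment is continuous and subtle: the audience sees an uninterrupted interpretation while the operator quietly keeps it aligned with the live performers.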
This project has been an exploration. We have found three valuable use cases for augmented reality BSL interpretation, and have achieved functional prototypes to address each use case that we believe to be as affordable and simple to implement as possible.
We challenge you all reading this to take these ideas, these thoughts, these examples and continue building upon them. We challenge you to consider not just AR technology, but any technology present in our lives or on the horizon, and figure out how that tech can make your work more accessible to a wide range of different artists and audiences. You might already do a lot of accessible work, but we challenge you to do even more, and the tools from this project should reassure you that this extra work will be worthwhile.
If anyone has any further questions or points of discussion then feel free to get in touch with us at digital@dadafest.co.uk.