OK, OK, first things first: I know what you’re thinking, and I admit I had some doubts at first. But never one to turn down an opportunity to talk about and demo VR to people, I was intrigued as to how it would work. Blind people in VR!? That sounds like a crazy, stupid idea, right? Wrong…
Visits to the members’ centre
Twice in 2018 I was invited to visit the Blind Veterans UK centre just outside of Brighton, UK. Blind Veterans UK is a British charity providing free support and services to vision-impaired ex-Armed Forces and National Service personnel. Many of the members are elderly, with most being aged 50+ (I am approximating here). I visited as part of their technology week events, a showcase of current and emerging technology available to help, assist and enhance lives; and as many of them have a technical background, they are generally interested in all things tech and the advances being made.
Both times I visited, I gave a talk about VR (what it is, how it works, terminology, its history and some use cases) followed by a demo / try session. The first time, I took a VR-ready laptop PC with a standard HTC Vive setup. The second visit involved a desktop PC and an HTC Vive Pro with the wireless adapter. After the amazing impact of the first session, I was keen to try out a system with greater freedom, fewer health & safety risks and overall higher-quality optics.
Whilst I gave my talk, a few members dozed in the afternoon sunshine (I am used to this, having given many talks at schools and colleges), but when it came to the demo / try sessions I was faced with a keen audience, all eager to have a turn. At first there was some trepidation, with members egging each other on to be the first to try it out; I think there was an air of disbelief and uncertainty around how registered blind and partially sighted users would get anything from the experience. I certainly was unsure of the outcome.
The first willing member was assisted over to the chair and guided into the seat before donning the headset. The impact was immediate, with gasps and coos as they were suddenly taken from a world of blurry colours and vague shapes into the depths of the sea, surrounded by fish and ultimately a whale. (VR fam will know that this is WeVR’s The Blu whale experience, chosen for its impact and wow factor, but also for its relatively short length and suitability for standing or seated use.)
Once the experience was over, I carefully removed the headset and asked the member to explain what they had seen and what it felt like. It was a humbling moment, having worked with VR for years and done thousands of demos at events, to see a genuine, positive impact upon someone. They were astounded at having been able to see clearly; even the small fish rendered swimming around the diver, and the sheer sense of scale of the whale, blew them away.
That was all it took for the others to start jostling for position for their turn; bearing in mind many of the members had suffered sight afflictions for decades, the sudden opportunity to see something clearly was too good a chance to miss.
For the next hour or so I put member after member through The Blu whale VR experience, each and every time with astounding results and impact. From ex-service people who had worked on flight simulators, to those who had lost vision due to the proximity of exploding grenades, even those with the severest blindness were able to experience a sense of presence, thanks to the immersive audio of being underwater, manta rays flapping past and the audible weight and bulk of the whale coming up to the shipwreck.
Whilst a couple of members were happy to sit out the demo (I suspect to avoid disappointment, in case they didn’t get the same impact others were having), it was generally agreed that the VR technology allowed them to see far more clearly than they could in the real world.
One gentleman was unable to focus at distance due to convergence issues in the real world, but once inside the headset he marvelled at how he was able to see with both eyes at the same time, without having to close one to reduce the conflict he normally experienced.
I think this comes down to the way the current tech works: the optics have users focus at infinity through the screen, rather than at the screen plane itself as with traditional TV screens. Also, as we do not yet have foveated rendering or reliable eye-tracking providing depth cues and focus contrast between attention points, everything in the 3D scene is shown at the same focal depth. As these technologies advance, I hope the impact and effect for this gentleman won’t be diminished as we move towards more realistic, real-world vision and display.
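For anyone curious about the “focus at infinity” point, here is a back-of-the-envelope sketch using the thin-lens equation. The focal length and screen distance are made-up illustrative numbers, not the actual Vive optics; the point is simply that placing a screen at (or just inside) the lens’s focal plane pushes the virtual image far away, so the eye can stay relaxed even though the panel sits centimetres from the face.

```python
# Back-of-the-envelope thin-lens sketch of why a headset screen a few centimetres
# from the eye can be viewed with a relaxed, "focused at infinity" eye.
# The numbers below are illustrative assumptions, not real Vive optics.

def virtual_image_distance_mm(focal_length_mm, screen_distance_mm):
    """Thin-lens equation 1/f = 1/d_object + 1/d_image, solved for the image distance.

    With the screen inside the focal length the result is negative, meaning a
    virtual image on the same side as the screen; at the focal plane it is infinite.
    """
    if screen_distance_mm == focal_length_mm:
        return float("inf")  # virtual image at optical infinity
    return 1.0 / (1.0 / focal_length_mm - 1.0 / screen_distance_mm)

# Screen exactly at the focal plane: image at infinity, eye fully relaxed.
print(virtual_image_distance_mm(40.0, 40.0))   # inf
# Screen 1 mm inside the focal plane: virtual image roughly 1.5 m away.
print(virtual_image_distance_mm(40.0, 39.0))   # -1560.0 (negative = virtual image)
```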
Shortcomings and second visit
By the end of the first visit, many in the room were quite emotional and moved by the experience of VR and the technology, myself included. I was keen to see how removing some of the barriers and limitations could improve the experience next time. So when asked if I would return later in the year, I readily agreed, knowing we now had the HTC Vive Pro and the wireless adapter, offering a higher-resolution display without all the cables and trip hazards. Although we had switched to the Deluxe Audio Strap at the studio soon after the first visit, the standard Vive setup, with its additional headphones to place on the user, added more mess and tangled cables during demos, so I was also happy to have the Vive Pro’s built-in headphones.
Because of the cable, the first visit had mostly seen members seated to experience the VR, but with the freedom the wireless headset offered, the second visit was a much more active affair. Seeing members who, in the real world, had to be guided and assisted in order to sit, walk and avoid objects suddenly hopping up and chasing fish around a virtual shipwreck on their own was as astounding as it was terrifying. I’ve seen many kids faceplant in VR, but they’re made of rubber and bounce back; the last thing I wanted was an elderly member taking a presence-induced lean against something that wasn’t there, falling and being seriously injured. So in turn I’d do a non-contact dance around the VR space with each member as they moved about, testing their limits and bounds of confidence, arms outstretched ready to catch them should they stagger or fall.
Soon it was time for the tea break and I was suddenly left alone for 15 minutes, playing second fiddle to the lure of a hot cuppa and a biscuit, able to reflect in peace for a moment on the experience and its impact upon the members. Many came back for another turn before I had to pack up and return to the studio, so I thought I’d let the more mobile, creative members try out Google TiltBrush.
One member, a former artist, was keen to have a go and eager to let some creativity flow. However, he was only able to walk with the aid of two sticks, and as he slowly ambled up to the VR space I had concerns over how he would cope with having to give up the sticks to hold the Vive wands. But one stick was happily thrown aside and the paintbrush wand grasped, as he slowly but surely began his creation. Like many first-time TiltBrush users, he started off painting flat on one plane until realising he was able to walk around and through his virtual floating brush strokes. With some assistance changing brushes and colours, soon a near-perfect replication of Mickey Mouse was floating in the air above what became a garden full of brightly coloured flowers. With a signature flourish, the gentleman was ready to sit down again, happy with his creation and having scratched the art itch, and that was the end of my time at the members’ centre.
Thoughts about the tech
So whilst I know a fair bit about VR tech and how it works, I don’t know much about the science of sight and may describe things using the wrong terms, but I think the reason the members were able to see, simply put, came down to a few factors:
- The proximity of the screen, combined with full immersion focusing attention onto it, ensured that any remaining light perception or sight was centred on the VR content
- The magnification of the lenses used to provide the wide field of view assisted with vision and clarity
- Focusing on the screen at infinity, looking through it rather than focusing on it mere centimetres from the face, removed the depth-focus and convergence conflicts for many members where one eye was dominant or less sighted than the other
- The narrower field of view removed peripheral distractions, light bleed and sight-related afflictions; however, I am aware some members had sight issues specifically affecting the central viewing area (e.g. cataracts or glaucoma) but they were still able to see clearly enough at the edges.
- Immersive, positional audio attached to objects within the space allowed even those without any sight to feel immersed through the soundscape, which shifted as they moved their head around, creating a sense of presence of being underwater or near a whopping great whale (a toy sketch of this head-relative panning is shown below).
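To make that last point a little more concrete, here is a deliberately simplified, hypothetical Python sketch of head-relative panning: the gain at each ear changes as the head turns relative to a sound source. Real VR audio engines use full HRTF spatialisation, so this is not how The Blu or SteamVR actually do it; it is just the core idea, with made-up numbers.

```python
import math

# Toy sketch of head-relative stereo panning: why turning your head changes the
# soundscape even with no usable vision. Real VR audio uses full HRTF
# spatialisation; this is only the core idea, with illustrative numbers.

def pan_for_source(head_yaw_deg, source_x, source_z):
    """Return (left_gain, right_gain) for a sound source at (source_x, source_z)
    metres, relative to a listener at the origin facing head_yaw_deg (0 = +Z)."""
    source_azimuth = math.degrees(math.atan2(source_x, source_z))  # source angle in world space
    relative = math.radians(source_azimuth - head_yaw_deg)         # angle relative to the head
    pan = math.sin(relative)                                       # -1 = hard left, +1 = hard right
    left_gain = math.sqrt(0.5 * (1.0 - pan))                       # equal-power panning
    right_gain = math.sqrt(0.5 * (1.0 + pan))
    return left_gain, right_gain

# A whale 3 m ahead and 2 m to the right: louder in the right ear.
print(pan_for_source(0.0, 2.0, 3.0))
# Turn the head roughly 34 degrees to face it and the gains even out.
print(pan_for_source(33.7, 2.0, 3.0))
```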
So if people can see in VR better than in real life, how can this fact* be used to benefit them day to day? One thought is the use of pass-through camera feeds in the VR headset, allowing them to view a magnified version of the real world and their surroundings, combined with AI and machine learning to recognise and highlight objects that could cause trouble or need to be avoided. There’s no reason why AR headsets couldn’t do this as well, although I think the VR technology and the points above make VR more suitable than AR as a whole.
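As a rough illustration of that idea, and nothing more, here is a hypothetical Python/OpenCV sketch: it takes an ordinary webcam feed standing in for a headset camera, magnifies the centre of the frame, and leaves a placeholder detect_hazards() function where a real object-detection model would plug in. None of this is based on an actual headset pass-through API.

```python
import cv2  # OpenCV: pip install opencv-python

# Sketch of the "magnified pass-through" idea: grab a camera feed, crop and
# enlarge the centre of the frame, and flag potential obstacles.
# detect_hazards() is a hypothetical placeholder for whatever ML model you'd use.

ZOOM = 2.0  # 2x magnification, an arbitrary illustrative value

def magnify_centre(frame, zoom=ZOOM):
    """Crop the central 1/zoom portion of the frame and scale it back up."""
    h, w = frame.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)

def detect_hazards(frame):
    """Placeholder: return bounding boxes (x, y, w, h) of objects to avoid."""
    return []  # plug a real object-detection model in here

cap = cv2.VideoCapture(0)  # ordinary webcam standing in for a headset camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    view = magnify_centre(frame)
    for (x, y, w, h) in detect_hazards(view):
        cv2.rectangle(view, (x, y), (x + w, y + h), (0, 0, 255), 3)  # highlight hazards
    cv2.imshow("magnified pass-through (sketch)", view)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```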
The current limitations of pass-through, however, mean it needs a few more iterations before it could be relied upon and used safely. Whilst the members typically do not walk at pace, the latency between what the camera captures and the image reaching the screen means real-world movement and the displayed view are slightly out of sync. Also, pass-through tends to be monochrome, so whilst potentially useful for getting around and everyday tasks like reading bills or books or sorting money, it makes for a colourless world.
Of course, there’s also the issue of the current size of VR headsets and how easy they are to put on, set up and operate, which in general makes them unsuitable for elderly users without assistance.
One thing they could be used for right now is viewing TV and films; VR virtual cinema applications already exist, with a wealth of content and live-streaming options. Each member could don an Oculus Go, choose their channel or content and watch on a giant screen in their own private cinema with noise-cancelling headphones, reducing overall noise for other members (many also have age-related hearing impairments) and staff alike. I’ve already spoken to Darshan from Bigscreen about how this VR application could be utilised, so if anyone knows anyone at Oculus who would be willing to donate Oculus Go headsets to the charity, get in touch!
Final thoughts and conclusion
Many people I’ve spoken to about taking VR to Blind Veterans UK have scoffed at the idea initially, thinking that blind means blind and totally unable to see, so what’s the point? There are many degrees of blindness and vision impairment, and yes, whilst a couple out of about 30 members in total couldn’t see a sausage in VR, they still gained environmental audio cues and a sense of presence from the experience.
As a strong believer in and advocate for VR4Good // VR4Impact, it was truly the most moving, emotional experience I’ve had: putting blind people into VR and seeing the impact the technology could have upon their lives. I hope to continue the visits and to keep working with the charity to understand more about how it works and how it could be incorporated more widely.
So if you have an elderly relative with vision impairment, don’t forget to include them next time you are demoing VR to your family at a gathering.
*un-scientific, qualitative fact based upon my limited “test” group of subjects.
Photos used with permission from Blind Veterans UK