9 Years of VR — Ruminations & Snippets (full)

Sam Watts
53 min read · Oct 9, 2022


tl;dr — a series of posts about things that have happened to me since 2013 in VR.

Settle in folks, this could be a long one. As I sit down to start piecing this together, it’s mid-July 2022, the weekend after the week I was offered my soon-to-be new role as Content Partnerships Lead at HTC Vive / Viveport, and subsequently handed in my letter of resignation as Immersive Partnerships Director at Make Real Ltd.

I’ve ended up being a visible public face of the studio for 8 of these last 9 years, which has sometimes led to people hilariously confusing me for the CEO, MD, founder etc. Whilst I’ve been outspoken and passionate about the direction and strategy of the studio, those have always been, and will continue to be, the roles of Robin (MD) and Ben (co-founders) as I move on.

Anyway with that cleared up, a 3-month notice period awaits that will take me up to October. I’ve plenty of time to put some thoughts into words here, covering some of the [sharable] experiences and times had since I joined the company back in May 2013. Plenty of public posts, thoughts and opinions have been made online already over the years, the full collection of which you can read over on my pinned post “Adventures in VR Babysitting” here on Medium.

First I guess we should look at a little bit of backstory to help position where all this fits in. I first tried VR in the 90s, at the Trocadero Centre in London where they had Virtuality machines running Dactyl Nightmare. Those who have tried it will know it was, as expected from the computing power available back then, low-poly, un-textured, low-res, high-latency and pretty headache-inducing. Like most people back then, I was impressed by what VR could be but knew it had a long way to go until it would get there, and mostly forgot about it. I studied CAVEs and VRML at uni in the late 90s, watched The Matrix, read Snow Crash and followed relevant tech around gaming and immersion where I could.

I’m not a programmer; I dabbled with design on and off around a couple of previous roles, but my core strengths were within QA and project management. I’ve tried learning to code on a number of occasions but I just don’t get on with it. So when 2013 turned up with the Oculus Rift DK1, for me it was a rebirth of opportunity for VR, but with a mature mindset. Over the following years, this meant doing what I could do best to help others understand and believe through honest critique and awareness, whilst helping to build teams of highly skilled people who could make stuff.

This last point is key moving forwards. The industry around VR and other immersive tech has become heavily influencer-driven and we sometimes forget the pioneers who, over the previous waves, have gotten us to this point. VR has become a hype-focussed industry, certainly around the rebirth in 2013–2014 leading up to the commercial launches in 2016. There were many pivots by various companies hoping to cash in for a quick buck on the latest buzzword or hashtag. Periodically there’s a culling of the dead wood, or some of those companies go back to what they were doing beforehand.

But with each new term or cycle of tech, from the initial swell to standalone and most recently, around the metaverse, new faces jump on the bandwagon or adjust their job titles to remain relevant as an influencer or paid voice. Look to find the builders behind the speakers and pay attention to what they’re doing, rather than those who just shill others’ work for their own credit.

9 years of various VR and AR headsets (x50) I have had the pleasure of trying out

So over those 9+ years we’ve seen multiple iterations of headsets released, some more successfully than others, hardware companies come and go or never deliver, acquisitions for better or worse, and multiple “Years of VR” heralded as success or failure, depending upon the pundit. But as with every story, we must start at…

The [Re]Beginning

Through the mists of time, there’s one event that sometimes gets mis-credited to me. This event was pretty pivotal to what the studio became and whilst I was responsible for taking the Oculus Rift DK1 Kickstarter backer reward units and running with them in 2013, it wasn’t me who convinced the MD to originally support the campaign in 2012. VR is powerful, but you can’t do time-travel with it (although Lucas Rizzotto will claim otherwise).

The final deployment of the multi-channel projected construction site training simulator at BLSC

However, these two units arrived towards the end of my initial 6-month contract and turned everything upside down. I had been originally hired as Project Manager to deliver the next “3D team” solution, an updated version of a multi-channel projected construction site training simulator. This had first been deployed at Coventry University a few years before and was being repeated, albeit with updated content for Australian regulations, for the BLSC (Building Leadership Simulation Centre) in Melbourne.

At the time the team was small, consisting of myself (PM), a senior designer, a senior artist and a senior developer. We worked with an art outsource agency for much of the 3D content but the final “timeslices” were positioned, built and configured in-house.

The project was all-consuming but once done (involving the senior developer having to travel to Melbourne, Australia for final alignment and performance checks, literally a couple of hours’ work), we were without the next project. But the Oculus Rift DK1s had arrived and we’d played about with them a bit.

The first of what would become many VR headset headshots — the DK1 arrives

At the time, the company (known as Makemedia then) was a web company first, 3D company second. The web side was working with RS Components to create the online user tool for electronics, “DesignSpark”. Client stakeholders would often come through our little area, initially laughing at us looking goofy in headsets, but they were soon keen to have us build something for them. But what?

Thankfully the client was planning to announce their presence in China in early 2014, launching their electronics components catalogue at a trade show, “Electronica Productica”. So we started talking about how we could use VR to make something unique for their booth to attract the crowds. But again, what?

We already had experience with Unity and had been building screen-based 3D applications with it for some time. Early, rough VR SDKs and plugins were available, but back then it was all tethered and a fairly high-end Windows-based [gaming] PC was required. It was 3DoF, no tracked controllers or hand-tracking were available, and everything was Xbox gamepad-based.

It was decided to build a multiplayer racing game, set within a cityscape styled as oversized electrical components forming skyscrapers, with cables becoming the tubular race track. Based upon a Unity spline plugin tool, a track was made and comfort tests carried out, whilst the artist worked on the environment and cockpit. With no up and no down, and no horizon to visualise 360º rotation against, stomach contents were kept where they should be.
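
For anyone curious what “based upon a Unity spline plugin tool” boils down to in practice, below is a minimal sketch (not our actual code; the class and parameter names are invented) of placing track-segment prefabs along a Catmull-Rom spline so a closed, tubular circuit flows smoothly:

```csharp
using UnityEngine;

// Hypothetical illustration: place track-segment prefabs along a Catmull-Rom
// spline defined by control points, roughly how a spline-plugin-driven tube
// track could be assembled. Names and parameters are invented for this sketch.
public class SplineTrackBuilder : MonoBehaviour
{
    public Transform[] controlPoints;   // ordered control points (treated as a closed loop)
    public GameObject segmentPrefab;    // one tube section of the track
    public int segmentsPerCurve = 10;

    void Start()
    {
        int n = controlPoints.Length;
        for (int i = 0; i < n; i++)
        {
            // Four points are needed for each Catmull-Rom span (wrap indices for a closed loop).
            Vector3 p0 = controlPoints[(i - 1 + n) % n].position;
            Vector3 p1 = controlPoints[i].position;
            Vector3 p2 = controlPoints[(i + 1) % n].position;
            Vector3 p3 = controlPoints[(i + 2) % n].position;

            for (int s = 0; s < segmentsPerCurve; s++)
            {
                float t = s / (float)segmentsPerCurve;
                Vector3 pos = CatmullRom(p0, p1, p2, p3, t);
                Vector3 next = CatmullRom(p0, p1, p2, p3, t + 1f / segmentsPerCurve);
                // Orient each segment along the spline tangent so the tube flows smoothly.
                Instantiate(segmentPrefab, pos, Quaternion.LookRotation(next - pos), transform);
            }
        }
    }

    static Vector3 CatmullRom(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3, float t)
    {
        // Standard Catmull-Rom interpolation between p1 and p2.
        return 0.5f * ((2f * p1) + (-p0 + p2) * t
            + (2f * p0 - 5f * p1 + 4f * p2 - p3) * t * t
            + (-p0 + 3f * p1 - 3f * p2 + p3) * t * t * t);
    }
}
```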

Tentatively adding the environment, we jumped in again to see if it was any less comfortable. By now we also had the final cockpit 3D assets in place and overall, the level of comfort remained high. I wrote (& got paid!) about the design aspects for Polygon a few years later, after a longer featured blog post originally on Gamasutra.

So as not to dwell on this too long, suffice to say our first VR project deployment went off without a hitch (start as you mean to go on etc). Four high-end gaming PCs, two laptops to run the spectator screens and four Oculus Rift DK1s were sneaked into China, taken to the event under lock and key, set up, operated and returned to the UK safely. Overall the installation and experience was such a success that other booth operators were unimpressed by the audiences it drew, and the event organisers kept a close eye on the numbers, as the photo below shows!

Ready to jump on crowd control

The Nu[clear] Power Generation

Another of the early explorers of immersive technologies with the studio was EDF Energy, as part of their construction of the Hinkley Point C (HPC) nuclear power station and the various training content necessary both internally and externally.

3D render of the finished HPC nuclear power plant

In 2014 they were busy finishing off the fit-out of Cannington Court, a converted 12th century monastery in Somerset, near to the construction site of HPC, as a centre of digital learning excellence. For the next 8 years the studio worked with various facets and departments within EDF Energy, creating a range of immersive learning experiences for employees using the full gamut of the technology scale, from tablets to web to AR and VR to bespoke physical installations.

The first project adopted the Oculus DK2 and Razer Hydra controllers to allow users to construct the internal workings of a nuclear power station’s primary and secondary loops in a gamified experience. Thankfully when the HTC Vive was commercially launched in 2016, this could be upgraded to support a fully roomscale experience that travelled around various events aimed at increasing awareness of STEM employment opportunities for future female engineers.

Questionably titled “Pretty Curious” STEM education campaign

Over 20 deployed projects covered aspects of the fundamentals of nuclear power generation, smart meter installation and enterprise power sales toolkits, to name a few. Building content with Unity allowed projects to easily morph from a multi-user tablet experience for training to a single-player physical arcade cabinet experience installed at Glasgow Science Centre. This was so popular and well received that EDF Energy then commissioned a second version in upright cabinet form for their Cannington Court staff room. The most recent installation saw the revamped HPC Visitor Centre reopen post-lockdowns with an array of interactive physical learning experiences.

The Built Environment

“Hard skills” in VR are always an obvious choice for training scenarios, with the benefits of placing learners in dangerous simulated environments safely, enabling virtual access to plant assets too large or costly to put in a training yard and engaging with a digital workforce in new, exciting ways.

After BLSC the studio looked at how to utilise VR for similar outcomes or other use cases. BLSC was a costly project, requiring expensive projectors, live actors and a lot of space to operate. With VR, that could be recreated in the headset fairly cheaply in comparison.

The final BLSC installation

At the time, the BLSC content was built for Presagis Vega Prime, full flight simulator 3D database engine software, so it had to be converted for Unity use. This gave us a lot of content quickly, which we could use to start exploring VR environments and uses. Over the years, having created so much content around the built environment (a term we use to cover AEC, utilities, transportation and infrastructure), the studio has amassed a huge library of related 3D assets to repurpose. The BLSC timeslices and system have also since been upgraded to be fully Unity-based, avoiding costly annual licences and allowing the system to operate off a single monster PC/GFX card rather than four PCs as per the original design.
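
As a rough illustration of how one machine can now drive what used to take four, here’s a hedged sketch of multi-channel output in Unity (an assumed setup, not the actual BLSC implementation): each projection channel is just a camera routed to its own physical display.

```csharp
using UnityEngine;

// Minimal sketch of driving several projection channels from one Unity
// instance on a single PC, in the spirit of the rebuilt system described
// above (not the actual implementation). Each channel is a pre-angled
// camera rendering to its own physical output via Unity's multi-display support.
public class MultiChannelOutput : MonoBehaviour
{
    public Camera[] channelCameras;   // one camera per projector channel

    void Start()
    {
        // Display.displays[0] is the primary output and is always active.
        // Activate any additional connected outputs (projectors) at native resolution.
        for (int i = 1; i < Display.displays.Length; i++)
            Display.displays[i].Activate();

        // Route each channel camera to its own display index.
        for (int i = 0; i < channelCameras.Length && i < Display.displays.Length; i++)
            channelCameras[i].targetDisplay = i;
    }
}
```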

Whilst the HTC Vive had launched in 2016 with room scale and tracked input controllers, when we got early access to the Oculus Touch controllers before launch, we started to explore natural hand gestures and input afforded by the capacitive capabilities they offered.

This resulted in a prototype for Vodafone, which went on to become Working at Height and won the AIXR VR Awards 2019 ‘Best Use of VR for Training’ award. Initially the prototype was designed to test out climbing in VR and whether it was a realistic enough sensation to be useful for training purposes. This was tested in-house, within Vodafone and at many events we had booths at. The London Build event specifically saw a group of bored, drunk builders loudly informing us “VR’s bullshit, it’s not real” at 10am. However we soon had to help one gentleman out of the headset and prise his sweaty, locked-frozen hands off the controllers when he was halfway up the bullshit virtual mobile phone mast. Some words were eaten shortly afterwards.

Vodafone ‘Working at Height’ roof-top mast

This prototype proved useful for creating the full, final experience. We realised we didn’t need to make people climb a full-height tower (25m) as we weren’t looking to train them how to climb, just give the sensation of doing it, and it also took way too long for most people. The full mast became a rooftop mast with about 10 rungs, to provide the experience quickly and focus on the core area and outcomes of the content: raising the empathy of the maintenance crew managers.

After exploring soft skills for Severn Trent Water (more on that in another part), they wanted to look at improving the posture of manhole maintenance crews. This turned out to be one of our more complicated, yet interesting, experiences, where we had to extrapolate a full human posture from only three points of measurement/reference (headset and two tracked controllers). IK has been used for years for character and avatar animation and is fine for that, but we needed an extra level of finesse to be able to measure, rate and feed back to users on their specific lifting posture and movements.

We tried out all sorts of additional sensors strapped to various parts of our bodies but in the end just brute-forced a solution of our own to keep it simple and scalable for the organisation to use effectively.
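
Purely to illustrate the kind of brute-force estimate three tracked points can give you, here’s a hedged sketch (invented names and thresholds, not the metric we shipped) that guesses whether a lift looks like a squat or a stoop from how far the headset has dropped and pitched relative to a calibrated standing pose:

```csharp
using UnityEngine;

// Hedged illustration only: estimating whether a lift looks like a squat
// (knees bent, back fairly upright) or a stoop (bending from the back) using
// just the three tracked points a basic VR rig gives you. All thresholds and
// names are invented for this sketch, not the values used in the real project.
public class LiftPostureEstimator : MonoBehaviour
{
    public Transform hmd;             // headset (centre eye)
    public Transform leftHand;        // tracked controller
    public Transform rightHand;       // tracked controller

    float standingHeadHeight;         // captured during a quick calibration pose

    public void Calibrate()
    {
        // Ask the user to stand upright, then record their head height once.
        standingHeadHeight = hmd.position.y;
    }

    void Update()
    {
        if (standingHeadHeight <= 0f) return;

        // How far the head has dropped, as a fraction of standing height.
        float headDrop = (standingHeadHeight - hmd.position.y) / standingHeadHeight;

        // How far forward the head is pitched (0 = level, ~90 = looking straight down).
        float headPitch = Vector3.Angle(Vector3.up, hmd.up);

        // Hands near the floor means the user is down at the load.
        float handHeight = Mathf.Min(leftHand.position.y, rightHand.position.y);

        if (handHeight < 0.4f && headDrop > 0.25f)
        {
            // Crude heuristic: a big head drop with a fairly level head suggests a squat,
            // while a strongly pitched-forward head suggests stooping from the back.
            bool looksLikeSquat = headPitch < 45f && headDrop > 0.3f;
            Debug.Log(looksLikeSquat ? "Posture: squat (good)" : "Posture: stoop (review)");
        }
    }
}
```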

After being approached to help with some Microsoft HoloLens development in Unity, and realising it’s tough for a beginner dev, we ended up building a number of multi-user interactive schedules of works displayed in AR for pre-sales activities. These were so effective, they led to the formation of a JV between the studio and Keltbray Group, a large construction organisation in the UK.

This has led to a number of VR training applications being created over the past 3–4 years but, more importantly, a number of unique immersive tools for construction training, planning and rehearsal. SkillShield and Presentive were born out of necessity at the time but have since become valid products in their own right.

Early mock-up for some promo materials of ‘SkillShield’ irrefutable learner video evidence tool

Getting Interpersonal

Everyone and their dog has made some form of VR prototype around construction or health and safety; the nature of the technology naturally aligns to these kinds of experiences for hard skills. However, it was when we first met members of the L&D team from Lloyds Banking Group that we first dipped our toes into what is now the biggest sector for immersive learning experiences — soft-skills, or interpersonal skills training.

They really liked the idea of using VR but weren’t sure where and how finance could utilise it effectively. We eventually agreed with them to build a short concept experience around some of their existing training content (having a difficult conversation), to allow them to pilot, measure and build internal awareness of the possibilities.

Early soft skills VR training proof of concept screenshot

Suffice to say it was a success, and three more projects were signed off over a couple of years, growing in complexity and length with each release. From 15 to 150 to 1500 lines of branching-narrative dialogue, and from one to four featured main characters, these were rolled out on Oculus Rift initially but scaled when the enterprise Oculus Quest was released in 2019.

Dealing with empathy and embodiment, these experiences pushed the team in terms of pipelines and design, and eventually led to the studio creating its own processes for scripting, character mo-cap and animation, ensuring efficiencies in development that allowed more of the time budget to be spent on the scenarios and learning outcomes. These are continuously being enhanced and improved to pass those efficiencies on to the clients and budgets.

After his first appearance in one experience, the character Jonathan (who was having a really bad day) had to have a cameo via a telephone call with the learner in a follow-up project. So many people had connected with him and empathised with his situation, but the training content hadn’t provided an outcome for him as part of the narrative. To give them closure, we scripted a call from Jonathan to cover off how his life had improved and how he was back on the right track after his earlier interactions with the learners.

Being big fans of Mel Slater’s work at the University of Barcelona, and subsequently Sylvia Pan’s at Goldsmiths in London, both around virtual humans, empathy and embodiment, we took this a stage further for the experience created for Severn Trent Water. By swapping characters and hearing internal monologues and perspectives, Coaching was a powerful piece, nominated for a few learning awards.

Store cover art for ‘D&I Perspectives’

One of the good things to come out of The Drawing Board, which had been set up as an internal R&D skunkworks and one of our first ventures into understanding productisation and creating off-the-shelf content, was D&I Perspectives. Developed in partnership with a global financial institution, using four real-world stories from their staff, D&IP allowed the team to experiment with low-cost live actor capture and playback in VR, as well as exploring how more traditional elearning illustrated animation art styles would work in VR. Wrapped in the framework of an hour-long session on microaggressions in the workplace, it’s a powerful discussion-starting experience, now available on the Quest App Lab store to try out.

Standalone

Evolution of Samsung GearVR to Oculus Go

First there was the tether. Then we stuck our phones into Cardboard, or plastic, but that was a bit crap and put a lot of people off VR (although it was super cheap and accessible for those looking to get into VR development from less-affluent backgrounds). Then bits of phones were put into all-in-one devices to at least remove the faff of putting a mobile phone handset in, but it was still 3DoF and didn’t do positional tracking. Then we (they) stuck cameras on the sides of the 3DoF headset and lo! they became 6DoF full VR devices.

Tethered VR was always full of pain and friction points: external sensors to plug in, HDMI cables, USB3 ports that may or may not be good enough, trip-hazards, tripods, faff faff faff and barriers to adoption at scale. The benefit of tethered VR of course is the powerful PC it’s attached to, capable of rendering far more detailed scenes and geometry than current standalone device chipsets can dream of. But with the same level of interaction possible, for many the drop in graphical finesse wasn’t so important.

HTC Vive : VK1 > Pre > Pro > Focus

When standalone devices first started appearing commercially in 2019, a lot of our conversations with enterprise clients suddenly got a lot easier. All the pain points above were gone (although a new set of their own were added; more later), and removing the costly PC necessary to drive experiences made VR suddenly a lot more appealing, to home users too. The price point was much lower as well, although, as suspected and recently confirmed, some devices were undercut to enable market domination, hiding the true cost of devices or making others look more expensive in comparison.

This also meant new audiences were picking up VR for the first time. They might have tried it here and there at events or trade shows, or even in a VR arcade with the kids, but as shown in recent talks, whole new non-techy, non-early-adopter audiences are buying standalone VR headsets. This means devs have to cater for these audiences, or risk alienating them from their experiences if users don’t know what to do or how to do it when their first time in VR is in your app.

The early-adopter hardcore VR user was typically into gaming and already had a high-end gaming PC, or upgraded their graphics card (until recently, for a spell, at great cost), and was able to figure out this new medium and input. Typically the early focus was on gaming, rightly or wrongly, which to the rest of the world initially labelled VR as a thing only for gamers or techie nerds.

Thankfully, whilst we’ve lost Oculus Share and whilst there are 3,000+ SteamVR-compatible titles (many little more than a tech demo), places like SideQuest and now App Lab exist to allow devs to release experiences beyond just games to wider audiences beyond the tightly curated Quest Store. Of course there’s Viveport for PC, Vive Focus 3 & Flow (although Viveport supports all other major headsets too), and PICO are building their own consumer offering with the Neo 3 Link standalone device and storefront.

The world is starting to really wake up to the potential of VR for a variety of use cases; it took a long time coming but once “we” stopped pushing games first and foremost, traction started happening across all verticals.

From a work / enterprise perspective, these being bits of computing kit, we had to do a lot of work educating IT and security teams about how they could be managed and operated safely and securely within corporate networks. After a couple of false starts, and the unfortunate sunsetting of some services, enterprise management of standalone devices has matured somewhat, with a range of 3rd-party MDM tools out there to pick and choose from, treating devices like corporate mobile phones, laptops and PCs, locked down and secure. But still, we’re a long way from it being an easy onboarding path for many organisations, and studios are still having to do a lot of hand-holding and sales on behalf of the hardware manufacturers to enable adoption at greater scale than a couple of devices for a pilot or PoC.

Events

One thing I hadn’t ever considered or really wanted to do earlier in my career was be front and centre, out there in front of cameras or press, promoting titles or the studio. I was much happier in the QA basement or behind a project management task list, getting the things others would talk about made on time, within budget and scope.

VR though ignited this passion in me and a desire to tell everyone and anyone who would listen about it. And so I became the figurehead, spokesperson and external face of the studio over a period of a few years, initially driven mostly by the Radial-G Kickstarter in 2014, as someone had to promote, demo and make people aware of it to try to make it achieve its goals (it didn’t, more later).

One of the things I love as much as VR is beer, so it made sense to use a nearby public house as a practice ground to start being able to talk about VR to a crowd of (mostly) interested locals. So with slides in hand and some PCs, DK2s and GearVRs set up, I gave my first talk about VR (history, how it works, use cases, future etc) to a room of about 20 people upstairs at the Elephant & Castle pub in Lewes, on a cold drizzly night in Feb 2016.

My first VR talk, upstairs at The Elephant & Castle pub, Lewes, UK

Turns out, speaking in front of people isn’t so bad when you’re passionate about the subject, and whilst I got better at rehearsals and being a bit more scripted, most of the time I ran with the topics ad-lib depending upon the connection to the room at the time. I pride myself on my no-bullshit approach though; don’t ask me to sell things or push marketing speak. Honesty, truth, transparency and validation are key to winning the audience over.

With my first talk under my belt, I went all in for the next one, being asked to give the opening keynote presentation to set the scene for the TOMTech Storyhack event, as part of Brighton Digital Festival, at The Old Market theatre in Hove. Suddenly there were about 100 people in the audience and I had a stage, but more about that later.

1st stage & opening keynote for the vrLAB StoryHack event

As of today, I’ve given 50 talks and been on 40 panels since 2013 (plus all the rest) but nothing has yet matched the talk at IT.Weekend 2018 in Kyiv, Ukraine. The biggest stage and audience, flashing lights, smoke, mirrors and a 360º central rotating stage speaker “reveal” section made it a surreal experience. I pulled a reveal stunt, mimicking the well-known Ready Player One Wade Watts headset photo, ruined by the fact that the ops team started my slides on the second one, so it ended up making no sense. And the fonts were wrong (don’t use unique fonts in slides for talks, folks). My heart goes out to all the amazing people I met there in Ukraine, I hope you get your country back soon.

Lights, music, smoke and a rotating 360º stage entrance reveal

2018 was also the busiest year to date for events, where for a period of time we were doing two a week on average (this included client internal demos and discovery sessions). This was peak excitement around the next evolution of VR to a degree, with standalone devices being prototyped and dev kitted out to the studio.

Despite seeing a massive drop off this year, mostly because I’ve been saying “No, have you got any women on your manel?” to a lot of organisers to give other voices a chance of being heard, the tally sits at 400+ for all the interviews, podcasts, quotes, blogs, talks, panels etc. I’ve just been trying to educate and evangelise the potential benefits of immersive technologies but it did amuse me to be listed as 12th most influential person in VR at one point, apparently higher than Palmer Luckey and John Carmack. Gotta farm those hashtags people.

I started off years ago working for an elearning company and sometimes had to go to the Learning Technologies show in London each year. I never really enjoyed it as it wasn’t my area of expertise (at the time, QA) and LMSes and metrics tools are pretty b-o-r-i-n-g. When I left that world to move into the games industry in the early 2000s, I thought I had left all that behind, gleefully. It’s kinda funny then to see ourselves return to the show for the past 5–6 years, with an ever-bigger booth and presence. It’s become the biggest event for the studio, generating a large proportion of the leads for the following year for projects and new clients, so pretty important too.

I love it now, I can ignore all the LMSes still and focus on the company booth. LT22 was my last face-to-face event for the studio, back in May this year (unbeknown to me at the time) and it was also our greatest, and my best organisational result. [EDIT: Turns out it wasn’t, as I ended up having to attend Learning Live 22 in mid-September too.] Having a booth built for us for the first time felt like such a luxury after years of schlepping up to London in a van full of TVs, stands, desks, banners and whatnot. This year, overseeing the build (& being politely told to go get some lunch, i.e. fuck off out of their hair), just seeing it come together and everyone else’s faces as they rocked up with laptops and headsets ready to go, was a fantastic feeling and sense of accomplishment.

Learning Technologies 2022–1st built booth, a luxury

As I write this, knowing it will go live late September / early October, I am organising and working with the lovely peeps at Creative8 once again to build a smaller version of the booth for World of Learning 2022, where the studio team minus me will be running the Immersive Learning Zone. Be sure to go say hi if you’re attending on the 11th or 12th of October.

Ultimately though events are about one thing for me — community, and the VR community has grown, shrunk, changed and morphed over the years from the early DK1 days, but I love it and the members who consider themselves part of it, no matter their area of focus.

I sadly had to miss out on the first couple of Oculus Connects as at the time, the studio / team was small and so was the purse. Being in the UK, it was simply too great a cost at the time to pay for flights, hotels etc without knowing what would come of it. It still smarts a bit that I had to miss OC2 due to cost, but we bought a gaming PC and DK2 for Craig Charles instead. I did get to see Red Dwarf being filmed though and meet the legends of the show, so swings and roundabouts…

Installing VR at Craig Charles’ house, who was surprisingly tall

I did get to go to OC3–6 and am sad that the pandemic has made them all virtual since, even this year still. The physical Connects were always more than just meetings, talks and learnings; they were a chance to meet with the rest of the VR dev community, share stories, make connections, and meet face-to-face many you’d only ever chatted to over social media, especially being UK-based and travelling to the US for them.

From meetups with a few people to day-long events with 100s of speakers, VR events have always been great fun, getting to meet and see what people are working on, experimenting with and sometimes, secret behind-closed-doors stuff. From the very first VR Brighton meetup in 2014, where I backed the Altergaze Kickstarter for a 3D-printed Google Cardboard with amazing lenses and where there definitely wasn’t an Oculus Rift HD Prototype, to a later meetup where Tim Aidley showed his DK1 // PS Move modded setup to create a very early roomscale exploration experience, those early days were magical. Also getting to try out Oculus Touch for the first time, trying out the HTC Vive VK1 and TheBlu (my first genuine sense of presence in VR), trying out Santa Cruz, playing batshit VR game tech demos on the expo floor, these are just some of my favourite things.

Early roomscale VR with a DK1 and PS Move controller

We will continue to in-fight over terminology no doubt, and the community is more welcoming now than it was. But there’s still a lot of work to do for inclusivity, diversity and equality. I’ll do what I can in my new role once I understand the boundaries a bit better, but I’ll continue to call out manels and all-white-male speaker line-ups.

Fun & Serious Games

So after that crazy China trip to run a four-player networked racing game on DK1, the popularity of that demo sparked a lot of interest internally, with the client, and then with the early VR adopters. It also caught the attention of a couple of execs looking to get on the VR wave, who wanted to set up a JV to fund and develop the game.

Something special about seeing your game [demo] as the main header image

Radial-G was born out of that experience, with an unlimited single-player hotlap demo released onto the Oculus Share site for DK1+ owners, a place where devs could showcase and highlight game ideas. I miss Oculus Share. SideQuest does a good job but there’s a lot more noise and copycat content these days. I would later find out that the OG Sony Morpheus (what became PlayStation VR) team in London were keeping an analogue, flipchart-based lap time leaderboard in their office, manually updating their times from the single-player demo.

Thanks to the guy who designed the logo (sorry, it was a long time ago, I’ve forgotten his name now), Radial-G gained the : Racing Revolved suffix. Genius, I doff my cap to you sir. That single-player demo launched a Kickstarter in 2014, which, although ultimately unsuccessful in reaching its funding target, certainly made the early adopters and VR press aware of the title in the works. The Kickstarter was a lot of work, responding 24/7 to global backers. I wrote two post-mortem blog posts (pt.1 & pt.2) about it at the time for VRFocus. We did loads of events around it, including the first VR in a Bar, where I met Sammy and Bertie of Virtual Umbrella for the first time (this dynamic duo would become regular faces around the studio as we worked with them for PR, marketing and many events over the years).

We also kinda broke the internet when, after having seen Shu arrive at the Develop conference in Brighton that year and tweeting him to come try out the demo, a photo of him playing on DK1 using an Xbox gamepad (as we all know, the only controller that worked at the time) upset a few Sony fans. Anyway, we got a PS4 and Morpheus dev kit out of it and the road to Sony PlayStation VR was started.

Shuhei Yoshida, then President, Worldwide Studios, Sony PlayStation, plays ‘Radial-G : Racing Revolved’

VR dev was hard during the early years — Unity was unstable, SDKs were rough, and we had no visibility of when consumer devices would launch. 2014? 2015? 2016? Facebook bought Oculus in 2014, which, although a pretty marmite discussion point, showed that VR was something to be taken seriously and saw many other companies get involved.

We launched the first build of the new-look Radial-G : Racing Revolved on Steam Early Access at the end of 2014. Pressing the big green [LAUNCH] button was exciting at the time, after having blindly hoped each week before then that we would get picked out of the Greenlight queue to be able to do so. (We were also asked to build a version of the demo for the Oculus (pre-Facebook days) booth for DK2, with the new SDK, when our own units hadn’t arrived yet, so again, we were working blind doing so.)

As it turns out, it wasn’t until March/April 2016 that the consumer launches for PC VR happened. We’d spent all our dev budget in 2015 trying to get as much of the game done and ready for launch, which was still mostly an unknown future date at that point. Thankfully, a very special person reached out and helped support us with some final launch polish budget and an Oculus Rift Launch Title store slot (& some base-spec test PCs).

Our little booth at the Oculus Media Game Days ahead of Oculus Rift launch in March 2016

Two days in San Francisco for the Oculus Media Game Days flew by, before hanging around a bit longer for GDC that year. Up until that point, the events where we’d demoed the game had consisted of trestle tables, pubs and small-scale indie pop-ups. OMD was something else we (I) had never experienced before — professionalism. Oculus after Facebook always put on very well orchestrated and slick events, from setup to booth assistants to food and guest management. Back-to-back 30-minute slots with the gaming press were exhausting but at launch, getting listed in the Time and Verge Top 5 Oculus Rift Launch Title lists was totally worth it.

Radial-G : Racing Revolved with physical box for Sony PlayStation VR

The game did OK and generated enough revenue to cover the costs of the resumed Sony PS VR port. A deal was struck by the JV with a 3rd-party publisher to create physical versions to go into stores. However we weren’t party to that deal or the specifics, and to this day we still don’t really know how well it did. The JV seemed sensible at the time but over the years the relationship became strained and trust slipped away on both sides. Similarly we had no visibility of the deal that was done with another 3rd-party publisher to create the Quest version, Radial-G : Proteus, something that the original team had nothing to do with. It just appeared one day… It’s a good port, but it shows the limitations of our initial limited budget and the time available to create the layers of depth in the game we always wanted to.

But, Radial-G put the studio on the map. It gave us an amazing thing we could talk about publicly, something we sometimes can’t do with client work depending upon agreements and contracts. We learned a lot and we gained a lot, and whilst there’s many things we would do differently now if we could, the title will always hold a special place in my heart as a seed of opportunity that blossomed into a much larger studio and a new game.

After effectively having Radial-G in our lives on and off for 3–4 years, we wanted something light, bright, colourful and stupid after all the dark, moody, bangin’ techno sci-fi. Also this would be 100% our own title developed and published as the studio, under the studio name, no press-confusing alternative names and representations this time.

Early concept logo board mocked up for the prototype demo

Warp Lands was a concept the Creative Director had rattling around his head, after playing games like Bishi Bashi and Mario Party. Multiplayer, short, simple to pick up, tricky to master, all these sorts of things but with a stupid lilt to proceedings. We built a simple prototype concept demo that showed the core idea of playing around the metatable and transitioning into the table for a minigame. We had early access to Oculus Touch controllers and wanted to build something that explored what more natural hand-tracked input could give us and what types of interactions we could do with it. Thanks to our special friend once again, full funding (albeit comparatively small scale) was achieved mid-2016 and by late 2016, we had a team of 6–8 peeps working on it full time.

A large wall in the studio was marked out into quarters and post-it notes filled one of them, covered in two/three-word scribblings of minigame ideas. Star stickers were put on post-it notes as we voted for our favourite ideas, the winning ones being moved over to the next quarter, the prototype area. 2–3 days saw a prototype for each winning minigame idea knocked up with dev art and placeholder assets. It didn’t matter what it looked like; it was more about what it felt like to play and, most importantly, was it fun?

This process continued through alpha and beta and final until we had 16 minigames across 4 themes to create our initial launch metatable game board. The punt of the week paid off when, after locating Brian Blessed’s agent, he agreed to be the voice of our beloved Grand Sensei character, bringing his special British bombastic style to proceedings. (We actually initially wanted Bob Hoskins to do the voice, but that was impossible obviously.)

Unfortunately our special person was off work and we’d misinterpreted the contract. Thinking we couldn’t announce the game until we were told we could, or other parties did it for us, we failed to build up any awareness or anticipation for the game ahead of launch. We were also really struggling to know what to call it. As the design had progressed, the initial name Warp Lands didn’t make much sense or bear much relation to the game anymore. Two weeks before launch we finally settled on Loco Dojo and, realising only we were in control of our own announcing destiny, rapidly set about creating social media channels, posts, the official website and trailers, and getting the word out.

The initial ‘Loco Dojo’ logo

The game launched in April 2017 on the Oculus Rift Store, six months after signing the contract, getting the money in the bank and hiring the team to build it. We were, and still are, super proud of what that team achieved in that time with what was a pretty small amount of money in comparison, without crunching or making themselves ill over it, or impacting their families.

Three months later we launched again, this time on Steam, having wanted to give a window of exclusivity but also no longer having to go through the Greenlight process. One of the key learnings from Radial-G was to keep your game code and platform code neat and separate, so adding and changing platform support was a lot more straightforward this time around.
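
That separation is a simple pattern in practice. Here’s an illustrative sketch (invented names, not Make Real’s actual code): the game only ever talks to a thin interface, and each store or SDK gets its own small adapter behind it.

```csharp
// Illustrative pattern only (names invented): keep game code talking to a thin
// interface so adding or swapping a store/platform SDK touches one adapter,
// not the game itself.
public interface IPlatformServices
{
    void Initialise();
    void UnlockAchievement(string id);
    void SubmitScore(string leaderboardId, long score);
}

// One adapter per platform wraps that platform's own SDK calls.
public class SteamPlatformServices : IPlatformServices
{
    public void Initialise() { /* Steamworks init would go here */ }
    public void UnlockAchievement(string id) { /* Steam achievement call */ }
    public void SubmitScore(string leaderboardId, long score) { /* Steam leaderboard call */ }
}

public class OculusPlatformServices : IPlatformServices
{
    public void Initialise() { /* Oculus Platform SDK init would go here */ }
    public void UnlockAchievement(string id) { /* Oculus achievements call */ }
    public void SubmitScore(string leaderboardId, long score) { /* Oculus leaderboards call */ }
}

// The game only ever sees the interface; the adapter is chosen once at boot,
// e.g. via a build define or a settings asset.
public static class Platform
{
    public static IPlatformServices Services =
#if STEAM_BUILD
        new SteamPlatformServices();
#else
        new OculusPlatformServices();
#endif
}
```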

People love the game when they play it. It’s wholesome to hear big belly laughs as they play. If I’m ever feeling a bit down, I just find a YouTube video of some people playing together and remind myself of the stupid fun we brought into the world.

We started the Santa Cruz port in 2018 as soon as we got the dev kits. Unfortunately we found out it wasn’t going to get through the curation process for the launch of the Quest in May 2019, so we parked development to focus on other things. The game was doing really well in Location Based Entertainment (LBE) VR arcades, something we hadn’t considered or factored into the design (even though there’s an unreleased video interview of me telling Mike Diver // Vice back in 2014 that VR arcades were the future). Loco Dojo Fiesta saw a rapid version deployed to AlterEyes with just three randomly chosen rounds and none of the metatable gameplay elements, to create a tight 15-minute experience.

Then COVID happened in March 2020 and as countries around the world went into lockdown, our LBE revenues dried up overnight. It took 18 months or so for the world to reopen enough for LBE numbers to bounce back, but many venues didn’t make it, or changed the focus and types of experience they hosted.

Thankfully, just before lockdown, we met another special person who fought our corner internally to get Loco Dojo Unleashed greenlit for the Quest Store. After our own internal pitches to get the funding to continue development, the game finally made it to Quest in October 2021. This time we knew we could talk about it, and we made sure we did well ahead of launch, to give it its best chance of success.

Launch ‘Loco Dojo Unleashed’ store art for Quest

Within three weeks we had sold more copies on Quest than on the Rift Store in 5 years. Within three months we’d sold more copies on Quest than on all PC VR stores in 5 years. The Quest marketplace and user base is certainly much larger, but also very different to the early-adopter PC VR users. This is reflected by Chris Pruett’s talk at GDC this year about the Quest ecosystem, which ended up featuring my tweet of our unit sales on Christmas Day (with the important numbers cropped off, sorry).

I’m sad to be missing what will be Loco’s 1st birthday on Quest on October 7th this year, as that’s after the end of my notice period. But I do know the game smashed all our internal targets and expectations and turned the necessary heads in the right direction to start looking at the next thing. We’ve pushed Loco as far as it can go without adding more content, and a lot of work under the hood is needed to be able to add more content. But the game is now super easy to get into, with a variety of single-player modes (which it was never designed to do) and ways to play together, either online or locally. Later in 2022 and into 2023 it will appear on other standalone devices.

I’ve been working on the next thing up until the end; whether it be Loco 2 or something entirely new, it’s not my place anymore to tell you what it is. That is very much in the hands of one of the people who will replace me, and I know they are going to smash it in making sure you know about it.

I can continue to talk about some of the other gaming and LBE things we’ve made over the years that have been great fun to work on. One of them being the multiplayer competitive potato harvesting VR game we made for McDonald’s. Yep you heard me.

The VR Truck all setup for the first time for the media day event

In late 2015 we were approached by a creative marketing agency working with McDonald’s UK on a new campaign that would launch in 2016 to promote and raise awareness of their young farmer training programmes, “Follow Our Foodsteps”. They wanted a series of immersive experiences to be installed into a truck that would drive around the various agricultural and countryside shows across the UK, promoting 100% British beef, organic milk and free-range eggs, as well as the need to encourage and enable more young people into farming to make up the 120,000-person deficit needed to keep supplies growing and delivered to restaurants (bearing in mind this was BEFORE Brexit, COVID and 2022 Tory party collapse v4).

We agreed upon a series of simple touchscreen quizzes, some 360º video footage to be shown on GearVR and a hero VR experience using Oculus Rift. We had access to the DVT, EVT and PVT prototypes and dev kits, but the launch of the CV1 Rift was a bit unclear in terms of exact date and availability in relation to when this campaign was going to launch. But time was tight and dev on we must.

A project this size had a number of parties involved, from the distribution firm who provided the 14m truck, to the fit-out crew, the creative designers, PMs, event crew and us, building the digital content. At the time dedicated 360º cameras weren’t really a thing so two 180º cameras were strapped back-to-back and stuck on poles around dairy, pig and chicken farms and McNugget factories. Thanks to Tim at what would become FutureVisual for the assistance.

The biggest challenge here, after post-processing a few things out, was actually keeping the Samsung Galaxy phones cool enough on a hot British summer’s day during the events so they didn’t shut down or kill their batteries. Oh, and reminding the event staff not to hold the GearVR headsets by the straps with the lenses pointing up towards the unrelenting sunshine, which would happily burn a mobile phone screen in seconds if given half the chance.

The hero VR experience, “Top of the Crop”, was where most of the effort went though. Turns out driving a virtual tractor in a straight line, following a virtual harvester along a path at a constant velocity, was actually quite hard to do. Harder still was ensuring 3 high-end gaming PCs would operate stably off a generator power supply in the middle of a field. Thankfully we only had to rely on that once, as it turned out most of the sites had decent power provided to exhibition plots.
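
The basic version of “follow the harvester at a constant speed” is simple enough on paper; the hard part was layering player steering, physics and comfort on top. A toy sketch of that basic version (invented names, not the shipped code) looks something like this:

```csharp
using UnityEngine;

// Toy sketch only (not the shipped code): move the harvester along a fixed set
// of waypoints at a constant speed. The real difficulty was layering player
// steering, physics and comfort on top of something this simple.
public class HarvesterPathFollower : MonoBehaviour
{
    public Transform[] waypoints;     // path down the crop rows
    public float speed = 3f;          // metres per second, held constant

    int current;

    void Update()
    {
        if (waypoints.Length == 0) return;

        Vector3 target = waypoints[current].position;
        // Constant-velocity step towards the next waypoint, frame-rate independent.
        transform.position = Vector3.MoveTowards(transform.position, target, speed * Time.deltaTime);

        if (Vector3.Distance(transform.position, target) < 0.05f)
            current = (current + 1) % waypoints.Length;   // loop the route

        // Face the direction of travel so the trailer lines up behind.
        Vector3 dir = target - transform.position;
        if (dir.sqrMagnitude > 0.001f)
            transform.rotation = Quaternion.LookRotation(dir);
    }
}
```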

In-game screenshot from inside the virtual tractor cab

The other main challenge was determining how we would extend the Oculus Rift and sensor 10–15m from inside the truck, where the gaming PCs were in the tech room, to the physical mock-tractors the users sat on to play, located outside the truck, commonly in the sun. We must have tested nearly every USB3 and HDMI extension cable to find a handful of makes that would actively extend the signals in a way the Rift and PC were happy with, to work as normal.

The rest of the mock tractor comprised a real tractor seat and the official steering wheel and pedals from Farming Simulator. The units were made of wood and, with the seat and everything else, weighed a lot, taking four people to lift them out and in every morning and evening to set up and pack down the “booth”. Gaming steering wheels and especially the pedals are designed to be used at home, not in a field by excited farmer kids and adults with muddy boots and wellies on. We learned a lot about the importance of spares and how to replace things efficiently so as to reduce downtime as much as possible during the live shows.

The final challenge was the headsets themselves. Although the Rift launched in March 2016, we can all agree the pre-ordering and shipping was a bit of a clusterfuck at the time. We had units ordered but no way to boost them up the queue, or even a real sense of when they would arrive. Thankfully we were able to swing 6 sets of PVTs just before the media day and had special permission to be able to use them, as the consumer units hadn’t been delivered yet. It does mean many photos have “engineering sample” visible on the headsets, but thankfully it’s faint and wasn’t that obvious.

I think this was the only time we really used the Oculus Rift remote control for anything, but we put it to good use as an easy booth operator’s tool at events. The central button started/stopped/reset the VR game. Pressing left or right would swap the gender of the farmer avatar you saw for each player, and pressing up or down would realign the VR camera for each player. Everything needed to smoothly run the experience was controlled from that one little puck. Of course, there were plenty of spares of those too. We’ve had so many Rifts over the years I think I cleared about 50 of these remotes out of a drawer when we moved the studio one time. Only the OGs know what I’m talking about ;)
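
For the curious, the whole operator tool boiled down to a handful of button checks. A hedged sketch, assuming the Oculus Integration’s OVRInput API of the era (exact button enums varied between SDK versions, and the handler methods here are placeholders rather than the real code):

```csharp
using UnityEngine;

// Hedged sketch of the one-puck booth-operator tool described above, assuming
// the Oculus Integration's OVRInput API of the era (exact button enums varied
// between SDK versions). The handler methods are placeholders, not real code.
public class BoothOperatorRemote : MonoBehaviour
{
    void Update()
    {
        var remote = OVRInput.Controller.Remote;

        // Centre button: start, stop or reset the race depending on current state.
        if (OVRInput.GetDown(OVRInput.Button.One, remote))
            ToggleGameState();

        // Left/right on the d-pad: swap the farmer avatar's gender for each player.
        if (OVRInput.GetDown(OVRInput.Button.DpadLeft, remote) ||
            OVRInput.GetDown(OVRInput.Button.DpadRight, remote))
            SwapAvatar();

        // Up/down on the d-pad: recentre the VR camera for the seated player.
        if (OVRInput.GetDown(OVRInput.Button.DpadUp, remote) ||
            OVRInput.GetDown(OVRInput.Button.DpadDown, remote))
            RecentreCamera();
    }

    void ToggleGameState() { /* start / stop / reset the experience */ }
    void SwapAvatar()      { /* switch the farmer avatar model */ }
    void RecentreCamera()  { /* recentre tracking via the SDK call of the day */ }
}
```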

The 2nd iteration of ‘Reactor Runner’ in the upright cabinet form

Many of the other serious games projects involved physical elements too. Mentioned earlier, the ‘Reactor Runner’ game for EDF Energy was adapted from a team-communications game on tablet into an amazing single-player physical arcade cabinet, first in a pinball-style table for Glasgow Science Centre, then as an upright cabinet to go into the staff room at Cannington Court. The game itself was re-styled and refactored to gradually introduce the three roles before allowing the player to be the super-operator for the power station. This served its time for 5 years at GSC without fault, as far as I’m aware. Must have been those robust ‘Hard Drivin’’ gear selectors we used from the original arcade cabinets as control rod levers.

We also took on a few projects to port titles from one platform to another, for other studios, on titles that weren’t our own IP. Two of these were the early VR experience ‘Apollo 11’, from PC to Samsung GearVR and Google Daydream, and the follow-up experience ‘Titanic VR’, again from PC but this time to Sony PlayStation VR. We’re proud of the considerable optimisations undertaken by the team to get these titles working on far less powerful hardware, requiring far more efficient development approaches to do so.

Having started as a project manager, over time, as my role morphed freeform into whatever it became at the end as Immersive Partnerships Director (a title we made up to look good when I did talks), I became less and less involved with the day-to-day production of content and projects. We also grew and brought in far more skilled and dedicated professionals for production and project management, but it did mean I was sometimes a little disconnected from what the studio was working on once the biz dev aspect was done.

Promotional photo for ‘Chaos Karts’ when it opened in 2021, London, UK

My biggest regret is not having more to do with the most recent LBE deployment the studio did, which took the form of the awesome Chaos Karts. Simply (?) put, it involves a big empty space, a networked array of projectors, electric go-karts, a tracking system and a Unity-based game running everything under the hood. It’s far more complex than that of course and it’s an impressive tech stack that takes customers from booking to personalisation of their karts, speed-ups, weapons, slow-downs when they get hit (or try cheating by cutting virtual corners) and scoring. I’m just gutted that, thanks to COVID, lockdowns and timing, only two people from the studio have gotten to play the finished thing during install, and until a new venue is found, I will have to wait for my turn to try it.

Immersive Theatre

I first met James, the then events manager for TOM and organiser of TOMTech, at a VR / immersive tech and arts event in 2016 (we’ve racked our brains and neither of us can remember the exact details) and we got on immediately, both having a passion for pushing art and tech in new ways to create exciting, innovative new audience experiences (& beer). Over the next few years, we regularly met up to discuss the latest advances and plan the annual TOMTech vrLAB series of events and showcase that he (we) ran as part of Brighton Digital Festival each year (2016–2019).

We had a lot of fun organising, running and being involved with TOMTech vrLAB and I got to know the building intimately. My most surreal VR headset headshot came from the first year, thanks to Marshmallow Laser Feast and the In the Eyes of the Animal experience with customised DK2 “forest” headsets. When it came out on Oculus Go a few years later, I tried to recreate the experience.

Marshmallow Laser Feast: ‘In the Eyes of the Animal’ at vrLAB (left) & at home on Oculus Go (right) w/ Subpac

Thanks to ongoing support from Oculus we had plenty of Rift headsets and Touch hand controllers, which culminated in giving sets to the teams who partook in the 24-hour Touch hackathon we also held during the first year. I ended up sleeping overnight in one of the normal theatre dressing rooms, which itself had been dressed to resemble a jail cell, since we were showing The Guardian’s 6x9 life-in-a-prison-cell 360º video experience there. That was an interesting first few minutes, from waking to remembering where I was.

The Touch hackathon winners with their Oculus Rifts, with additional pizza sponsorship by Unity

We also had regular support from AMD in the form of VR-ready PC loaner kits, so we could ensure those wanting to show their work had stations set up without the need to bring their own (thanks Kevin Strange, AMD EU Dev Rel!). From our side we had four-player Radial-G on the main stage screen the first year and then Loco Dojo, but the surprise hit was the McDonald’s Top of the Crop setup, steering wheel and pedals too of course.

COVID put a stop to 2020 plans, along with many other things, but a little birdy tells me that the original year’s discussion between the arts and tech industry funding reps during the Storyhack component is what led to the creation of the CreativeXR annual funding opportunity for immersive experiences.

TOMTech and the studio teamed up to submit a proposal for the second cohort and were selected to receive £20,000 plus workshop support, for the chance to build upon the concept of Time Machine and the underlying delivery structure, PerformXR.

The ‘Time Machine’ team for CreativeXR

Time Machine was of course based upon the H.G. Wells story, and we set about volumetrically capturing Nicholas Boulton as the titular character performing the script written by Damien Goodwin, with our one live stage actor (/stage tech) Katy Schutte delivering a little improv on top to keep the story fluid.

It was wildly ambitious at the time, with 4 VR “time travellers” on stage in the same space at the same time together, but there were no colocation systems available so we had to build our own. The 4 AR “guardians” peeked into the virtual world and, together, the 8 immersive audience members collaborated to determine their experience outcome, of which there were 8 branching versions depending upon their performance during the “show”.
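
Roughly speaking, DIY colocation comes down to giving every headset’s tracking space a common origin measured from the same physical spot on stage. A hedged sketch of that alignment step (invented names; the real system also had to handle drift, networking and more):

```csharp
using UnityEngine;

// Hedged sketch of the core of DIY colocation (names invented; the real system
// handled drift, networking and more): each device places its tracked rig so
// that a physical reference marker on stage maps to the same shared origin.
// Assumes trackedMarker is a child of rigRoot, driven by the device's tracking.
public class ColocationAligner : MonoBehaviour
{
    public Transform rigRoot;          // the camera rig / tracking-space root for this device
    public Transform trackedMarker;    // a tracked controller held at the physical marker
    public Vector3 sharedOrigin;       // where that marker should sit in the shared world
    public float sharedYawDegrees;     // which way the marker should face in the shared world

    // Call this while the controller is held on the physical reference point,
    // pointing along the agreed stage direction.
    public void Align()
    {
        // Rotate the rig about the marker so its forward matches the shared heading (yaw only).
        float yawOffset = sharedYawDegrees - trackedMarker.eulerAngles.y;
        rigRoot.RotateAround(trackedMarker.position, Vector3.up, yawOffset);

        // Then translate the rig so the marker lands exactly on the shared origin.
        rigRoot.position += sharedOrigin - trackedMarker.position;
    }
}
```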

The studio agreed to match-fund internal development cost/time but still, the timescales and overall budget meant only a rudimentary prototype was possible, as long as you didn’t look at the string and elastic bands holding it together.

Unfortunately the concept was too forward-thinking and wasn’t an easily packageable content piece that could be funded to go into an immersive film festival, and due to the space needed to demo it, we struggled on the final marketplace days to run enough sessions to garner much interest in further funding.

It’s great to see people like Brendan A Bradley and Alex Coulombe doing so much in the multi-user immersive theatre space three years later, achieving much more effectively what we set out to do a few years too early.

Physical Reality

When I first joined in 2013 as project manager, we had one artist, one designer and one developer working as a 3D team within a web company. The business was situated in a typical Brighton office, a house converted into commercial property, and we were likely working out of what had been the lounge.

Early dreams of becoming the next tech unicorn

The company moved about a year later to much larger premises to bring us all together in one space, which was a big upgrade for the company at the time. As the team grew and we repositioned from being a 3D team to our own branded entity away from the web aspect of the company, we took a space in the corner of another company’s office within the same building.

More space for more monitors mostly, and first immersive technologies demo area

After what felt like squatting for a few months we decided to take the plunge and take a bigger space of our own, to coincide with the funding of Loco Dojo and the growth of the team to make it. New England House was cheap enough and spacious enough but after hot summers and windy winters, we were ready to head back “home” and take over the main office space, this time as our own with the growing team.

Early days in NEH, growing the team for ‘Loco Dojo’ development and more client projects

Unfortunately the building landlords were hoping another company in the building was going to rapidly upscale, so when the lease was due for renewal they gave us a kindly “fuck off” price, and so we did, to the delightful ex-government tax office building next door, along with a number of other companies presented with similar lease renewal prices.

Revamping our half-floor of Crown House for the main dev area and meeting rooms

Baby blue painted walls and a lack of investment made the building feel like a state hospital or school from the 80s, but it was large and spacious, and thankfully cheap as it was a short-term lease, ending in December 2020 when the building was due to be renovated into yet more student flats. However we spent time, money and effort to make our bit of the building ours and feel more modern and appropriate for what we were doing in there.

Then of course March 2020, COVID and lockdowns happened and we switched from a bums-on-seats studio to a fully remote one. Most of what we do and how we do it is hosted online in various clouds anyway (gone were the days of servers and storage in the office), so it was a relatively painless process, overseen by the awesome orchestrators facilitating taxis to ferry equipment and whatnot around to people’s homes.

After re-examining what people wanted out of a studio space, like many other companies did, we settled upon setting up a hub space people could visit to work from as necessary, or full time if they wished, as some did. I had met U+I, the original developers of what became the Plus X Innovation Hub building, many years earlier at a 5G event and was keen to move us there to take advantage of the coworking, collaboration and crossover between industry and academia on offer.

Seeing each other for the 1st time since lockdown in August 2020 to visit the Plus X Innovation Hub

Upon being accepted onto the BRITE open innovation programme, which helped the studio understand what productisation looked like and enabled us to start diversifying revenue streams away from purely agency day rates, we signed a 2-year lease on a 12-desk private studio space in Plus X in late 2020.

It's a fantastic space, designed to meet WELL building standards, with a great AirSpace rooftop bar area and lots of light, colour and air flowing through the building. I didn't use it as often as I could have, but it was always nice to visit for meetings, sorting out hardware bits, and sometimes just when I needed a change of scenery from working at home in my spare room, being driven mad by my three cats. But that was two years ago, and I can only presume we are re-signing to stay on for a while, as we all seem mostly happy with the working options as they are: WFH, flexi or in-studio.

Looking around, though, as we've searched for new buildings and studio spaces over the years as a company of roughly 30–40 people, the options in Brighton are generally pretty rubbish, expensive and don't offer a lot for your money. Plus X might be a bit out of town but it's well served by public transport to get into the centre or out to London etc. Many commercial property landlords could well do with looking at what Plus X has done and taking a few leaves out of their book to create workspaces that meet the needs of a modern tech sector.

Lockdown WFH standing desk & riser setup that's saved me numerous times in the spare room

As my new role is remote, albeit likely with a fair bit of travelling around meeting studios, I'm currently pondering what to do. I like working from home and I am incredibly fortunate to have a spare room to do so from, no kids, a garden and the glorious South Downs countryside 30 seconds from my doorstep. But having the ability to work from somewhere else, a change of scenery and a clear line between work and home is always good. I think we can all agree those boundaries have blurred somewhat these past two years, and strong willpower is needed to enforce one's own work/life balance.

Enterprise

Avoiding any "world's first VR" nonsense, the studio has been right at the forefront of educating, introducing and deploying immersive technologies into enterprise organisations since 2014 (at least for this current wave).

Early HoloLens and HTC Vive installation at EDF Energy Cannington Court

As covered already, clients like EDF Energy, RS Components, McDonald's and Vodafone were there at the bleeding edge with us. Many others wanted to get on board but were unsure how to utilise the technologies within their departments or sector. There were also many who were keen to work with the studio to understand how they could, what the barriers were, and to try to find ways around them, successfully or not. I'm happy to say most of those have since found valid use cases for xR within their organisations.

Even to this day, no matter how many newsletters, blog posts, events, webinars etc. we and countless other studios have put out, enterprise organisations still need a lot of hand-holding to get on board with xR. It is getting easier, but there are still so many misunderstandings and misconceptions around the costs, the process, and the design and comfort factors to overcome.

Running a Discovery Day for PwC L&D team

Ideally there would be an internal champion, someone who believed in the potential power of the technology when applied to validated use cases. The studio was approached frequently by organisations who wanted VR but didn't know why or what for; deploying technology just for the sake of ticking an innovation box, keeping up with the competition or because it was cool was never going to cut the mustard and provide value.

It takes a long lead time from initial discussion to sitting down and defining the scope of a project, something not all studios can swallow in terms of cost or pre-sales overhead. Many studios promised the earth and delivered little, or relied upon gimmicks to get a quick sale and a quick buck. Thankfully, most of those withered and disappeared in the drought of 2017–19, reducing the harm to wider adoption.

Enterprise organisations can also be huge, with multiple departments looking at the same areas without communicating with each other, whether through poor systems, silos, internal politics or competitiveness. The studio typically deals with the L&D or innovation teams, but xR is something a wide range of departments could potentially investigate, depending upon the nature of the business. Ideally you need all of these departments on board as early as possible, so one of them doesn't become a blocker later on just as you're about to deploy within the organisation.

Just some of the enterprise clients we've worked with (the missing 4 being the Big 4)

We typically avoid working with marketing agencies and creative service studios looking for a technical partner for a pitch. Whilst the budgets were always considerably larger, the hyper-focused nature of each project (delivered, then forgotten), the late, vague requirements, and not being able to pitch to or speak with the end client/user meant we were being asked to spend weekends putting slide decks together at short notice with no control over the outcome.

At one point we measured the success rate of creative agency pitches we'd been approached to provide late on a Friday for an early Monday morning presentation (that we wouldn't be part of), and over 2 years it was about 2%. At that point we decided that when a creative agency called up, we'd be polite and hear them out, but unless they would tell us who the client was and allow us into the pitch presentation, a thanks-but-no-thanks was their answer.

Learning to navigate the political inter-department minefield can sometimes be a challenge, especially when dealing with global organisations. Sometimes a project could be buried by one territory when it made their own efforts look poor in terms of quality or value, or country- and locale-specific challenges saw reduced impact upon deployment. Once a department has something to show off, however, it often leads to others wanting something of their own, and a bit of an arms race can develop.

HTC Vive & Tracker-based 5G rugby experience for Vodafone in their business centre

But that was all just making stuff; getting it into organisations was harder and much more lumpy over the years. Thankfully, due to more awareness and more mature supporting systems being available, this has become much smoother over the past few years, but each deployment brings its own headaches and issues to contend with.

Each headset manufacturer has had some form of enterprise support, from just being more open in terms of device access and control, to creating whole departments and big marketing pushes designed around business use. Thankfully, you can now get any device and configure it with a 3rd-party MDM (mobile device management) tool, without being reliant upon whether or not the manufacturer supports enterprise deployments with effective, up-to-date tools.
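To make that a bit more concrete, here is a deliberately hypothetical sketch of the kind of kiosk-style policy an MDM lets you push to a fleet of headsets. The endpoint, field names, token and package name are all invented for illustration; real MDM products (and headset launchers) each have their own schemas and enrolment flows.

```python
import json
import urllib.request

# Illustrative only: a generic "lock the headset to one training app" policy.
# None of these fields belong to a specific MDM product.
EXAMPLE_POLICY = {
    "device_group": "training-headsets",      # hypothetical fleet grouping
    "kiosk_app": "com.example.vr.training",   # placeholder app package name
    "allow_system_settings": False,           # hide OS settings from learners
    "auto_update_window": "02:00-04:00",      # patch overnight, not mid-session
    "wifi_profiles": ["SiteA-Corp", "SiteB-Guest"],
}

def push_policy(mdm_base_url: str, api_token: str, policy: dict) -> int:
    """POST the policy to a (hypothetical) MDM REST endpoint; returns the HTTP status."""
    req = urllib.request.Request(
        url=f"{mdm_base_url}/api/v1/policies",   # invented endpoint for illustration
        data=json.dumps(policy).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    # Just show the payload; wiring it to a real MDM is down to whichever tool you pick.
    print(json.dumps(EXAMPLE_POLICY, indent=2))
```

Whatever the tool, the value is the same: app updates, kiosk mode and Wi-Fi profiles managed centrally, rather than someone walking round a training room side-loading builds onto each headset by hand.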

It's a double-edged sword for a studio looking to gain exposure and awareness within a sector: over-relying upon one manufacturer for inclusion in their promotion, whilst being at the mercy of their engineering team and whether or not features will continue to be supported in the near future as rapid development occurs elsewhere.

So how to do it effectively? The studio always keeps a straightforward message: start small and iterate. Time is spent carrying out discovery days, where a mix of theory and practical hands-on time with devices and demos allows a department to fully grasp the possibilities, but also the limitations. Honesty and transparency are key to relationships moving forwards, building trust and being known for integrity. Unless you are looking for a one-off quick-win project, it's worth building these partnerships up from low-cost proof of concept, to pilot, to full-scale rollout across the organisation.

Hosting another Discovery Day in the old VRLab studio space

Ultimately you often have a one-shot chance to turn an organisation or a department into believers, which is likely to create more internal champions who in turn will open doors elsewhere, so that wider adoption occurs. As the market grows and opportunities increase beyond the hype and marketing fluff, a rising tide floats all boats in the longer term. Why jinx that for a quick buck that damages wider adoption overall?

Membership Organisations

Worth a quick note here at the end of this post about immersive technology membership organisations. Over time, the studio has been part of most of them and like all things, you get out what you put in, to a certain degree.

I appreciate these organisations need funds to continue operations, but there are times when, as a studio, you feel you have to be a member to be validated yet you don't really get a lot back out of it. Some are more US-centric, others better for studios located in the EU (or UK); some have great deals and opportunities at events for exhibiting or speaking; some host industry awards to drive everyone forwards. Others are a bit cliquey and feel very much like an inner circle if you are on the outside looking in.

Winners of the AIXR VR Awards 2019 “VR for Training” and some of the team looking spangly

You have to do your research and determine which one best suits your studio's needs at that time, or in the near future, and how much you can afford to pay to become a member, in financial terms but also in your own time investment. What they should all be doing, though, is not hiding member lists behind the membership subscription, so that studios (if they opt into being publicly listed) can promote themselves and be found easily, helping with awareness and lead generation.

xR4Good

Scrolling through some old photos reminded me to write this section and I can’t believe it wasn’t in my initial thoughts of what I’d write about as it’s the most important area of my work to me! Duh…

In all my previous roles I've always tried to ensure there is an element of reaching out and inclusion, or ways of using technology for good. From incorporating W3C WAI accessibility standards and testing into training content in 2002, to launching a games QA NVQ Level 3 certification for NEETs in 2007 to provide bridges into the industry, I wanted to make sure that as a studio we had a strong focus on helping people get into the industry, or on using the tech to improve lives for others.

The early VR headsets were this big… talking to the Coder Dojo : Brighton Saturday morning club

The approach started small, in line with the size of the studio, team and available budgets, but gradually grew over the years, as did the studio, the team and the financial security to spend on giving back. Talks in schools and code clubs were relatively inexpensive, just an hour of my time here and there, but bigger projects, like working with Stay Up Late, took multiple days of my time over longer periods.

Doing talks in schools, colleges and universities led to working with Declan Cassidy (and what has now become the amazing Into Games organisation) to run a test virtual work experience week with a local college for NEETs, DV8 Sussex. Thanks to the brilliant teaching of Nick Dunn at the time, 7 students set about operating a game dev studio from the college for a week, to design, build and demonstrate a game they'd made from our brief, in this case a new minigame for what could be Loco Dojo 2.

Pilot virtual work experience week presentation with the DV8 Sussex students and Declan Cassidy

We popped in physically at the end of their first day, amazed to discover that they had arranged themselves very efficiently, with a student studio team member each looking after design, art, code, testing, marketing, production and project management. At the end of the week they came and presented their idea to the team and had a working demo to try out. It was better than some of the initial prototypes we'd concocted when originally developing Loco Dojo back in late 2016. It really captured the sense of stupid nonsense we based the game around, but also showed a great understanding of what makes interactions fun in VR. I truly hope one day "Not Joust Any Sandwich" gets to live on as a fully polished minigame. It's amazing to see that initial idea blossom into what Into Games now is (you should totally become a Video Games Ambassador if you can spare some time).

At one of the many VR events I went to, I met Paul Richards, who heads up Stay Up Late & Gig Buddies, the local charity for people with learning disabilities. I was already aware of Paul's work through my wife and her work, via an introduction to Heavy Load, a punk band whose members have learning disabilities. We got chatting about using VR for good, around his idea of building social confidence amongst their members to visit venues for events and music acts.

Paul and one of the Stay Up Late members making the most of having a comedy stage to themselves

Some discussion later, we decided to trial filming a venue with a 360º camera and viewing the footage back on a VR headset, to see if a sense of the layout of a venue, plus what it felt like to attend a live event there, would help. Then I remembered the Google DayDream Impact programme, where Google were sending their monster Jump Odyssey 360º cameras (16 GoPros strapped together into a 3D-printed networked rig) out to good causes to film documentaries and narrative experiences.

The Google Jump Odyssey 360º camera rig

One application and a few months later, a massive Pelicase and box of phones and Google Cardboards arrived at the studio. Meeting Paul and some of the other Stay Up Late members, we talked about filming, what we would want to film to show people, and which venues we should approach to film at first. I taught the members about VR and how to use the camera before setting off to our first venue, the ever-approachable The Old Market theatre (who already ran Stay Up Late disco nights for members).

Series of shots from TOM venue filming

We ended up filming 6 locations around Brighton over a couple of months: TOM, Brighton Centre, Brighton Dome, Komedia, Concorde2 and Loading. We filmed a variety of areas of each location when closed, to be able to focus on layout and location, and then shots of live events and shows (where allowed). You can see all of these in the playlist on the Make Real YouTube channel.

Filming at Loading was during Develop game dev conference, where a Stay Up Late member met his hero

For the final location filming, we were joined by the Google DayDream Impact documentary team, who wanted to feature our efforts and story in a film they were making about the programme. They spent a day following a member around doing everyday stuff in their life before meeting with a Gig Buddies ambassador to attend an event at the location. They also filmed us filming the location and preparing the film for people to see and use.

The Google DayDream Impact documentary team film a Stay Up Late member

Unfortunately, shortly after that, Google killed off the DayDream View VR headset and the Impact programme, and recalled all the Jump 360º cameras, so the documentary never saw the light of day; we've not even seen an unlisted rough cut or anything. A big disappointment for all, after all the hard work and effort that went into it.

Other organisations and things we sponsored over the years include Codebar, a global charity set up to promote the growth of a diverse and inclusive tech sector. The local Brighton branch meets regularly, hosted by sponsor organisations who give up some of their studio space for meetings and provide drinks and pizza for the 30-odd members and code coaches. We did this for a year, whilst we could, with the space in our last location before lockdown. Whilst most of the regulars and coaches were learning web development, there was an interest in how to code for Unity and VR too.

We also sponsored the Meetup fees for the Brighton Indies game developers meetup group, which met once a month in a local pub and weekly for morning coffees. The purpose of the group was to provide community, interaction, support and advice for local indie game developers looking to grow their skills and talent within all aspects of game design, and just be a friendly place to hang out and meet like-minded people, open to all.

Some of the team posing for Special Effect ‘One Special Day’ charity fund-raising event

We also supported the disabled gamers charity Special Effect through their One Special Day annual campaign, donating the day's sales revenue to the organisation each year. Whilst never huge amounts of cash, every little helps.

My most memorable and profound experience was my two trips to the Blind Veterans UK Brighton location, as part of their regular tech weeks, to give talks about VR and run demos for the members. BVUK supports veterans with partial or full sight loss through service. I've written about the experience in greater detail previously, which you can read here, but the impact the technology had at the time, and could have in the future, still lingers with me today.

A Blind Veterans resident, an ex-radar operator, explores under the sea in TheBlu on HTC Vive Pro

As I come to the end of my time at Make Real, it's great to have gotten a project off the ground that I started 4 years ago: putting VR training into prisons to give inmates skills that could be used upon release, to hopefully give them employment and an improved chance of not reoffending.

Literally my last weeks were spent running a pilot at HMP Isis YOI in partnership with Bounceback and Keltbray, delivering sessions around construction training that will hopefully be measured and deemed effective by City & Guilds, so they fund and roll out a wider programme of access and enablement across the country.

Wrap-Up

I've waffled on long enough and it's now October (maybe even November, if it took you that long to read), and I'm ready to publish this series of blog posts as I prepare to announce my new role publicly. Time to scrub my brain of which company I mean when I say "we", "our", "us" etc…

It's been an amazing experience working at Make Real over the past 9.x years: I've pushed for a lot, been given a lot of freedom, and the studio team have created some remarkable experiences. I'll always be thankful for my time there.

Make Real studio summer party 2021 on the Plus X AirSpace rooftop terrace

They're also really nice people: intelligent, experienced and super-awesome at what they do. Watching the team grow from 4 when I first joined to the current approximately 35 full-time people, plus a number of freelancers, has been a joy. I'm gonna miss those crazy bastards, but no doubt they will still be in my life in various forms at events etc. as I step into the new role at HTC Vive. Be sure to tap them up to talk about creating immersive learning experiences; there are many folk there now far cleverer than I am. They thrive on the challenge of something thought impossible; they always make it possible.

Of course if you’re reading this and have a VR experience you’re looking to bring to more platforms like Viveport, with Vive Pro 2, Focus 3 or Flow support, reach out to me in my new role and let’s have a chat about how we can work together in the future.

Onwards to the random bits and bobs >>>
