Zubr ARKit Hololens style augmented reality turbine hologram outside Zubr studio in Bristol UK

VR/MR/AR convergence ever closer with Apple’s ARKit

Here at Zubr, we tend to allow ourselves a wide margin for error when it comes to defining the boundaries between augmented, mixed and virtual reality.

Perhaps more than any other company, we’ve had our AR/VR/MR boundaries mixed and mashed up right from the start – and I think that’s a good thing. Never have we concentrated on one format and then suddenly ‘discovered’ another – it has always been more like “AR/MR content that switches between mono and stereo view!” “AR-enabled positional tracking for VR content!” “A scene that starts in MR until you step through and it becomes VR!” “Mobile AR containing mini VR scenes!”

“We don’t think of Google Cardboard as a real VR device, because it doesn’t have six-degrees-of-freedom positional tracking.”

“AR and VR are two completely separate things”

“AR is dead. It’s all about MR now.”

Challenge(s) accepted!

Google’s 2014 Cardboard design featured a camera passthrough hole and 3D tracker token

2014: Bringing Vuforia and Google Cardboard together

After experimenting with the Vuforia AR engine in 2014, we weren’t really interested in making plain augmented reality. Despite having previously worked at Bristol’s leading AR technology supplier Kudan, and being well aware of the possibilities, our hearts were firmly set on exploring what we could do with Google Cardboard. Excited by the presence of a hole for camera passthrough, and even a ‘3D token’ disc supplied with every Cardboard, we started blending Vuforia’s AR and Google Cardboard’s VR capabilities together – something that, to our surprise, no one else was interested in at the time.

One of Zubr’s 2014 experiments in combining augmented and virtual reality

We arrived at a point where we would use AR image targets of different sizes and shapes to position 3D content in space, viewed in true stereoscopic 3D. Larger image targets could even be used to enable six-degrees-of-freedom positional tracking for non-AR virtual reality scenes. An animated GIF of one of our early demos of this type, posted on Twitter, drew the attention of Google’s VP of Virtual Reality, Clay Bavor, and Vuforia’s Head of Marketing, Liz Philips – the latter perhaps surprised that we hadn’t built it with Vuforia’s own VR compatibility toolset, which had only just been released.
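To make the trick concrete, here’s a minimal Unity-style C# sketch of the idea – the class and field names are illustrative, not our production code. Because the printed image target is fixed in the real world, inverting its tracked pose gives you the device’s pose, which is enough to drive a six-degrees-of-freedom camera rig:

```csharp
using UnityEngine;

// Illustrative sketch: derive 6DoF positional tracking for a VR scene from
// the pose of an AR image target. The AR engine reports where the target
// sits relative to the device camera; since the printed target is fixed in
// the real world, inverting that pose tells us where the device sits
// relative to the target.
public class TargetDrivenHeadTracking : MonoBehaviour
{
    public Transform trackedTarget; // pose updated by the AR engine each frame
    public Transform cameraRig;     // the stereo rig rendering the VR scene

    void LateUpdate()
    {
        // The inverse of a rigid transform [R|t] is [R^-1 | -(R^-1)t].
        Quaternion rigRotation = Quaternion.Inverse(trackedTarget.localRotation);
        Vector3 rigPosition = -(rigRotation * trackedTarget.localPosition);

        cameraRig.localRotation = rigRotation;
        cameraRig.localPosition = rigPosition;
    }
}
```

The bigger the printed target, the further away it remains trackable – which is what made larger targets viable for whole-room positional tracking.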

2015: Augmentation with depth-sensing devices

To build on the success of the original low-cost Cardboard VR device, Google unveiled the Cardboard V2 in 2015. Though many aspects of the design were improved – such as the lenses, build quality and portability – Google removed the camera passthrough hole from the front. In doing so, Google effectively chose to separate AR and VR, at least for the time being – perhaps in recognition that no developers had created any meaningful AR/MR content with the first Cardboard headset.

Running in parallel to the Cardboard effort, a different team at Google was working on Project Tango – a depth-sensing mobile device geared entirely towards augmented reality. Of course, it wasn’t long before people started exploring how they could bring these augmented and virtual developments together.

This is where Microsoft comes in…

2016: Microsoft Hololens woos the corporate crowds

A new type of headset – its visuals based on the centuries-old technique of Pepper’s Ghost, its depth-sensing abilities derived from the Kinect – the Microsoft Hololens very quickly became the ruler of a new segment of the industry, keenly pushed by Microsoft, called Mixed Reality.

It’s an impressive new direction for head-mounted hardware, and we’re certain that there’s a great deal of mileage in it, but sadly, the Hololens currently suffers from its very own hype train. Perhaps spurred by the business-class-feeling £3,000 price tag and relentlessly circulated stock images of ‘business users’ reaching out to touch ‘holograms’ in amazement, the Hololens has made incredible inroads into corporate environments, being demonstrated as an effective option for technical training and health & safety, amongst other uses.

And, in a clichéd attempt to prove its worth to tech-savvy business users – despite having a field of view the size of a business card – most of those stock images you see will include a little OTT. No, not Over-the-Top – I’m talking about the Obligatory Turbine-Thing…

Zubr Microsoft Hololens

A Business Hololens User indulges in his OTT

Meanwhile, as we move into 2017, we can see increasing evidence that the previously almighty VR hype train is slowing down fast. Sadly, some VR companies depended heavily on that hype translating into real business, and many are now struggling.

So what fresh innovations are popping up to keep everyone interested? Enter Apple.

2017: Apple unveils ARKit

Speculation had been circulating for over a year that Apple was planning a major augmented reality toolset, fuelled by a number of relevant business acquisitions. In spring 2017, the rumours turned out to be true – Apple unveiled ARKit, their awe-inspiringly robust trackerless SLAM (simultaneous localisation and mapping) augmented reality system.

We had already been testing our content with Kudan’s impressive SLAM system, which works across iOS and Android – so we were quick to try ARKit, and we were not disappointed with the results…

“Enables Developers To Create the Most Innovative AR Apps for the World’s Largest AR Platform.”

Naturally, we’re absolutely delighted that robust SLAM for mobile devices is finally here. Needless to say, it wasn’t long before we went full circle and created a stereo split for our ARKit demos, meaning you can experience the powerful AR engine through an immersive head-mounted display. In fact, we went full-on Microsoft, and created our very own Obligatory Turbine-Thing demo…
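If you’re curious how little is involved, here’s a hedged Unity-style sketch of the stereo split itself – illustrative names only, and it assumes something like an ARKit plugin is already driving the parent object’s pose. It also skips the per-eye lens distortion correction a real viewer needs:

```csharp
using UnityEngine;

// Illustrative sketch: render a side-by-side stereo split of an AR scene,
// so an ARKit-tracked phone can be dropped into a head-mounted viewer.
// Assumes the AR engine is already driving this GameObject's pose.
public class StereoSplit : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    public float interpupillaryDistance = 0.064f; // metres, a typical IPD

    void Start()
    {
        // Each eye camera renders to one half of the screen.
        leftEye.rect  = new Rect(0f,   0f, 0.5f, 1f);
        rightEye.rect = new Rect(0.5f, 0f, 0.5f, 1f);

        // Offset the eyes horizontally either side of the tracked head pose.
        float half = interpupillaryDistance * 0.5f;
        leftEye.transform.localPosition  = new Vector3(-half, 0f, 0f);
        rightEye.transform.localPosition = new Vector3( half, 0f, 0f);
    }
}
```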

For many people, this is an exciting new development that has enticed them into the augmented reality world; it’s simple, yet powerful enough for anyone to focus on creating fun and original content instead of worrying about how to set up tracker images – which is great. Some VR developers have even hopped straight over to ARKit simply to explore its abilities for six-degrees-of-freedom positional tracking, and they haven’t been disappointed!

2018: Standalone VR, Affordable MR, and Robust AR will bring convergence closer than ever

So, if we look at the big picture at this point, it’s not hard to see that we are on the cusp of a real AR, MR and VR melting pot – with technologies overlapping, developers crossing boundaries, and even content that swings from one format to the other at the touch of a button.

VR hardware will be reinvigorated by the release of new standalone headsets – independent of PCs and yet more capable than smartphone headsets, they hold the potential for a renewed push to bring VR to the mainstream.

Also, dedicated MR headsets will soon be affordable and versatile – for business users and entertainment purposes alike.

Finally, AR is likely to retain pole position out of the three technologies in regards to general uptake, at least for a while.

For us, it’s very exciting that hybrid experiences and an ‘XR’ approach are finally gaining traction – although that is business as usual in our studio; looking at our portfolio is rather like taking an ‘Is it VR, MR or AR?’ quiz. As for the big players: we’re all expecting Apple to cosy up more with VR technologies, especially now that ARKit is out there. For Google, expect much-increased interplay between their VR and AR ventures, as they attempt to catch up with Apple on the AR side.

Oh, and Google – We knew you were making a mistake when you filled in that camera passthrough hole on the Cardboard V2 and Daydream headset. If you could please bring it back for your next headset, we won’t have to use craft blades so often!


Zubr realtime mixed reality camera rig with virtual viewfinder

Introducing: Zubr VR Mixed Reality Video Studio

You might think that we spend most of our time developing AR and VR applications for smartphones – but we actually do plenty of experimental R&D with different cameras, scanners, VR headsets, inputs and physical setups. It’s all part of the philosophy of pioneering new ways of making content, and continuing to create accessible experiences with off-the-shelf hardware, which we then feed into our other projects.

One of these research projects is our ongoing exploration into a Mixed Reality Video Studio.

Meet Mixed Reality Video

Making a mixed reality video is generally understood as the practice of creating a video that makes it look as though the VR user is actually placed inside the virtual world they are seeing.

As more and more studios create awesome VR content to show off, the popularity of producing mixed reality videos has risen sharply. For most purposes, a simple green screen setup and basic camera synchronisation will produce a pretty sweet video. But what about making something a bit more advanced, where you want to embed the video footage right into the middle of the scene, with virtual objects not only behind but also in front of the person?

As with any video production, you can always push your footage through a conventional post-production pipeline, spend some solid hours compositing it with foreground elements in Adobe After Effects, and end up with something broadcast-worthy. That’s fine, but it can easily become restrictively expensive, and, well, we’re not a video post-production house.

Depth test with Jack and Laika

Realtime leads the way! Again!

Which brings us to the next possibility: compositing video footage directly into the VR game engine, in real time.

We’ve played with depth-sensing cameras such as the Microsoft Kinect since the beginning. We have Kinects, ZED cams, RealSenses and Tangos permanently scattered all over our studio, being put to use for game inputs, volumetric video capture, 3D scanning and so on. So why not use those depth-sensing abilities for compositing mixed reality videos?

Well, that’s what we thought when we started using the Kinect V2 for some early efforts at realtime MR compositing. However, the ‘bubbly’ noise produced by the depth image, coupled with infrared interference with the HTC Vive, made the Kinect V2 a difficult choice for this. That’s why we started looking into Stereolabs’ ZED cam – which calculates depth values from the disparity between its two RGB cameras.

Clearly, many people around the world are exploring similar ideas. Most notably, at the same time that we were getting stuck into it, the geniuses at Owlchemy Labs – a VR games company in Texas – blew away any expectations of what can be achieved with realtime depth compositing with the video clips and explanations on their Mixed Reality Tech blog. With their VR game Job Simulator being a massive hit across all the high-end VR devices, these guys have clearly made a priority out of finding an intuitive way to show audiences what it looks like to see a person immersed in their virtual scenes.

Anyway – the ZED cam does a very nice job of producing a realtime depth map. It isn’t perfect – part of the depth calculation is an algorithm which essentially smooths and estimates some depth values; notice the soft, blurred areas in the depth map above. But for the most part, it’s very good. That is not to say it is in any way easy to make this thing work how we wanted it to – blimey!
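For the terminally curious, here’s a rough CPU-side C# sketch of the core compositing test – purely illustrative, since in practice this sort of thing lives in a GPU shader. Each pixel is chroma-keyed, then the camera’s measured depth is compared against the virtual scene’s depth, which is what lets the subject appear in front of some virtual objects and behind others:

```csharp
using UnityEngine;

// Illustrative sketch of per-pixel depth compositing. A real implementation
// would run as a shader; the logic is the same: key out the green screen,
// then depth-test the live subject against the rendered scene.
public static class DepthCompositor
{
    public static Color32[] Composite(
        Color32[] videoFrame,  // RGB frame from the depth camera
        float[]   videoDepth,  // measured depth per pixel, in metres
        Color32[] sceneFrame,  // the rendered virtual scene
        float[]   sceneDepth,  // linear depth of the virtual scene, in metres
        int width, int height)
    {
        var output = new Color32[width * height];
        for (int i = 0; i < width * height; i++)
        {
            // Crude chroma key: treat strongly green pixels as backdrop.
            bool isBackdrop =
                videoFrame[i].g > 100 &&
                videoFrame[i].g > videoFrame[i].r * 1.5f &&
                videoFrame[i].g > videoFrame[i].b * 1.5f;

            // Show the real subject only where it is un-keyed AND closer to
            // the camera than the virtual geometry at that pixel.
            bool subjectInFront = !isBackdrop && videoDepth[i] < sceneDepth[i];

            output[i] = subjectInFront ? videoFrame[i] : sceneFrame[i];
        }
        return output;
    }
}
```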

So, we mounted this camera on a special rig along with an HTC Vive controller (for positional tracking, keeping it synchronised with the position of the virtual camera), an Android smartphone (running a realtime virtual viewfinder app so the cameraperson can see what they’re filming), and a game controller (letting the cameraperson adjust virtual zoom, exposure and lighting controls).
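In engine terms, the rig boils down to something like this hedged Unity-style sketch (names illustrative): the virtual camera follows the Vive controller’s tracked pose through a fixed calibration offset, while the cameraperson’s zoom input drives the virtual field of view.

```csharp
using UnityEngine;

// Illustrative sketch: lock the in-engine camera to the physical camera rig.
// A Vive controller rigidly mounted on the rig provides the tracked pose;
// a fixed calibration offset accounts for the physical distance between
// the controller and the depth camera's lens.
public class MixedRealityCameraSync : MonoBehaviour
{
    public Transform viveController;    // pose supplied by SteamVR tracking
    public Camera virtualCamera;

    public Vector3 positionOffset;      // lens position in controller space
    public Quaternion rotationOffset = Quaternion.identity;

    public float zoomFieldOfView = 60f; // driven by the rig's game controller

    void LateUpdate()
    {
        // Apply the calibrated offset in the controller's local space.
        virtualCamera.transform.position =
            viveController.TransformPoint(positionOffset);
        virtualCamera.transform.rotation =
            viveController.rotation * rotationOffset;

        // Map the cameraperson's zoom control onto the virtual FOV.
        virtualCamera.fieldOfView = zoomFieldOfView;
    }
}
```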

Our progress so far

  • Composites video directly into the game engine in realtime
  • MR Camera Rig includes depth camera, HTC Vive tracker, input controller and realtime virtual viewfinder
  • The camera operator can adjust virtual zoom, exposure and lighting controls from the physical camera rig
  • The subject can both cast and receive light and shadows from its virtual surroundings
  • The subject is correctly depth-sorted in the scene, even behind transparent and translucent objects

Current areas being worked on:

  • Our greenscreen/keying setup is a quick fix – it’s the reason for the dodgy edges in the video, NOT the depth feed! So, basically, we need to upgrade our keying facility.
  • Lighting and shadows can be smoother
  • Calibration needs to be easier

Where to next

Increasing the general reliability and flexibility of the system is the next big step to conquer before we can see how it works in some real, non-demo content.

Adapting it further for TV/Broadcast users is an important one for us. We want real, human-world camera operators to feel at home with our absurd camera rig.

Integrating a 3D-scanned face overlay into the mixed reality result, inspired by Google’s experiments, is a great way of making sure we can still see the headset-wearer’s face – we’re working on it.

Expanding usage to non-VR – at the moment we’re focusing on how this solution applies to virtual reality content. However, in the near future, we will be expanding its use cases to meet conventional practices in broadcasting and the wider media industries (think along the lines of the hilariously expensive BBC News virtual studio, but so versatile you can even use it in the field).

Interested in the Zubr Mixed Reality Video Studio?

We are keen to form partnerships in the broadcast and media industries to help bring this solution to fruition. Please contact us if you’d like to learn more about the system, arrange a demonstration, and perhaps work with us!



Here we go 2017! Oh, and, fancy joining Zubr VR?

Zubr more than doubled in size and business activity in 2016, and we’re on track to do the same again in 2017. So, what have we been up to?

We worked on a total of 26 immersive media projects in 2016. That includes 15 Android-based VR/AR/MR experiences, 3 HTC Vive builds, 3 permanent physical installations, and 1 aerospace project so secret we can’t tell you anything about it. 

We created the World’s First fully automated 4D scanning installation, in which the scanners have been fired up more than 100,000 times since it was switched on. We produced and released an unparalleled Open-Source development kit for the Fulldome planetarium format, opening up a difficult format for everyone. We invested heavily in Research & Development, developing our own realtime depth-compositing solution for the television industry, pioneering realtime 4D scanning deployment for the performing arts, and devising a unique content management system for web VR experiences.

We have forged some brilliant partnerships with innovative organisations, becoming a close delivery partner with the At-Bristol Science Centre, an augmented experiences developer with Calvium, a WebVR collaborator with Yadda, and a multi-faceted VR production team with Trainrobber and Wolf & Wood. Together with these awesome people, and many more, we have some huge, groundbreaking projects lined up for 2017.

So, could this be where you come in?

If you’re a Unity developer, technology tinkerer, or Jack-of-all-trades (which is a good thing by the way), and are interested in joining our team, we’d love to hear from you.

Don’t worry about having any experience of working on VR projects, we’ll soon fix that!

Just drop us your CV/portfolio with a covering email to jack@zubrvr.com


Zubr Bjork Google Cardboard and web VR experience

An immersive look into Bjork's Future

Today, we’re very proud to unveil a neat little project we’ve been working on with Yadda and Crack Magazine as part of Bjork in Focus. It’s a simple but powerful immersive environment where you can look around at some exclusive content from this month’s Bjork features in Crack Magazine.

To build the virtual environment, we produced custom artwork to fit Crack Magazine’s cover feature on Bjork – including a seamless 360° cubemap and meticulously dimensionalised key images. The experience was built using Zubr VR and Yadda’s awesome new Enso Experience platform, a versatile, CMS-controlled system for creating great VR content that works right in the browser. Crack Magazine also made custom Cardboard viewers to go alongside the experience!

Head over to our Bjork’s Future project page to check out some screenshots, or click here to go straight to the experience itself (that’s right – no app to download).


Zubr mantelpiece

A Zubrnice Refresh

It’s been a busy year here at Zubr VR, and we’ve gone through a lot of changes – the most recent of which is the overhaul of our website (welcome, by the way). 

Recently we have been engaging closely with some fantastic people and companies around the world. We have built relationships with Trainrobber, Yadda, Calvium, Mimesys, Wolf & Wood and At-Bristol, to name a few, and together with them we’re working on some really interesting projects. For example, our Test Lab project with At-Bristol is the first of its kind anywhere in the world: an automated system capturing incredible 4D scans of its visitors and rendering them as augmented holograms on the table in front of them. There’ll soon be some really exciting upgrades applied to this, increasing the quality of the scanning and allowing visitors to see their holograms on the moon, amongst other places!

Kids’ feedback on Zubr’s We The Curious augmented volumetric video VR Lab

With our fingers in so many pies across different industries (sorry we weren’t more specific, business advisors – we just really love making cool stuff for everyone), we’ve had to tackle a lot of research and development work to get to where we are now. But this is great, as it’s helped us identify our six main areas of expertise. For each of these we have developed our own unique approach and set of solutions. This includes a number of innovations, such as our automated LAN timeline system for Test Lab, a CMS-controlled system for browser-based VR content, our compositor-based live action scene dimensionalisation (bear with me), and our method of hybrid 3D scanning techniques. I won’t get into too much detail, or we’ll be here all night, but check out some of the projects we have here on the website to get a closer look.