Mixed reality with compound documents?


There is something quite exciting which I don't think most people have spotted. VR is nice and getting a lot of attention recently. In theory, WebVR could be more flexible than other VR engines because it's the web: browsers are used to handling compound documents, while the other engines have to build all this stuff into their engines from scratch. My guess is this might be the 2nd or 3rd reason why GearVR supports WebVR. It kind of already works in the latest Firefox and Chrome, and even works on Android Chrome (it works for me without the dev version of Chrome, try it yourself), although a polyfill might still have some use here.

The nature of a compound document means multiple types of technology can be triggered from the same source. Say for example I could mix WebVR with the Geolocation API, the Device Orientation API and the Web Audio API. Can you imagine the crazy experiences you could build with those things? I believe the JavaScript can run at the same time as everything else, but this might be browser dependent.
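As a rough illustration of mixing those APIs, here is a minimal sketch of steering a Web Audio stereo pan from the Device Orientation API. The `headingToPan` function and the wiring below are my own illustrative names, not anything from the post, and real spatial audio in VR would use a `PannerNode` with full 3D positioning rather than simple stereo pan:

```javascript
// Sketch: pan audio left/right as the device's compass heading changes.
// headingToPan is a hypothetical helper, not part of any standard API.

// Map a compass heading in degrees to a stereo pan value in [-1, 1]:
// 0 degrees is centred, 90 degrees fully in one ear, -90 in the other.
function headingToPan(headingDegrees) {
  return Math.sin((headingDegrees * Math.PI) / 180);
}

// In a browser this could be wired up roughly like so
// (guarded so the file also loads outside a browser):
if (typeof window !== "undefined" && "AudioContext" in window) {
  const ctx = new AudioContext();
  const source = ctx.createOscillator(); // stand-in for real content
  const panner = ctx.createStereoPanner();
  source.connect(panner).connect(ctx.destination);
  source.start();

  // The Device Orientation API reports alpha (compass heading) per event.
  window.addEventListener("deviceorientation", (event) => {
    if (event.alpha !== null) {
      panner.pan.value = headingToPan(event.alpha);
    }
  });
}
```

The same event handler could just as easily feed a WebVR render loop or be combined with Geolocation updates, which is the whole point: one document, several device APIs, no engine plumbing.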

It’s still a bit of a theory, but this could enable true mixed reality (or VisuoHaptic Mixed Reality if you must). Not that crappy stuff you currently see. I'd still like to see Magic Leap myself, but to be honest Project FoxEye interests me.

I’m looking to research this area with a bunch of smart people…

Author: cubicgarden

Senior firestarter at BBC R&D, emergent technology expert and serial social geek event organiser.