The present article outlines how the Network Device Interface (NDI) SDK adds interactivity to live VR video footage produced for the Hermitage Museum in St. Petersburg.
Focal Point VR believes that VR offers an entirely new way of experiencing the real world, and the company is passionate about creating the technology needed to realise the dream of putting on a VR headset and being ‘teleported’ to another place. Focal Point VR is a leader in the field of live-stream VR video and provides solutions for live streaming VR content in sport, music, art, education and culture, including The Gadget Show and the Champions Tennis at the Royal Albert Hall. Its UHD broadcast-quality VR streaming platform, Ubiety, delivers the highest-quality VR live streams to all platforms. The Ubiety software platform supports everything from delivering an HD 360° stream to YouTube, Facebook and Periscope through to multi-viewpoint VR event coverage delivered to customisable VR apps and HTML5 players.
Ultra low latency streaming
In April 2018, Room One, an AR, VR and AI platform provider, approached Focal Point VR to provide a 360° video camera rig, live stitching and streaming solution for a 5G trial zone installation at the Hermitage Museum in St. Petersburg, Russia. The application was a future-technology demonstrator mixing live-stream immersive VR video with an interactive haptic setup controlling a remote robot arm. Focal Point VR’s standard solution met 90% of the client’s goals but needed a low latency streaming solution suitable for the interactive environment. As a result, the team at Focal Point VR started looking for a robust alternative to RTSP for its ultra low latency live VR video solution.
Adding NDI to the pipeline
After evaluating a number of protocols, including WebRTC, the programming team found the Network Device Interface (NDI) SDK easy to implement, with initial testing showing NDI to be stable and to work well at the ultra-high resolutions the team used. NDI is a royalty-free software standard that enables video-compatible products to communicate, deliver and receive broadcast-quality video in a high quality, low latency manner that is frame-accurate and suitable for switching in a live production environment. By incorporating NDI into its VR video live production workflow, Focal Point VR provided its livestream processor (the FP-A6 model) and customised NDI playback using Oculus Rift headsets. Running over a GigE network, the camera-lens-to-headset-display latency was less than 200ms, enabling real-time interaction with the remotely controlled robot.
Technologies of Tomorrow
More recently, Focal Point VR has been approached by a number of clients looking for a low latency, high reliability live stream to a 360° dome providing a shared VR experience. To meet this demand, the programming team has developed a custom 8K cylindrical 360° video solution, which pairs Focal Point VR’s proprietary packing technology with NDI running over the internet to deliver the fully immersive experience.
Jonathan Newth, CEO at Focal Point VR, said, “Integrating NDI into our 360° workflow was a very simple task and it has allowed us to deliver a higher quality, more flexible solution to our clients. We are also talking to NewTek about how to integrate NDI more completely into our multi-camera VR workflow, replacing traditional fibre SDI with a more flexible network backbone, and we are looking forward to continuing our work with the NewTek team.”