Deploying talentS makes the beacons and wearables that talents once had to wear for movement tracking obsolete, giving them far greater freedom of movement inside virtual studios. Using AI algorithms and the power of NVIDIA GPU Tensor Cores, talentS extracts the talent's 3D location from the camera image with high precision. It sends this tracking data to Reality Engine, which renders accurate reflections, refractions, and virtual shadows of the talent inside the 3D space. Broadcasters and studio operators get hyperrealism in their virtual studio and augmented reality productions, with a seamless merge of the virtual and the physical. Designed by live production experts, talentS runs 24/7 without interruption. It sends data over the industry-standard FreeD protocol and integrates out of the box with the Reality ecosystem and any other FreeD-compatible platform.
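To make the integration path concrete, the sketch below shows what a FreeD "Type D1" position/orientation packet looks like on the wire. This is a minimal illustration, not Zero Density's implementation: the field scaling used here (angles in 1/32768-degree units, positions in 1/64-mm units, a 29-byte packet with a subtractive checksum) follows the commonly documented FreeD D1 conventions, and individual vendors may deviate from them.

```python
def _i24(value: int) -> bytes:
    """Pack a signed integer into 3 big-endian bytes (two's complement)."""
    return (value & 0xFFFFFF).to_bytes(3, "big")

def _s24(chunk: bytes) -> int:
    """Unpack 3 big-endian bytes as a signed 24-bit integer."""
    v = int.from_bytes(chunk, "big")
    return v - (1 << 24) if v & 0x800000 else v

def encode_freed_d1(camera_id: int, pan_deg: float, tilt_deg: float,
                    roll_deg: float, x_mm: float, y_mm: float, z_mm: float,
                    zoom: int = 0, focus: int = 0) -> bytes:
    """Build a 29-byte FreeD D1 packet (angles: 1/32768 deg, positions: 1/64 mm)."""
    body = bytes([0xD1, camera_id & 0xFF])
    body += _i24(round(pan_deg * 32768))
    body += _i24(round(tilt_deg * 32768))
    body += _i24(round(roll_deg * 32768))
    body += _i24(round(x_mm * 64))
    body += _i24(round(y_mm * 64))
    body += _i24(round(z_mm * 64))
    body += _i24(zoom) + _i24(focus)
    body += b"\x00\x00"  # spare / user-defined bytes
    checksum = (0x40 - sum(body)) & 0xFF  # subtractive checksum over bytes 0..27
    return body + bytes([checksum])

def decode_freed_d1(packet: bytes) -> dict:
    """Parse a 29-byte FreeD D1 packet back into engineering units."""
    assert len(packet) == 29 and packet[0] == 0xD1, "not a D1 packet"
    assert (0x40 - sum(packet[:28])) & 0xFF == packet[28], "bad checksum"
    return {
        "camera_id": packet[1],
        "pan_deg":  _s24(packet[2:5]) / 32768,
        "tilt_deg": _s24(packet[5:8]) / 32768,
        "roll_deg": _s24(packet[8:11]) / 32768,
        "x_mm":     _s24(packet[11:14]) / 64,
        "y_mm":     _s24(packet[14:17]) / 64,
        "z_mm":     _s24(packet[17:20]) / 64,
    }
```

Because the payload is just position and orientation, the same packet that normally carries camera tracking can carry a talent's 3D location, which is what lets any FreeD-compatible downstream system consume the data without modification.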
Moreover, TRAXIS talentS can be used in applications beyond the virtual studio, such as augmented reality for sports and live events. For example, talentS can anchor AR statistics graphics above a boxer during a live match. It can also direct robotic lights to automatically follow a specific dancer during a live performance.
Zero Density’s disruptive approach to talent tracking unlocks a new level of freedom inside the virtual space, freeing individuals from external wearables, markers, and beacons. By harnessing the power of machine learning and the latest advances in GPU technology, Zero Density places the talentS system at the heart of innovation and the future of live interactive production.