05.12.2018 | Issue 12/2018

Service Oriented Architectures Are The Foundation Of Flexible Video Production

MICHAEL GROTTICELLI is an experienced editor and regular contributor to FKT's Tech Across America column. / Source: M. Grotticelli

By now it's clear that software is more than capable of handling all types of audio and video production; that long-simmering debate is settled. Be it in the studio down the hall or at a live sports event 3,000 miles away, virtualizing common production and distribution tasks (video switching, graphics insertion, ad insertion, etc.) makes economic sense. Therefore I'm declaring 2018 "the year of virtualization and the Service Oriented Architecture (SOA)".

This was the year when the video industry finally woke up to what the IT world has known for decades: how to move large amounts of data reliably and quickly. The video industry just had to get the timing right, and it did.

The year started with a lot of experimentation and trials of IT-centric technology strategies that extensively leveraged software to produce and distribute such major live events as the Olympics and the World Cup. Both are world-class telecasts watched by millions, and more than ever before both events relied on software to get the signals into consumers' living rooms. Both events also proved that software versions of heretofore hardware-reliant devices and systems were up to the task.

Among the many advances this year in live production technology and workflows, by far the most significant is the emergence of the virtualized remote production network. Some call them "At-Home" or REMI (remote-integration model) productions, where most of the physical technology is not on site at the sporting event to be televised but at a remote facility. Camera signals from the event are sent back to this facility, where they are intermixed with graphics and commercial advertising (some served in specific regions and directed at individual viewers via an IP address) and then sent on to consumers' homes or mobile devices.
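
To make that flow concrete, here is a minimal Python sketch of a REMI-style mix: venue camera feeds are combined with graphics and a region-targeted ad before being sent on. The function names, the IP-to-region rule, and the ad table are all hypothetical, invented for illustration rather than taken from any real production system.

    # A minimal sketch of a REMI-style flow; all names are hypothetical.

    def insert_graphics(frame, overlay):
        # stand-in for graphics keying performed at the remote facility
        return f"{frame}+{overlay}"

    def pick_ad(viewer_ip, regional_ads):
        # choose an ad for the region implied by the viewer's IP (illustrative rule only)
        region = "EU" if viewer_ip.startswith("10.") else "US"
        return regional_ads.get(region, "default-ad")

    def remi_mix(camera_feeds, overlay, viewer_ip, regional_ads):
        """Mix venue camera feeds with graphics, then append a region-targeted ad."""
        program = [insert_graphics(frame, overlay) for frame in camera_feeds]
        program.append(pick_ad(viewer_ip, regional_ads))
        return program

    if __name__ == "__main__":
        feeds = ["cam-1-frame", "cam-2-frame"]            # signals sent back from the venue
        ads = {"US": "us-spot-30s", "EU": "eu-spot-30s"}  # hypothetical regional ad pool
        print(remi_mix(feeds, "score-bug", "10.1.2.3", ads))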

This would not be possible at such a large scale without the use of a dedicated suite of software, interconnectivity via secured and unsecured networks (the internet), a single operator to monitor disparate feeds, and a variety of software-enabled orchestration and control systems.

Most software architecture implementations used thus far have included the use of a SOA software design, allowing virtualized media functions to be dynamically connected to one another for live remote production workflows. Micro servers (really, shared-memory multiprocessors, or SDMPs) and their virtualized functions can be strung together within a single server or between multiple racks, using the IP protocol to connect them all. In this way live video can be reliably distributed to a central (main) broadcast center and on to many platforms simultaneously, without introducing delay.
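
As a rough illustration of that idea, the following Python sketch composes a few stand-in media functions (switching, graphics, ad insertion) into a chain that can be rebuilt at run time. The function names are hypothetical and the "transport" between services is just a local call, so this is a sketch of the pattern, not of any vendor's implementation.

    # A minimal sketch of dynamically chaining virtualized media functions.
    from typing import Callable, List

    MediaFunction = Callable[[bytes], bytes]

    def video_switch(frame: bytes) -> bytes:
        return frame                       # stand-in for switching logic

    def graphics_insert(frame: bytes) -> bytes:
        return frame + b"|gfx"             # stand-in for graphics keying

    def ad_insert(frame: bytes) -> bytes:
        return frame + b"|ad"              # stand-in for ad insertion

    def build_chain(functions: List[MediaFunction]) -> MediaFunction:
        """Compose virtualized functions into one processing chain; the chain
        can be rebuilt at run time as functions are added or dropped."""
        def chain(frame: bytes) -> bytes:
            for fn in functions:
                frame = fn(frame)
            return frame
        return chain

    # The same services could sit in one server or across racks connected over IP;
    # here the hop between them is simply a Python call.
    pipeline = build_chain([video_switch, graphics_insert, ad_insert])
    print(pipeline(b"frame-0001"))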

The benefits are clear. Deploying a software-based workflow instead of traditional hardware-only or hybrid architectures enables broadcasters to turn different functions on or off as needed. This allows broadcasters to experiment with new types of tech-driven programming without the burden of a large equipment budget.

As an example (and there are many others), a company called Aperi markets a software-defined platform called V-Stack, an FPGA-powered software platform that, the company says, provides more compute power than CPU- or GPU-based processing. Optimized for live production, it is said to enable faster, more agile remote production with lower latency. An Aperi remote production network can be deployed on either generic FPGA-powered servers or Aperi's dedicated edge servers. With V-Stack at its core, the platform enables the immediate start and stop of broadcast functions through apps that are accessed via tiered licenses. The platform is also built on container-based technology (with automatic discovery and registration proven in the data-center software world), removing the need for manual processes and administration or field engineers. At this year's IBC Show, Aperi staged a demonstration of its platform, showing how it can reduce seconds of latency to just milliseconds.

In 2014, ESPN opened its Digital Center 2 (DC2) in Bristol, Connecticut. DC2 is a $125 million, 190,000-square-foot broadcast facility with a software-intensive infrastructure that can handle 60,000 simultaneous signals and 46 Tbps of data throughput. While the project required a massive feat of engineering and significant cost, ESPN has found long-term benefits in the reliability, flexibility, determinism, power, and simplicity of the software systems it deployed.
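
The automatic discovery and registration pattern mentioned above can be sketched in Python as a registry that containers announce themselves to and an orchestrator queries, so no manual administration is needed. The class, service names, and endpoints below are purely hypothetical and stand in for whatever a real platform provides.

    # A minimal sketch of service registration and discovery; all names are hypothetical.
    class ServiceRegistry:
        def __init__(self):
            self._services = {}

        def register(self, name, endpoint):
            # a container calls this on start-up instead of being configured by hand
            self._services[name] = {"endpoint": endpoint, "running": True}

        def stop(self, name):
            # turn a licensed function off without touching the hardware
            if name in self._services:
                self._services[name]["running"] = False

        def discover(self):
            # the orchestrator asks the registry what is currently available
            return {n: s for n, s in self._services.items() if s["running"]}

    registry = ServiceRegistry()
    registry.register("video-switcher", "10.0.0.11:5000")   # illustrative endpoints
    registry.register("graphics-engine", "10.0.0.12:5000")
    registry.stop("graphics-engine")
    print(registry.discover())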

Among a number of benefits, the architecture as designed includes an internal Broadcast Audio Monitoring (BAM) service that can listen to 32 streams on each panel. In addition, it can keep track of all the devices talking to the network. As a result, BAM can monitor 16,000 streams at once to check status, analyze the type of stream, define who can connect to what, and reach out to connect if streams stop. The software defines the available streams for the hardware, while each node stays aware of the connections as defined by the software. This makes it easy to reestablish a connection if a signal is dropped.
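
A simplified sketch of such a monitoring loop might look like the following. It is not ESPN's actual BAM code; the stream IDs, health probe, and reconnect routine are invented for illustration of the pattern of checking status and re-establishing dropped streams.

    # A minimal sketch of a stream-monitoring pass; all names are hypothetical.

    def check_stream(stream_id):
        # stand-in for a real health probe of an audio/video stream
        return stream_id % 997 != 0        # pretend a few streams are down

    def reconnect(stream_id):
        print(f"re-establishing stream {stream_id}")

    def monitor(stream_ids):
        """One monitoring pass over every registered stream."""
        status = {}
        for sid in stream_ids:
            alive = check_stream(sid)
            status[sid] = "ok" if alive else "dropped"
            if not alive:
                reconnect(sid)             # reach out and reconnect, as described above
        return status

    # e.g. a large pool of streams monitored in one pass
    report = monitor(range(16_000))
    print(sum(1 for s in report.values() if s == "dropped"), "streams reconnected")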

Businesses from manufacturing to healthcare and from insurance to telecommunications have adopted the SOA model. The broadcast industry is catching on fast.

The adoption of SOA leads to a more agile organization

Perhaps the biggest clue that software architectures are here to stay is found in the hiring of new staff. All new applicants to ESPN and any other major media organization must have an extensive IT background and a close familiarity with software. That's because scalable, agile, format-friendly, software-centric architectures are the key to it all.

