Long before the pandemic hit and live sports production came to a complete halt, broadcasters and production companies had been experimenting with new methods to produce content cost-effectively. This has spawned the concept of REMote Integration (REMI), or “At Home,” models that employ less equipment and fewer crew on site and can logistically cover more events using an IP infrastructure.
Today, with mandated lockdowns easing, remote production seems destined to be the way most live production is accomplished, due to reduced crew and travel requirements. With live sports slowly resuming play, the number of remote productions is sure to be significantly higher than it was three years ago.
In fact, At Home productions have significantly increased over the past two years, enabling producers to cover multiple venues for the same “Tier-1” event as well as many “Tier-2” sporting events that would otherwise not be covered due to cost (and lack of advertiser support). With each new remote project, new lessons are being learned and systems infrastructures tweaked to make the most of available resources.
However, when it comes to remote production methods, there’s no “one size fits all,” but the endgame is the same for all: maximizing resources and minimizing costs. Across the industry, three types of remote production workflows have emerged as the most effective, although each has been successful for different reasons – including geography, budgets, and bandwidth availability. There’s an uncompressed model, a compressed model and a distributed workflow. [Of course, there are more options and a myriad of ways that people split up their resources.]
The Uncompressed method is considered the most ideal, due to signal quality, but it comes at the cost of sending uncompressed signals (that’s 12 Gbps for 4K) back and forth between a hub facility and the remote site. Camera feeds are sent straight from the camera head over IP to some sort of production hub. In this scenario you send only the cameras to the venue, and the signals are sent back to base stations at the hub facility via fiber.
This requires a consistent 3 Gbps or more of bandwidth per camera, which can be tough to get in the last mile (the final stretch of connectivity into the venue itself). So, if you’ve got 10 cameras, that’s a lot of required bandwidth, which is not typically realistic in today’s budget-conscious world. Most stadiums don’t have those types of connections anyway, so it’s that last mile that is the most challenging. Even with their big budgets, producers of the Olympics are challenged each time with procuring available bandwidth.
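The bandwidth math above can be sketched in a few lines. This is a rough back-of-envelope calculation, not a transport specification: the per-feed rates approximate 3G-SDI (HD) and 12G-SDI (4K) payloads, and the headroom factor is an illustrative assumption – real IP contribution (e.g., SMPTE ST 2110) adds encapsulation overhead on top.

```python
# Rough aggregate-bandwidth sketch for an uncompressed REMI contribution link.
# Per-feed rates approximate 3G-SDI (HD) and 12G-SDI (4K) payloads.

RATE_GBPS = {"hd": 3.0, "uhd": 12.0}

def aggregate_gbps(num_cameras: int, fmt: str = "hd", headroom: float = 1.2) -> float:
    """Return required link capacity in Gbps, with a safety headroom factor."""
    return num_cameras * RATE_GBPS[fmt] * headroom

# The article's 10-camera HD example: 10 x 3 Gbps = 30 Gbps before headroom.
print(aggregate_gbps(10, "hd", headroom=1.0))   # 30.0
print(aggregate_gbps(10, "uhd", headroom=1.0))  # 120.0
```

Even the modest HD case lands well beyond what a typical venue’s connectivity can carry, which is why the uncompressed model so often depends on dedicated dark fiber.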
Certainly there are places where that type of high-data-rate bandwidth is available, but it’s not common – requiring you to secure a satellite or dark fiber connection. However, this uncompressed method has been used extensively in Europe, where there’s a lot more public support for higher bandwidth. In the U.S., users tend to hire that bandwidth for the specific time period required. Therefore, due to bandwidth availability, uncompressed remote production is often easier and less expensive to produce in Europe and Asia than it is in the U.S. or South America.
Compressing the signals before they are distributed to the hub facility lowers the data rate (and cost) requirements. Signals are sent into an encoder at the remote site and then decompressed at the hub facility. This method introduces a bit more delay, due to the compression/decompression process, but it’s usually not more than a few frames.
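That “few frames” figure can be estimated from the link distance and codec latency. A minimal sketch, with illustrative assumptions: light in fiber travels at roughly 200,000 km/s, and the combined encode/decode delay varies widely by encoder profile.

```python
# Back-of-envelope glass-to-glass delay sketch for a compressed REMI link.
# Assumptions: ~200,000 km/s propagation in fiber; codec_ms is illustrative.

def delay_frames(distance_km: float, codec_ms: float, fps: float = 50.0) -> float:
    """Estimate one-way delay in frames: fiber propagation plus codec latency."""
    propagation_ms = distance_km / 200_000 * 1000  # one-way fiber propagation
    frame_ms = 1000.0 / fps
    return (propagation_ms + codec_ms) / frame_ms

# e.g., a 600 km hop with ~40 ms of assumed encode/decode delay:
print(round(delay_frames(600, 40.0), 2))
```

Note that over these distances the codec, not the fiber, dominates: 600 km adds only about 3 ms of propagation, a fraction of one frame.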
To manage the delay, Precision Time Protocol (PTP) technology is used to keep the signals synchronized. Most viewers won’t notice the delay; the challenge is for the production people, who are looking at monitors that are not synchronized. Monitors at the venue are often ahead of the hub, while any feed making the round trip from the hub facility arrives behind.
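The math underneath PTP (IEEE 1588) is a two-way time transfer: the master and slave exchange timestamped messages, and assuming a symmetric path delay, the slave can solve for both its clock offset and the path delay. A minimal sketch of that calculation:

```python
# Minimal sketch of the two-way time-transfer math underlying PTP (IEEE 1588).
# t1: master sends Sync; t2: slave receives it; t3: slave sends Delay_Req;
# t4: master receives Delay_Req. Assumes a symmetric path delay.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock error vs. master
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way path delay estimate
    return offset, delay

# Example: a slave clock running 5 units fast over a path with delay 3.
print(ptp_offset_and_delay(100, 108, 110, 108))  # (5.0, 3.0)
```

With every device disciplined to the same grandmaster clock, signals arriving at the hub with different transport delays can be realigned to a common timeline.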
This REMI method was used at this year’s FIS Ski World Championships in Sweden in February. The action was captured with 80 Grass Valley HD cameras and a production switcher on the mountain in Åre while the signals were sent back (and forth) to Stockholm, about 600 km (372 miles) away – with redundant 100 Gbps connections – for final processing.
In the Distributed Production model, producers take some of the physical equipment to the venue and perform the processing at the remote site, while leaving the control elements at home. For example, a production switcher’s frame lives on site, but its panel is located remotely at home. Replay systems can be set up this way as well.
The advantage is that you still keep people at home, but you are able to process more quickly on site. The control signals require much less bandwidth than full video signals, making them easier to send back and forth. This also reduces lag time in the stream, as well as cost.
Getting the right infrastructure to connect to the stadium is among the biggest challenges of remote production. It’s a multi-dimensional problem that includes bandwidth (1, 10 or 100 Gbps), latency (which equipment can be centralized and what needs to stay at the stadium), type of production (number of cameras, slo-mo or not, archive, etc.) and frequency of the event – such as producing a related series of events versus one major event like the Olympics or a FIFA World Cup.
The cloud is yet another off-site processing technology that is being experimented with, but there are issues of cost relating to getting content into and out of the cloud.
The industry is in the midst of a major transition to IP, prompted to move faster, no doubt, by the pandemic, so production companies will continue to refine remote production infrastructures to get the job done. Remote production saves on equipment and crew costs and supports a safe, controlled environment to protect against the virus. There’s huge demand for content, and there’s no way companies can send everybody out to every site. We’re seeing the future of live sports and entertainment production today, and social distancing is an inherent benefit.