The Future:

It’s Whatever You Want It To Be

FKT Magazin 1-2/2019

Back in the early days of computing, implementing any kind of functionality in software was to be avoided at all costs, such was the performance penalty compared with implementing the functionality in hardware. But times have moved on. We now have more computing power – processor performance, memory size and speed, ASICs, FPGAs and GPUs – than we know what to do with. That’s brought software back to the fore – system performance is no longer constrained by the inadequacies of the underlying hardware – because it brings numerous advantages. But: the importance of hardware should not be underestimated. This article will review how the best systems solutions leverage both the flexibility of software and the power of hardware.

In a world where the rate of change is not only relentless, but accelerating – whether that’s in technologies, markets or customer requirements – software holds the key when it comes to responding to those changes. That’s no less true for the broadcast industry.

Historically, ours was an industry that relied on proprietary hardware, optimised for specific applications. There can be little doubt that, for a long time, that paradigm worked well. No, those solutions weren’t cheap, and customers found themselves locked in to one or more vendors – but in an environment that was, broadly, stable, that didn’t really matter: the investment could be amortised over several years.

But: over the past few years, there have been two changes of huge significance. The first, of course, is that the broadcast industry has undergone – continues to undergo – a period of huge upheaval, driven by changing consumer content consumption patterns and ubiquitous high speed connectivity.

Unrelenting rise
The second has been in technology. The advent of IP has been transformative, and will be increasingly so – and it has been accompanied by the unrelenting rise in the power of general purpose computing. So-called ‘COTS’ – commercial off-the-shelf – hardware platforms are now capable of delivering hitherto undreamt of levels of performance, at prices that are driven by the growing commoditisation of technology.

The parallel emergence of a rapidly changing broadcast landscape and new levels of hardware capability is, to say the least, serendipitous. The industry needs to position itself for an extended period of tumultuous change – and the technology exists to enable that to happen relatively straightforwardly.

It has one significant implication. The broadcast industry of the future will be software-based, from the lowest hardware level to the highest application level. 

At the highest level, almost any functionality you will need will be delivered by software – delivered via industry standard servers, platforms, interfaces and interconnects. That’s been an emerging theme of IBC in recent times. It has long been a mantra that companies buy solutions, not computers – and now, the nature of those solutions is changing forever.

We’ve seen the rise of so-called ‘microservices’ – compact, single-function software components tuned to perform a specific media function. These create a ‘pick and mix’ approach that allows customers to choose only the functionalities they need – but the microservices are engineered to work seamlessly together, in whatever combination.
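The ‘pick and mix’ idea can be illustrated with a minimal sketch – purely hypothetical, not Bridge’s or anyone else’s actual product code. Each component does one media function behind a common interface, so any selection of them can be chained into a workflow; the names `transcode`, `add_captions` and `pipeline` are invented for illustration.

```python
# Hypothetical sketch of a 'pick and mix' microservice pipeline.
# Each component is a single-function unit with a common interface,
# so any combination can be composed into one workflow.
from typing import Callable, List

MediaFunction = Callable[[dict], dict]

def transcode(stream: dict) -> dict:
    """Single-purpose component: change the codec (illustrative only)."""
    return {**stream, "codec": "h265"}

def add_captions(stream: dict) -> dict:
    """Single-purpose component: attach a captions track."""
    return {**stream, "captions": True}

def pipeline(functions: List[MediaFunction]) -> MediaFunction:
    """Compose the customer's chosen components into one workflow."""
    def run(stream: dict) -> dict:
        for fn in functions:
            stream = fn(stream)
        return stream
    return run

# Choose only the functionalities needed, in whatever combination.
workflow = pipeline([transcode, add_captions])
result = workflow({"codec": "h264"})
```

Because every component shares the same in/out contract, adding or removing a function is a one-line change to the list passed to `pipeline`.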

What’s less widely appreciated, though, is the growing role of software at the lowest levels in the computing hierarchy. Take network probes, for example, which are at the heart of our business.

Not all software is as visible as, for instance, our Instrument View GUI that enables many of our probe users to interact simply and intuitively in real time – from wherever they are in the world – with the network data our probes provide. Much of our software development is invisible to the naked eye.

Not always comfortable
Although we’re perhaps perceived as a hardware company, the truth is that we are primarily a software company – and always have been. That’s not always a comfortable place to be, not least because there is a perception that software is inherently unstable.

Windows, for example, is often derided for its alleged instability. From that point of view, it’s compared unfavourably with Apple’s Mac OS X. But here’s the thing. Apple has absolute control over the hardware environment in which OS X runs. Microsoft, on the other hand, has almost none. While OS X is a fantastic achievement, it could be argued that Windows is an even greater achievement because it runs so well on so many different platforms. Software, in general, only fails when it relies on hardware to do something – and the hardware doesn’t do it.

The key – for Bridge as much as for Microsoft – is to abstract software from the underlying hardware level to the maximum extent possible – to make it as hardware-agnostic as possible – not least because that delivers the portability that enables a solution to be simply migrated from one platform to another, with enormous benefits in terms of lower cost, easier upgrade and improved scalability, for example. That’s a huge challenge because, historically, we have relied on hardware to provide key hooks – such as timing – that underpin the performance and functionality of the software.
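One common way to decouple software from a hardware ‘hook’ such as timing is to hide the clock behind an interface, so the measurement logic never touches the hardware directly. The sketch below is a generic illustration of that pattern, not Bridge’s implementation; `ClockSource`, `SoftwareClock` and `inter_arrival_ns` are invented names.

```python
# Generic sketch: abstracting a timing 'hook' behind an interface, so the
# same measurement code runs unchanged whether timestamps come from a
# hardware timer or from the host's software clock.
import time
from abc import ABC, abstractmethod
from typing import List

class ClockSource(ABC):
    @abstractmethod
    def now_ns(self) -> int:
        """Return a monotonic timestamp in nanoseconds."""

class SoftwareClock(ClockSource):
    """Portable implementation with no hardware dependency."""
    def now_ns(self) -> int:
        return time.monotonic_ns()

def inter_arrival_ns(clock: ClockSource, events: int = 2) -> List[int]:
    """Measure gaps between events using whichever clock is plugged in."""
    stamps = [clock.now_ns() for _ in range(events)]
    return [b - a for a, b in zip(stamps, stamps[1:])]

gaps = inter_arrival_ns(SoftwareClock())
```

Porting to a new platform then means writing one new `ClockSource` subclass; everything built on top of it carries over unchanged.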

Among the most daunting of those challenges is that of maintaining absolute accuracy. Our network probes have to be incredibly precise in order to deliver meaningful information in an environment in which, for example, a standard HD ST2022-6 SDI packet stream will deliver 270 packets every millisecond: that’s one packet roughly every 3.7 microseconds. We’ve now managed to move ourselves to a position where our software enables us to achieve what we once relied on hardware to do.
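As a quick sanity check on that figure, 270 packets per millisecond works out to an inter-packet gap of a few microseconds:

```python
# Back-of-envelope check of the packet timing quoted above.
packets_per_ms = 270
interval_us = 1000 / packets_per_ms  # microseconds between packets
print(round(interval_us, 1))  # → 3.7
```

In other words, the probe’s timestamping must resolve events separated by roughly 3.7 microseconds, which is why timing accuracy was traditionally delegated to hardware.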

Strong foundations
From that point of view, the end application can be likened to a building: without strong foundations, it will be prone to failure. That’s why, at Bridge, we’ve spent a huge amount of time crafting each individual underlying ‘brick’ of code – to deliver the absolute reliability, repeatability and stability that our customers rightly expect from us. It’s an endeavour that’s not for the faint of heart – but we’ve spent 14 years doing it and, in all modesty, it’s something we’ve become very good at.

And, as we move forward, we meticulously revisit all those legacy blocks of code to find out how they can be improved. We’ve had considerable success in that endeavour, achieving significant performance increases as we find ways of doing things better. That initiative has been of enormous benefit to our customers, because we’ve been able to provide regular performance upgrades to them – and because we’ve ensured that our code is independent of the hardware, they can pretty much all take advantage – whether they’re VB120, VB220, VB330 or VB440 users. Software is unique in its ability to provide such pain-free, affordable enhancements to an existing installation.

It’s also the key enabler for some extremely exciting new network technologies. The hardware exists to sustain switching speeds that are orders of magnitude beyond where we are today: it just needs the appropriate software to be developed.

There are already great examples of the advantages that software can bring. Take Remote PHY, for example, where software has enabled complexity to be centralised, minimising the cost of deploying large networks. Or: what about virtual radio, where a similar software-based approach has transformed the viability and economics of the medium?

Numerous advantages
The fact is that, back in the early days of computing, implementing any kind of functionality in software was to be avoided at all costs, such was the performance penalty compared with implementing the functionality in hardware. But times have moved on. We now have more computing power – processor performance, memory size and speed, ASICs, FPGAs and GPUs – than we know what to do with. That’s brought software back to the fore: system performance is no longer constrained by the inadequacies of the underlying hardware, and software brings numerous advantages in terms of flexibility, upgradability and scalability that have become essential to how we deal with a future that, if not uncertain or unknown, has certainly become harder to predict.

If there is a small cloud on the horizon, it’s that we can be very conservative. That’s hardly surprising: we’ve built an enormously successful industry based on technologies and platforms that have worked well for us and that we trust to deliver. The good news is that software, written as it should be, is no less trustworthy. Few deny that the world is changing, and we need to change with it. Whether that means embracing IP, or looking forward to a future in which almost anything we desire can be delivered quickly and effectively by software – it’s good to know that we have the technologies we need in order to continue to succeed.
