Moving Live Production to The Cloud

  • With TAG Video Systems

We’re now at a point where live production is entirely possible in the cloud. Any questions?

Live production is the highest-stress, highest-anxiety environment in media and entertainment. Even on the highest-budget feature film, a director at least has the luxury of retakes. When you are live on air there is no going back. ‘We’ll fix it in post’ is not an option. The show is produced in real time, and it must go on.

Live is also a highly complicated system involving multiple elements from best-of-breed vendors, ranging from graphics and replay systems to slow motion, VFX, DVEs and the mixing of external live sources. Typically, the production requires a control room full of people in a bricks-and-mortar facility or an OB truck. There are racks of gear and miles of cable. You can configure it to some degree by assigning different signal paths to the routing fabric, but once built it stays wired, handicapping the broadcaster’s ability to respond to changing business needs.

All of this is now evolving into the cloud, which fundamentally changes the nature of everything while not changing the fundamentals of live production at all.

To unpick that conundrum, let’s take a minute to examine the evolution of live production into the cloud.

The evolution of live
To build your show, you start from various camera sources. These are ingested into your studio facility for the application of various processes, among them live switching, replay, slow motion and graphics. The entire production in the control room is centred on a multiviewer, which offers a series of visual and monitoring solutions to aid the director and vision mixer, the video operations team for camera shading, replay ops, the audio mixer, producers and so on. Broadly speaking, this live production architecture hasn’t changed for years and doesn’t have to change as we move to cloud. So why move to cloud at all?

Cloud computing is essentially the use of remote computing resources to store, manage and process data rather than local (on-prem) hardware. The advantages this brings to media production as a whole, and to live in particular, are clear and multi-faceted. Broadcasters and media service providers want to move into the cloud for flexibility and speed. To provide services that spin up and down, the underlying model needs to be usage-based. The long integration lead time of on-prem solutions, involving capital approval cycles and slow software deployment, is an impediment to the agility needed today to meet rapidly changing business requirements and opportunities.

On-premises live production systems are expensive and are often used for only a fraction of the day. Media companies have been moving to a remote production, or REMI, model over the past few years to improve the ROI on this CAPEX investment and improve staff productivity. With COVID, having these large production teams working in the same control room became unfeasible, and remote distributed production with an at-home workforce became essential. With everyone working in this distributed environment, the cloud was a natural fit. It avoided the risks of all the feeds being home-run to a single facility with limited staffing and resilience, and it allowed the production system to be scaled to the event’s requirements.

Being able to scale the system size to the event’s requirements creates operational efficiency. For the first time, with the shift from CAPEX to OPEX in the cloud, the true costs of the underlying production system can be attributed to the event. It is this deterministic behaviour of systems in a cloud environment that means broadcasters can finally predict exactly what operating a service, or launching a new one, will cost, and it drives operations to manage these costs more effectively than before. Add to that a microservices approach to development and deployment, and broadcasters can upgrade equipment and introduce new features far faster, more economically and more flexibly than before.

The value of ST 2110
From the analogue era through SDI, live production has used baseband video and baseband audio simply because it offers the highest quality. It gives the best production material to work with, so we don’t have to go through generations of processing. It also has the lowest latency, which matters because in live you want to interact with your studio and your performers. None of this changes in the move to IP.

SMPTE standard ST 2110, for which SMPTE and co-developers VSF, EBU and AMWA have been awarded a Technical Emmy, mirrors SDI in providing for uncompressed video and precision timing.

With ST 2110, you can do everything from SD all the way up to 8K RGB 12-bit uncompressed, and the standard even supports signals up to 32K x 32K for the future. Regardless of resolution or other signal characteristics, you are using the same protocols, the same switching, the same standard IT architectures.
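The bandwidth implications of carrying those formats uncompressed are easy to estimate. The sketch below is a rough back-of-the-envelope calculation of active-picture payload rates for a couple of common ST 2110-20 formats; it ignores RTP/IP overhead and blanking, and the chosen formats and sampling structures are illustrative assumptions rather than anything mandated by the standard.

```python
# Rough uncompressed payload bit rates for ST 2110-20-style streams.
# Active picture only: no RTP/IP packet overhead, no blanking.

def uncompressed_mbps(width, height, fps, bit_depth, samples_per_pixel):
    """Approximate payload bit rate in Mbit/s.

    samples_per_pixel: 2 for 4:2:2 (Y plus alternating Cb/Cr),
                       3 for 4:4:4 / RGB.
    """
    bits_per_second = width * height * fps * bit_depth * samples_per_pixel
    return bits_per_second / 1e6

# 1080p50, 10-bit 4:2:2 -- a common production format
hd = uncompressed_mbps(1920, 1080, 50, 10, 2)

# 2160p50 (UHD), 10-bit 4:2:2 -- exactly four times the HD rate
uhd = uncompressed_mbps(3840, 2160, 50, 10, 2)

print(f"1080p50 10-bit 4:2:2: ~{hd:,.0f} Mbit/s")   # ~2,074 Mbit/s
print(f"2160p50 10-bit 4:2:2: ~{uhd:,.0f} Mbit/s")  # ~8,294 Mbit/s
```

Numbers at this scale are comfortable on a 25/100 GbE campus fabric, which is precisely why the next question is how to move them off campus.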

For just about any piece of kit you could need in a TV facility - multiviewers, cameras, replay systems - there are solutions available whose primary interface is ST 2110.

Mezzanine compression to cloud
ST 2110 is built around the notion that bandwidth on campus is plentiful and cheap, but to get to and from the cloud, and especially for any form of remote distributed live production, a form of mezzanine compression is used. Mezzanine is a lightweight compression that squeezes the bits enough to save bandwidth but not so hard as to destroy the quality needed for picture manipulation, such as chroma keying, in the production process. The leading schemes are JPEG 2000 (J2K) and JPEG XS. The latter exhibits extremely low latency, which is important for live. With detailed specifications for mapping JPEG XS into a 2110-22 ecosystem now complete, vendors are adding JPEG XS capability to their products.
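To see what mezzanine compression buys you on the contribution link, here is a minimal sizing sketch. The ~10:1 ratio, the ~2,074 Mbit/s HD source rate and the 80% headroom factor are all illustrative assumptions for the example, not vendor specifications; real JPEG XS ratios are chosen per workflow.

```python
# Back-of-the-envelope sizing for a mezzanine-compressed cloud uplink.
# Ratio, link capacity and headroom are assumptions for illustration.

def compressed_mbps(uncompressed_mbps, ratio):
    """Bit rate after mezzanine compression at the given ratio (e.g. 10 for 10:1)."""
    return uncompressed_mbps / ratio

def streams_per_link(link_mbps, stream_mbps, headroom=0.8):
    """How many streams fit on a link, reserving headroom for protocol overhead."""
    return int((link_mbps * headroom) // stream_mbps)

HD_UNCOMPRESSED = 2074.0                      # ~1080p50 10-bit 4:2:2, Mbit/s
hd_xs = compressed_mbps(HD_UNCOMPRESSED, 10)  # assumed 10:1 JPEG XS

print(f"Per-camera feed after 10:1: ~{hd_xs:.0f} Mbit/s")
print(f"Feeds on a 1 Gbit/s uplink: {streams_per_link(1000, hd_xs)}")
```

The same arithmetic explains why an uncompressed HD feed (over 2 Gbit/s) simply does not fit a typical 1 Gbit/s uplink at all, while a modest mezzanine ratio makes multi-camera remote contribution practical.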

Mezzanine encoding and decoding is part of the suite of core services required for live production that can be hosted in the cloud today. Format conversion, reliable secure transport, multiviewing and probing, the production switcher, CG and playout are also primed for cloud. Other applications such as replay systems, slow motion, intercom, even lighting control and the CCU will all, eventually, be available as a service.

Some of these applications can be bought from the cloud vendor and dragged and dropped into your design in the cloud. Other pieces, such as the production switcher, playout and the multiviewer, are highly specialised and will likely come from best-of-breed developers. Even with the recent emergence of fully integrated live production systems in the cloud, the complexities of many live productions will require additional components and services to support the required workflows.

Cloud Benefits
The important thing is that the functionality of this equipment, and how the content flows from ingest through production to output, doesn’t fundamentally change. But with an architecture underpinned by IP and cloud, the business benefits are transformed.

Remote distributed production becomes simplified because all cloud production services inherently support remote operations, minimizing the need for production staff to work in a common control room. Costs are aligned with production requirements and complexities. Systems are highly resilient, and redundancy is possible, something that seldom exists with on-premises systems. 

Perhaps most important, systems are no longer static, accelerating the ability to add the latest technology and effects and create compelling live productions.

Once upon a time, repurposing the studio meant downtime, complex rip-and-replace installations and excessive capital outlay. Testing a channel idea took weeks if not months, and the science of predicting return on investment was wildly inexact. Now, with a studio in the cloud, you can simply try a production out by reassigning applications with a few clicks. You don’t have to build a room or commit rack space. You can chart costs per hour and match metrics around traffic flows, bitrates and reliability closely to KPIs.
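The CAPEX-versus-OPEX point above can be made concrete with a simple per-event comparison. Every figure in this sketch (the capital cost, lifetime, event count, and hourly cloud rate) is a hypothetical assumption chosen purely to illustrate the attribution of cost to an event, not a real price from any vendor.

```python
# Illustrative per-event cost attribution: amortised on-prem control room
# vs. cloud resources rented by the hour. All prices are hypothetical.

def onprem_cost_per_event(capex, lifetime_years, events_per_year):
    """Amortised capital cost attributed to one event (ignores staff and upkeep)."""
    return capex / (lifetime_years * events_per_year)

def cloud_cost_per_event(hourly_rate, hours):
    """Usage-based cloud cost for one event at a known hourly rate."""
    return hourly_rate * hours

# Assumed: $2M control room, 7-year life, 50 events/year
onprem = onprem_cost_per_event(capex=2_000_000, lifetime_years=7, events_per_year=50)

# Assumed: $400/hour of cloud production resources, 6-hour event
cloud = cloud_cost_per_event(hourly_rate=400, hours=6)

print(f"On-prem amortised per event: ${onprem:,.0f}")
print(f"Cloud cost per event:        ${cloud:,.0f}")
```

The interesting property is not which number is smaller (that depends entirely on the assumptions) but that the cloud figure is known deterministically before the event, which is what lets operations manage cost against KPIs.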

There is no technology reason a live production pipeline cannot be built entirely in the cloud. The benefits are compelling:

  • Cost - Rent resources as you need them.
  • Redundancy - Never has cloud been more relevant for disaster recovery than today. You could even use two different cloud providers so in the unlikely event of cloud failure you are still on air.
  • Scalability - You need 10 cameras today and 40 tomorrow? No problem. Dial up what you need on the day of demand.
  • On-the-fly capacity adjustment - Get what you need and always optimise your cost.
  • High availability - The SLAs are at least as good if not better than your own TV plant.
  • Flexibility - A news event happens and you can pop up a channel or studio to cover it as the news breaks.
  • Geographic insensitivity - Your operators can be anywhere. Literally.
  • Reduced gear at venue - High resolution remote operated cameras can cover an event from all angles.
  • Flexible (and potentially reduced) staffing - With fewer technical and production crew needed at the venue or dedicated to one production, they can be assigned to cover multiple events in one day.
  • No control room necessary - Remote production may have been forced on the industry but it has demonstrated categorically how effective collaboration can be in virtual workflows.
  • Enriched production - Incorporate downstream interactivity such as Zoom to enrich your live show with ease in a cloud native environment. 

Where next? 
We’re now at a point where live production is definitely possible in the cloud, but there is still complexity. For broadcast engineering teams tasked with moving to the cloud, there are lots of questions to explore.

How do you actually get your live production to the cloud? How do you move streams to and from the cloud safely? How do you pass your video streams through video components in the cloud itself? How do you calculate and compare the cost of video services needed for live production with those on-prem?

TAG’s Cloud 101 webinar series
For guidance on these questions and more, look no further than TAG’s latest webinar series, Building Media Systems in the Cloud.

It comprises five sessions, the first of which, Why the Cloud?, is led by TAG Director of Corporate Strategy Peter Wharton.