Crystal balls: Phabrix looks back at the innovations of 2021 and forward to what is coming in 2022

By Prinyar Boon, product manager, Phabrix

2021 has seen several technologies that have been in gestation over the last few years burst into the mainstream, including cloud production, distributed production, HDR, extended reality (XR), and JPEG XS with SMPTE ST 2110-22.

The most visible of these was HDR, with both the delayed Tokyo Olympic Games and the UEFA European Football Championship shot in UHD and HDR by many different trucks and crews from many different countries. NBC Universal was presented with a special award for its work. Behind the scenes, I should give a shout-out to NBC’s Chris Seeger and Michael Drazin for proving the workflow worked, and to Pablo Garcia of Cromorama and Nick Shaw of Antler Post for their work on the LUTs.

In the UK, Sky launched its HDR service in 2020, followed by the HDR-enabled Sky Glass streaming TV. The UK is now lucky to have both BT Sport, with its Ultimate service (launched in 2019), and Sky regularly producing in UHD HDR with immersive audio.

We are now also seeing regular HDR production in Germany, while in Italy NVP and EMG provided the first HDR coverage of a UEFA Champions League (UCL) match for Sky Italia and Amazon.

Innovate and evolve

Not to be outdone, BT Sport continued to innovate and evolve the distributed production system and Remote Operation Centres (ROCs) first put together as part of the 2020 project restart.

These are now used routinely for all productions, culminating this year in two notable events. The first was the 2021 UEFA Champions League Final, where the UK coverage involved the first international distributed HDR production, with “remote surface” control from the UK of the vision and audio desks in the truck in Porto. Contribution feeds to the UK were HDR, with the SDR derived locally in the UK in conjunction with closed-loop shading in the truck in Porto.

The second was the 2021 UEFA Super Cup final in Belfast, where UEFA approved for the first time a host broadcaster using distributed production and a single HDR-based production, with SDR derived from HDR using closed-loop shading.
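To illustrate what “SDR derived from HDR” means in practice, here is a minimal sketch of pushing an HDR frame through a 3D LUT of the kind mentioned above. It is purely illustrative: the identity LUT, the nearest-neighbour lookup, and all the names are stand-ins, and real productions use carefully graded vendor LUTs with interpolation, plus closed-loop shading on top.

```python
import numpy as np

def apply_3d_lut(frame, lut):
    """Derive an SDR frame from a normalised HDR frame via a 3D LUT.

    frame: (H, W, 3) float RGB values in [0, 1]
    lut:   (N, N, N, 3) float cube, e.g. 33x33x33x3

    Nearest-neighbour lookup for brevity; real tools use trilinear or
    tetrahedral interpolation for smooth results.
    """
    n = lut.shape[0]
    idx = np.clip(np.rint(frame * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity LUT as a placeholder for a real HDR-to-SDR cube.
n = 33
axis = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
hdr_to_sdr_lut = np.stack([r, g, b], axis=-1)

hdr_frame = np.random.rand(1080, 1920, 3).astype(np.float32)
sdr_frame = apply_3d_lut(hdr_frame, hdr_to_sdr_lut)
```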

There is a sustainability and equality dimension to remote production that should not be overlooked and that is important to the future of our industry. In the UK this year we have seen a worryingly low intake of students for broadcast engineering-related courses. The classic OB involves a substantial team travelling to each venue, year in, year out, on an increasingly punishing schedule. There is now an increasing emphasis on broadening diversity (both in terms of geolocation and gender), reducing the carbon footprint of an event, and overall improvements to staff lifestyle, travel times, and time away from home. After the COVID hiatus, broadcasters are once again supporting the gender diversity advocacy group Rise and the Rise-Up schools outreach programme.

The next phase of work for HDR will be to harmonise operating practices at the national level, so that crews are already up to speed when they gather to cover these large international sporting events. What is remarkable is that the first live HDR experiments only started in 2014, which is a short runway by historical industry standards, yet it still seems a long time given the rapid pace of other activities.

Alphabet soup of protocols

The current generation of distributed production makes extensive use of ‘remote surface’, where equipment is still sent to the venue but is then controlled remotely, with operators provided with remote feeds of multi-viewer, tally, and intercom. The next phase will be to move equipment out of the truck and backhaul the camera feeds, but this will need low-latency, high-quality compressed links.

The last few years have seen the ratification of the JPEG XS standard and the accompanying IETF RFC 9134, along with the first deployments over SMPTE ST 2110-22. Riot Games won the IBC 2021 Innovation Award for content distribution using JPEG XS, and equipment suppliers are undertaking JPEG XS interop testing to ensure successful programme exchange. Notably, the new GV LDX 150 camera natively supports JPEG XS with ST 2110-22, as well as offering 16 stops of dynamic range and UHD sensitivity specifications that were previously reserved for HD sensors.

There is an alphabet soup of protocols that offer alternatives to JPEG XS and ST 2110-22; notable examples include NDI, SRT, and LRT from LiveU, many of which use Automatic Repeat reQuest (ARQ) and higher compression ratios. The NDI 5.0 Bridge has set the bar for ease of connectivity via the cloud, with an ecosystem of natively connected devices and converters, while LiveU has its Matrix cloud video platform. Blackbird and Mavis Broadcast have recently teamed up to offer a low-latency, cloud-based live event production and editing platform.
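For readers less familiar with ARQ, the sketch below shows the basic idea these protocols build on: the sender buffers numbered packets, and the receiver spots gaps in the sequence and asks for the missing ones again. It is a toy, in-process simulation with made-up names, not any vendor’s implementation; real protocols such as SRT add timing, latency budgets and congestion control on top.

```python
import random

def lossy_link(packets, loss_rate=0.2, rng=random.Random(42)):
    """Simulate a network path that randomly drops packets."""
    return [p for p in packets if rng.random() > loss_rate]

def receive_with_arq(sent, max_rounds=5):
    """Deliver packets over the lossy link, re-requesting anything missing."""
    sender_buffer = dict(sent)   # sender keeps packets for possible retransmission
    received = {}
    pending = list(sent)
    for _ in range(max_rounds):
        for seq, payload in lossy_link(pending):
            received[seq] = payload
        missing = [seq for seq in sender_buffer if seq not in received]
        if not missing:          # nothing lost, so no repeat request needed
            break
        pending = [(seq, sender_buffer[seq]) for seq in missing]   # NACK and resend
    return [received[seq] for seq in sorted(received)]

packets = [(seq, f"frame-{seq}") for seq in range(10)]
print(receive_with_arq(packets))   # with the fixed seed, all ten frames arrive despite ~20% loss
```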

NVIDIA is busily working to adapt its Rivermax Linux SDK for Windows 10, with Rivermax Display running on a suitable GPU and BlueField-2 DPU. This means that any Windows 10 platform can become ST 2110-enabled with a high-quality PTP lock. Under Windows Device Manager, a virtual 2110 display is created into which the desktop can be extended; any Windows application can then output to that display and become a 2110 source.

At Phabrix we are working with NVIDIA to test this technology; I have vMix and YouTube playing from a PC as a 2110-20/30 source on my desk. NVIDIA and intoPIX also demonstrated JPEG XS running on GPU compute during the GTC spring conference. You can easily imagine these technologies converging in 2022 to create a genuinely easy-to-use offering.

In the mid-term, expect a comprehensive suite of building blocks that will enable virtualised, cloud-based ecosystems and address the concern that 2110 is not suitable for software applications or cloud computing.

Connectivity options

Overall, we now have an emerging range of connectivity options that are well suited to different applications and niches, as well as viable cloud-based systems. What broadcasters now want is a single control plane that works seamlessly across this mix of technologies.

Under the hood these systems are complex, and there is a steep learning curve as people retrain. In some ways we are seeing an increased need for T&M, and the associated technical expertise, to help debug and deploy these new live, distributed, IP-based systems with hybrid on- and off-prem cloud infrastructure, while SDI remains very much with us as well.

Back in the studio, we have seen huge interest in XR, 3D augmented reality graphics such as Viz Arena, game engines, virtual sets, LED walls with live presenters, and artificial intelligence (AI) used to train 3D camera tracking systems.

Riot Games utilised 44 UHD feeds from 19 servers to drive an LED wall set. Again, deep down the technology stack many of these systems use NVIDIA Rivermax and NICs, and at Phabrix, as part of the work we are doing with NVIDIA, we have adapted our QxL products to support RGB 4:4:4 12-bit in ST 2110-20 for these applications.
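As a back-of-envelope illustration of why such feeds demand serious network capacity, the sums below estimate the active-picture bit rate of a single uncompressed RGB 4:4:4 12-bit UHD stream. The 50p frame rate is an assumption and packet overheads are ignored, so treat the result as an order-of-magnitude figure rather than a link budget.

```python
# Rough active-picture bit rate for one uncompressed RGB 4:4:4 12-bit UHD feed
# of the kind carried over ST 2110-20. Frame rate is assumed (50p) and
# RTP/UDP/IP overheads are ignored.
width, height = 3840, 2160
samples_per_pixel = 3        # R, G and B at 4:4:4
bits_per_sample = 12
frame_rate = 50              # assumed European 50p production

bits_per_frame = width * height * samples_per_pixel * bits_per_sample
gbit_per_s = bits_per_frame * frame_rate / 1e9
print(f"~{gbit_per_s:.1f} Gbit/s per feed before packet overhead")
# ~14.9 Gbit/s, so even a 25 GbE port carries only one such feed.
```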

In the live environment, the overarching theme is that equipment is now part of a network, and this means being able to interface with many different systems and network architectures. Customers expect and demand interoperability and remote access to equipment. We now live in a hybrid world that is a mixture of conventional SDI, IP equipment, and cloud. Customers must now work across all three technologies, but for sports production we still need live interconnect.
