WePlay Studios transforms future of live event storytelling with AJA gear

By 2032, the esports market is expected to grow to $9.29 billion, bolstered by a global player count and fan following that both continue to climb. The 2023 League of Legends World Championship Tournament alone drew in an audience of over 6.4 million.

Considering esports’ massive fan base, much of which tunes in via dedicated live streams, the pressure to deliver live productions that meet fan expectations is real for companies like WePlay Studios. With dual headquarters in Kyiv, Ukraine, and Los Angeles, California, the content production company fuses gaming, technology from companies like AJA Video Systems, and storytelling to craft gaming shows and esports tournament experiences for top-rated titles like Dota 2, CS:GO, Valorant, and Rocket League.

But, as WePlay has discovered, there is also huge demand for expertise in live event productions like The VTuber Awards, for which WePlay’s team completed a full virtual production last year. Hosted by Filian in partnership with talent management agency Mythic Talent, the five-hour event celebrated the top virtual creators online. WePlay Studios brought the event to audiences worldwide with a virtual broadcast, blending physical production facilities and equipment with extensive virtual production engineering and design.

“Storytelling and technological innovation drive every show we do, and we pride ourselves on creating iconic content that leaves a lasting viewer impression”, says Aleksii Gutiantov, Head of Virtual Production. “While we’d previously incorporated AR into live esports productions, this show marked our first foray into a fully virtual event managed with virtual cues; it’s the most challenging technological endeavor we’ve ever taken on.”

To pull off the event, Gutiantov coordinated the Los Angeles production remotely from his laptop in Europe, communicating over intercom with more than a dozen team members and orchestrating eight days of pre-production to deliver the broadcast. His team first created a real-time rendering of an entirely virtual Filian to incorporate into the live production using motion capture (mocap) technology. They tapped twenty witness cameras for full-body performance capture, including precise finger movements, and combined it with additional technology to stream facial mocap data.

The live event stream included a vast virtual arena, but Filian’s character performed on a smaller stage, encircled by a digitally reconstructed version of WePlay’s physical LA arena. To ensure every physical pan, tilt, and focus pull translated directly into the virtual render environment, WePlay Studios’ camera operators managed three cameras synced to virtual counterparts. Operators on the physical set could then switch among various angles within the virtual stadium using iPads connected to the virtual cameras, creating the illusion of a dozen cameras instead of three.
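
The physical-to-virtual camera link described above can be sketched roughly as follows. Everything here is an illustrative assumption — the class names, value ranges, and mapping are not WePlay’s actual pipeline code — but it shows the core idea: each tracked pan, tilt, and focus value from a real camera is copied onto a matching virtual camera each frame.

```python
from dataclasses import dataclass

# Hypothetical sketch, not WePlay's actual code: mirror a tracked physical
# camera (pan/tilt/focus) onto a virtual camera in a render environment.

@dataclass
class TrackedCameraSample:
    pan_deg: float     # horizontal rotation reported by the camera tracker
    tilt_deg: float    # vertical rotation
    focus_norm: float  # focus ring position, normalized 0.0-1.0

@dataclass
class VirtualCamera:
    yaw_deg: float = 0.0
    pitch_deg: float = 0.0
    focus_distance_m: float = 1.0

def apply_tracking(vcam: VirtualCamera, sample: TrackedCameraSample,
                   min_focus_m: float = 0.5,
                   max_focus_m: float = 50.0) -> VirtualCamera:
    """Copy each physical camera move onto the virtual render camera."""
    vcam.yaw_deg = sample.pan_deg
    vcam.pitch_deg = sample.tilt_deg
    # Map the normalized focus ring position onto a virtual focus distance.
    vcam.focus_distance_m = min_focus_m + sample.focus_norm * (max_focus_m - min_focus_m)
    return vcam
```

Run per frame for each of the three tracked cameras, this kind of mapping is what makes a physical focus pull land identically in the virtual arena.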

To make the production look more authentic, WePlay Studios linked the physical stage lights to corresponding virtual lights, allowing the team to manipulate the virtual stadium’s lighting environment from the same lighting control console that ran the real fixtures. Video playback was also integrated into the virtual world: software for live event visuals was connected to the virtual venue to launch and control the graphics displayed on the virtual stage’s screens. AJA Kona 5 video I/O cards played a crucial role in the 12G-SDI signal chain, and the final SDI feed was forwarded to an AJA Kumo 3232-12G video router for availability across the entire broadcast pipeline.
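
The lighting linkage can be illustrated with a minimal sketch. The channel numbers, virtual light names, and mapping below are invented for illustration only — the article does not describe WePlay’s actual implementation — but the principle is the same: dimmer levels from the console (e.g. 0–255 DMX-style values) are mirrored onto matching virtual lights, so one console cue drives both rigs.

```python
# Hypothetical sketch, not WePlay's actual code: mirror lighting-console
# dimmer levels onto matching virtual lights so the virtual stadium reacts
# to the same cues as the physical rig.

PHYSICAL_TO_VIRTUAL = {
    1: "virtual_key_light",     # console channel -> virtual light (assumed)
    2: "virtual_fill_light",
    3: "virtual_stadium_wash",
}

def mirror_lighting(console_frame: dict) -> dict:
    """Convert 0-255 console levels into 0.0-1.0 virtual light intensities."""
    virtual_levels = {}
    for channel, light_name in PHYSICAL_TO_VIRTUAL.items():
        level = console_frame.get(channel, 0)  # unpatched channels stay dark
        virtual_levels[light_name] = round(level / 255.0, 3)
    return virtual_levels
```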

“Our Kona 5 cards were instrumental in allowing us to receive 12G-SDI signals, integrate them into an Unreal Engine 5 environment, and composite the final in SDI”, says Gutiantov. “And, our Kumo routers let us build infrastructure for large remote and on-site productions like this one and manage everything from a single web interface thousands of kilometers away.”

Kona 5 enabled WePlay Studios’ team to use Unreal Engine to build a comprehensive virtual production hub capable of handling 12G-SDI workflows. This allowed them to fully harness AR technology, from camera tracking to motion capture and data-driven graphics, while ensuring live virtual production broadcasts without compositing mishaps. It also let them produce UltraHD fill and key signals from a single card in all common formats, using Pixotope as a keyer for 4K with the failover features familiar from FHD workflows.

In addition to Kona 5 and Kumo, WePlay also used a cluster of AJA Ki Pro Ultra 12G recorders to meet the recording standards demanded by the project without any interruptions. “The devices allow us to support multi-channel HD recording or single-channel UltraHD, and we can swap out recording media on the fly, which is convenient and reliable, especially for long-format live broadcasts and when clients require high-bitrate UltraHD materials for post”, says Gutiantov.

Because of the particular layout of WePlay Studios’ Los Angeles facility, the team developed a preview infrastructure comprising a series of AJA Mini-Converters to down-convert 12G-SDI signals and forward the resulting 3G-SDI signals to their AJA Kumo video router. Using AJA HD5DA SDI distribution amplifiers, the team then distributed the preview feeds across all arena monitors for more straightforward signal management.

The setup also used salvo routing configurations to switch groups of SDI signals at once, regardless of source, giving WePlay Studios precise control over the view of the production provided to its partners, talent, camera operators, motion capture team, and the entire production crew at any moment. AJA ROI-DP DisplayPort-to-SDI Mini-Converters proved a key part of this preview infrastructure, allowing the team to duplicate computer monitors into the broadcast pipeline with region-of-interest scaling during conversion.
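
The salvo concept mentioned above can be sketched generically. A salvo is a named preset that fires many destination-to-source crosspoint changes on a router in one action; the port numbers and preset names below are assumptions for illustration, not WePlay’s actual router configuration.

```python
# Hypothetical sketch of the "salvo" idea on an SDI router: a named preset
# that switches a whole group of destination <- source crosspoints at once.

SALVOS = {
    "talent_view": {1: 5, 2: 5, 3: 7},    # destination -> source (assumed)
    "mocap_view":  {1: 9, 2: 10, 3: 7},
}

def apply_salvo(router_state: dict, salvo_name: str) -> dict:
    """Return the router crosspoint state after firing the named salvo."""
    new_state = dict(router_state)            # untouched destinations keep
    new_state.update(SALVOS[salvo_name])      # their current sources
    return new_state
```

Firing one preset retargets every preview monitor in a group simultaneously, which is what makes it practical to hand different crews different views of the production on cue.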

WePlay Studios plans to open a new virtual production studio in Los Angeles this year dedicated to film and entertainment projects beyond gaming and esports. It will feature a screen area larger than 2,500 sq ft with a 1.8 mm pixel pitch and, utilizing advanced flip-chip technology, the first-ever Pantone-certified LED color pipeline.

(Photos: AJA Video Systems/WePlay Studios)

www.aja.com

www.weplayholding.com

© 1999 - 2024 Entertainment Technology Press Limited