May 2, 2024 - The global esports market is booming and projected to grow to $9.29 billion by 2032[1], as the active esports player count continues to expand worldwide and championship matches like the 2023 League of Legends World Championship draw audiences topping 6.4 million viewers. While esports arena patrons contribute to these audience numbers, most viewers tune in via a live stream. With such a large global fanbase watching from afar, the pressure is on to create exceptional quality live productions, which is WePlay’s sweet spot. We recently sat down with the content production company’s Head of Virtual Production Aleksii Gutiantov to talk about their innovative live event production work, including a fully virtual venture it took on for The VTuber Awards 2023 that was supported by a ton of AJA gear. We’ve compiled key interview highlights below:
Tell us more about WePlay Studios.
We’re an award-winning content production company (previously known as WePlay Esports) that fuses gaming, technology, and storytelling to craft second-to-none viewer experiences. We’ve made our mark by organizing memorable gaming shows and esports tournament experiences for top-rated titles like Dota2, CS:GO, Valorant, and Rocket League. Our efforts have earned us a Sports Emmy Award nomination and many other accolades, including NYX, Muse, and Reagan Video Awards, among others. Presently, we operate in Europe and North America, with dual headquarters in Kyiv, Ukraine, and Los Angeles, California.
What kind of clients do you serve?
Our focus is on producing creator-driven gaming shows, a venture bolstered by strategic partnerships with OTK streamers, who hold 7 percent of the entire Twitch gaming audience, and Grammy-winning Music Producer Larrance Dopson. These collaborations have led to additional work on projects like NFL Tuesday Night Gaming; the esports season for Genshin Impact; launch shows for new miHoYo game releases; and gaming shows with OTK – including the OTK Game Expo, Wheel of Chaos, and the Awards.
What makes WePlay unique?
Storytelling and technological innovation drive every show we do, and we pride ourselves on creating iconic content that leaves a lasting viewer impression. We believe that while 80 percent of an esports broadcast might naturally focus on the game, the remaining 20 percent offers an opportunity to create a truly memorable and engaging audience experience. Unlike many esports events that follow a standard template with a familiar aesthetic and structure, WePlay's events feature distinct identities; we delve beyond the games to tap into additional audience interests and weave in compelling narratives that transform broadcasts into adventures. For our AniMajor event, we married the look of Dota2 with anime, resulting in an exceptional quality production featuring anime-styled team intros, an opening ceremony teeming with Easter eggs, and bespoke content specifically crafted for the tournament.
Describe your history and current role with the company.
I lead the Virtual Production Department, where I introduce innovations to WePlay and enhance our virtual production and augmented reality (AR) offering across live broadcasts and gaming shows. I joined WePlay in 2018, initially as a freelancer specializing in virtual production and augmented reality for broadcasts, and have since worked closely with Maksym Bilonogov and Yura Lazebnikov on a number of small-scale esports events. It was during these early collaborations that WePlay began to explore the use of virtual production techniques to create the added value of real-time game analytics and data-driven AR graphics for broadcasts.
This successful partnership eventually led to my promotion to Head of the Virtual Production Division. Over the years, I've steered the integration of new virtual production technologies across our global studios, which has transformed the way we tell stories for live events and enabled us to extend practical set locations with AR during filming. Such advancements have made it possible for even the smallest analytical studios at minor events to present game statistics in a visually engaging, easily understandable format for our audience.
I've also devoted a lot of time to developing cost-efficient methods for creating render clusters to handle real-time content processing for broadcasts. This involves customizing our approach to meet unique venue demands, enhancing VP technology efficiency, and ensuring high-quality production amidst the fast-paced and challenging environment of live esports tournaments. My responsibilities today include ensuring smooth and continuous content production for a broad range of WePlay projects; overseeing the management, design, and optimization of technological routines within my Virtual Production Division; developing new visions for the department and our annual budget; handling procurement through tender procedures; and managing collaboration with foreign contractors. Part of my role also involves providing our team with training and support on all of our technology, including our AJA tools.
Are there any projects you can tell us about?
The VTuber Awards 2023, hosted by Filian in partnership with talent management agency Mythic, comes to mind. It wasn’t just a production but a milestone technology achievement. The five-hour show blended physical production facilities with extensive engineering and design. While we’d previously incorporated AR into live productions, this show marked our first foray into a fully virtual event managed with virtual cues; it’s the most challenging technological endeavor I’ve ever taken on. I managed and coordinated everything remotely from my laptop in Europe, communicating over intercom with more than 16 team members and orchestrating eight days of non-stop pre-production to deliver a several-hour-long broadcast.
What did the project involve?
We facilitated real-time rendering of a fully virtual VTuber character into a live virtual production using Vicon technology with 20 witness cameras for comprehensive, full-body performance capture, including precise finger movements, combined with ARKit for facial mocap data streaming. Our initial ambition was to craft a virtual character that could harmonize with and stand out from the event stage's visual ambiance, and amplify it with dynamic lighting effects.
We used AJA technology for the preview infrastructure. Due to the unique setup of our arena and its preview infrastructure requirements, we employed cross converter devices to down-convert 12G-SDI signals and forward 3G-SDI signals to an AJA KUMO 3232 video router. Then, through AJA HD5DA SDI distribution amplifiers, we spread preview signals across all arena monitors. This configuration allowed for straightforward management of all preview signals via a user-friendly web interface, enabling precise control over what our partners, talent, camera operators, our motion capture team, and the entire production crew saw at any moment, using pre-programmed salvo routing configurations for SDI signals regardless of the data source's nature. To bring computer monitor feeds into that broadcast pipeline, we used AJA ROI-DP DisplayPort to SDI Mini-Converters, which handle the DisplayPort to SDI conversion with region-of-interest scaling.
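To make the salvo idea concrete: a salvo is a pre-programmed batch of crosspoint changes (destination-to-source assignments) fired as one action, so an operator can repoint every preview monitor in the arena at once. The sketch below is illustrative only, modeled loosely on the eParamID-style HTTP interface that AJA routers expose through their web UI; the parameter names, endpoint, port numbers, and salvo map are assumptions, not WePlay's actual configuration.

```python
# Hypothetical sketch of salvo routing for an HTTP-controlled SDI router.
# The eParamID naming and the /config endpoint are assumptions modeled on
# AJA's web-based router control; consult the device manual before use.
from urllib.parse import urlencode
from urllib.request import urlopen

# Example salvos: router destination port -> source port (both 1-based).
# These mappings are invented for illustration.
SALVOS = {
    "talent_preview": {1: 5, 2: 5, 3: 7},    # program + mocap feed to talent monitors
    "crew_multiview": {1: 9, 2: 10, 3: 11},  # per-camera isos for operators
}

def salvo_commands(name, salvos=SALVOS):
    """Translate a named salvo into (paramid, value) crosspoint settings."""
    return [
        (f"eParamID_XPT_Destination{dst}_Status", src)
        for dst, src in sorted(salvos[name].items())
    ]

def apply_salvo(router_ip, name):
    """Fire every crosspoint change in the salvo against the router's
    web interface, one HTTP GET per destination (assumed endpoint)."""
    for paramid, value in salvo_commands(name):
        query = urlencode({"action": "set", "paramid": paramid, "value": value})
        urlopen(f"http://{router_ip}/config?{query}")
```

Keeping the salvo definitions as plain data separate from the transport code makes it easy to review and re-order the show's routing states during pre-production without touching the control logic.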
...
CONTINUED