Live VR Olympics Coverage Portends New Era of Panoramic Sports Viewing

NBC Leverages Intel Platform as Barriers to Bandwidth-Efficient Streaming Fall

By Fred Dawson

January 25, 2018 – Virtual reality is about to move farther than ever into mainstream entertainment with immersive coverage from the Winter Olympics kicking off what promises to be an escalating pace of wide-field sports broadcasts in 2018 and beyond.

NBC Sports and Intel are collaborating on plans to deliver 15 or more events live and another 15 or so in on-demand mode from the games in PyeongChang, South Korea, with VR coverage that allows viewers to look at anything that’s happening within a 180° field of vision at multiple viewing positions by simply turning their heads. “This is nothing short of reinventing the way fans engage with content,” says David Aufhauser, managing director at Intel Sports.

Previously reported technical advances and trials over the past year have set the stage for the introduction of TV-caliber network VR services, including episodic as well as sports programming. But it’s sports broadcasting where the path to creating experiences with mass market appeal seems most direct.

Much of what goes into such efforts will leverage consensus on approaches to producing and delivering content reached under the auspices of the Virtual Reality Industry Forum (VRIF). Although the forum’s first set of specifications, issued at CES in early January, targets on-demand use cases, specs for live use cases are under preparation for release by year’s end.

Achieving multi-platform interoperability is crucial to meeting the mass audience requirements of a broadcaster like NBC. Intel, a charter member of VRIF, has built a comprehensive suite of volumetric production capabilities that will support maximum audience reach in a wide range of VR scenarios, Aufhauser says.

“One of the things we focused on in developing our True View technology was to make sure it works across multiple devices and platforms,” he notes. This means not only delivering stereoscopic 3D viewing to people using different types of VR head-mounted devices (HMDs) but also enabling panoramic 2D viewing on handheld devices without HMDs.

Two major hurdles have been cleared to make such a service possible. Wide-field cameras capable of capturing 180° or even 360° of the visual space have eliminated the jarring discontinuities caused by stitching much narrower swaths of the space together. And VR experiences can now be delivered with new levels of bandwidth efficiency through a process known as “tiling,” whereby the transmitted field of vision, or “viewport” in VR parlance, is filled in at degrees of resolution mapped to how the eye registers its field of view in real life: full detail where the viewer is looking at any instant, progressively less toward the periphery.
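Neither NBC nor Intel has published the specifics of the tiling scheme in use, but the basic idea can be sketched in a few lines. In the illustrative sketch below, the tile layout, quality tiers and field-of-view values are assumptions, not production parameters: tiles inside the viewport are requested at full resolution, nearby tiles at a middle tier in case of a quick head turn, and everything else at minimal quality.

```python
import math
from dataclasses import dataclass

@dataclass
class Tile:
    """One spatial tile of the panoramic frame, identified by the yaw/pitch of its center (degrees)."""
    yaw: float
    pitch: float

def angular_distance(a_yaw, a_pitch, b_yaw, b_pitch):
    """Approximate angular separation between two viewing directions, in degrees."""
    d_yaw = min(abs(a_yaw - b_yaw), 360 - abs(a_yaw - b_yaw))
    return math.hypot(d_yaw, abs(a_pitch - b_pitch))

def pick_tile_qualities(tiles, view_yaw, view_pitch, fov=90.0):
    """Assign a quality tier to each tile based on how far it sits from the current viewport."""
    plan = {}
    for i, t in enumerate(tiles):
        sep = angular_distance(t.yaw, t.pitch, view_yaw, view_pitch)
        if sep <= fov / 2:
            plan[i] = "high"      # inside the viewport: full resolution
        elif sep <= fov:
            plan[i] = "medium"    # near the edge: ready for a quick head turn
        else:
            plan[i] = "low"       # out of view: minimal bandwidth
    return plan

# Example: eight tiles spread across a 180-degree field, viewer looking straight ahead
tiles = [Tile(yaw=-90 + 22.5 * i, pitch=0.0) for i in range(8)]
print(pick_tile_qualities(tiles, view_yaw=0.0, view_pitch=0.0))
```

As the viewer turns, the player simply re-runs the selection against the new viewing direction and fetches the next segments at the updated quality tiers, which is where the bandwidth savings over sending the full panorama at full resolution come from.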

Still to be resolved are the resolution limitations of HMD displays. While, as previously reported, new high-end HMDs support better resolution in the 2,400-by-1,200-pixel range, the level of resolution consumers are accustomed to with 4K or even HD TV panels is still a couple of years away, especially when it comes to mid- and low-priced HMDs.

Nonetheless, if what NBC Sports has cooked up performs as billed, it’s hard to imagine VR sports programming won’t soon be in high demand. Users who download the recently released Intel-based NBC Sports VR app will be able to watch marquee events every day in stereoscopic 3D VR mode using any of several HMD platforms, including Google Daydream View, Samsung Gear VR and Windows Mixed Reality, which runs on a variety of OEM HMDs designed for the Microsoft platform. The content will also be made available for non-HMD 2D viewing on iOS and Android devices with navigation executed by screen swiping across the field of vision.
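The swipe-driven “magic window” mode amounts to panning a virtual camera across the panorama instead of head tracking. A minimal sketch of the idea follows; the class name, sensitivity factor and field-of-view limit are hypothetical and not drawn from the NBC Sports VR app.

```python
class SwipeNavigator:
    """Toy model of 2D panoramic navigation: screen swipes pan a virtual camera."""

    DEG_PER_PIXEL = 0.1  # assumed swipe sensitivity

    def __init__(self, fov_limit=180.0):
        self.yaw = 0.0                     # left/right look angle, degrees
        self.pitch = 0.0                   # up/down look angle, degrees
        self.half_span = fov_limit / 2     # keep the view inside the captured field

    def on_swipe(self, dx_pixels, dy_pixels):
        """Translate a swipe gesture into a new viewing direction, clamped to the captured field."""
        self.yaw = max(-self.half_span, min(self.half_span,
                       self.yaw + dx_pixels * self.DEG_PER_PIXEL))
        self.pitch = max(-90.0, min(90.0,
                         self.pitch + dy_pixels * self.DEG_PER_PIXEL))
        return self.yaw, self.pitch

nav = SwipeNavigator()
print(nav.on_swipe(300, -50))  # a 300-pixel horizontal and 50-pixel vertical swipe -> (30.0, -5.0)
```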

Events selected for VR coverage, such as alpine skiing, snowboarding, ice hockey, and speed and figure skating, will be captured by camera pods at three to six locations per event. Viewers switching from one location to the next as an event unfolds will be able to look in all directions over the 180° field to see what’s going on.

Other features include: post-event highlights delivered for VR viewing as well as on-demand availability of the full VR-covered events; text providing names of athletes; real-time stats and leaderboards accessible in VR viewing mode; and audio integration that enables an immersive sound experience at each vantage point. The coverage will also include a director’s cut enabling a lean-back immersive experience with picture-in-picture support that facilitates toggling between self-selected views and the director’s cut.

All of this requires the ability to process an immense amount of data in real time, going well beyond what’s needed with traditional 2D live capture and streaming. This starts with mapping all the data generated by the “voxels,” the cubed volumetric pixels representing height, width and depth, into a volumetric rendition of the playing field and everything happening moment to moment. In addition, productions have to process all the metadata tied to each event and participant wherever a user chooses to look. And there’s a lot to handle when it comes to maintaining quality assurance, including ensuring smooth operations with the constant stream of data flowing back and forth between the user and the CDN to enable instantaneous view shifts with each turn of the head.
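Intel has not published the internals of its volumetric pipeline, but the voxel concept itself is straightforward: a three-dimensional grid of cells over the venue, each recording whether something is there and what color it is. The grid dimensions, cell size and helper function in the sketch below are purely illustrative.

```python
import numpy as np

# A toy volumetric grid over a rink-sized space. Each cell (voxel) records occupancy
# and color, capturing height, width and depth rather than a flat image.
GRID = (600, 300, 40)                       # x, y, z cells at roughly 10 cm per cell (illustrative)
occupancy = np.zeros(GRID, dtype=bool)
color = np.zeros(GRID + (3,), dtype=np.uint8)

def splat_point(x_m, y_m, z_m, rgb, cell_size=0.1):
    """Write one reconstructed surface point from the camera pods into the grid."""
    ix, iy, iz = int(x_m / cell_size), int(y_m / cell_size), int(z_m / cell_size)
    if 0 <= ix < GRID[0] and 0 <= iy < GRID[1] and 0 <= iz < GRID[2]:
        occupancy[ix, iy, iz] = True
        color[ix, iy, iz] = rgb

# A skater's shoulder reconstructed at (12.4 m, 7.2 m, 1.5 m), wearing red
splat_point(12.4, 7.2, 1.5, (200, 30, 30))
print(occupancy.sum(), "occupied voxels")
```

Doing this for an entire playing surface, many times per second, from dozens of camera feeds is what pushes the processing load so far beyond a conventional 2D broadcast.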

The mapping of the voxels, of course, is intrinsic to the True View production platform, but other aspects require support from third parties. “We’re pioneering new ways to interact and how the content is managed,” Aufhauser says. “How we manage all this data and operate the programs across multiple rights holders and multiple apps is a major challenge.”

Citing the support Intel is getting from online publisher Ooyala’s Flex media logistics platform, he adds, “Flex has been the right solution for that. We’re working with them on a day-to-day basis.”

“No matter how much metadata piles up, the end points have to be selective in finding and applying relevant information with each session in that location,” notes Glen Sakata, senior account executive at Ooyala. “You have to have a very dynamic way of handling this.”
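One way to picture the kind of selectivity Sakata describes is a per-session filter that keeps only the metadata tied to the event and camera-pod location the viewer is currently watching. The sketch below is illustrative only; the field names are hypothetical and do not reflect the Ooyala Flex schema.

```python
from typing import Iterable

def relevant_metadata(records: Iterable[dict], event_id: str, pod_id: str,
                      now: float, window_s: float = 30.0):
    """Keep only records for the viewer's current event and vantage point that are still fresh."""
    return [
        r for r in records
        if r.get("event") == event_id
        and pod_id in r.get("visible_from", [])
        and now - r.get("timestamp", 0) <= window_s
    ]

feed = [
    {"event": "speed_skating", "visible_from": ["pod3"], "timestamp": 100.0,
     "type": "leaderboard", "payload": {"leader": "Skater A", "time": "1:08.4"}},
    {"event": "ice_hockey", "visible_from": ["pod1"], "timestamp": 101.0,
     "type": "score", "payload": {"home": 2, "away": 1}},
]
print(relevant_metadata(feed, event_id="speed_skating", pod_id="pod3", now=120.0))
```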

The quality control aspects are especially daunting, he adds. “When things go wrong you have to know what to do. A lot of things can happen, whether it’s on the backbones or in the various cloud services – AWS, Alibaba, Azure. You can’t wait for people to push a button.”

Notwithstanding the challenges, Intel is throwing a lot of effort into VR. Last year the company began working with the National Football League and Major League Baseball to engage various teams in use of the technology to deliver game highlights. For example, Intel worked with 11 NFL teams using 30 to 50 5K JAI cameras around stadium perimeters to capture the entire field of action, enabling users to zoom in on whatever they wanted to watch during the replay.

At CES Intel announced the opening of a Los Angeles-based studio dedicated to VR and augmented reality (AR) productions. “With Intel Studios we’re going into the fully immersive world of video production,” Aufhauser says.

The facility features what is billed as the world’s largest volumetric production stage, a dome-shaped structure measuring 10,000 square feet at the base where a bevy of VR cameras feed captured data over fiber cables to Intel-powered servers capable of processing up to six terabytes per minute. Intel expects movie studios, including the first announced partner, Paramount, as well as broadcasters, ad agencies and other commercial content producers to make use of the facility for live as well as episodic productions.
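For a sense of scale, the stated ingest rate of six terabytes per minute works out to roughly 100 gigabytes, or 800 gigabits, per second of sustained capacity, as the quick check below shows.

```python
# Back-of-the-envelope check on the stated ingest rate of six terabytes per minute.
TB_PER_MINUTE = 6
bytes_per_second = TB_PER_MINUTE * 1e12 / 60       # 1.0e11 bytes/s, i.e. 100 GB/s
gigabits_per_second = bytes_per_second * 8 / 1e9   # 800 Gbps of sustained fiber capacity
print(f"{bytes_per_second / 1e9:.0f} GB/s, {gigabits_per_second:.0f} Gbps")
```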

Another signal as to what’s in store came during Intel CEO Brian Krzanich’s keynote speech at CES, where he was joined by former Dallas Cowboys quarterback Tony Romo in a demonstration of a True View “be-the-player” app. Without indicating when the capability will be commercialized, they showed how viewers can watch plays in a football game unfold from the moving vantage point of any player.

Aufhauser also points to social media applications as a major area of opportunity for the Intel technology. Indeed, according to Alexis Macktin, an analyst at VR research firm Greenlight Insights, social interaction in VR mode is a major area of interest. Among consumers who have a strong interest in VR, “67 percent are interested in being able to interact socially,” Macktin says. “When we asked active users who are interested in using VR features every day to name what the daily use cases would be, social features were mentioned more than any others.”

Contrary to fears that VR threatens to isolate people from one another, Macktin says the social experience has become an important component of people’s engagement with VR in public locations, such as theme parks, IMAX centers, kiosks and, especially in the APAC region, Internet cafes. “Location-based VR is bringing people together in the VR experience,” she says, noting that Greenlight is projecting this industry segment will be generating $8 billion in annual global revenues by 2022.

Macktin also notes the role industrial use of VR is playing in popularizing the technology. “We expect that sector to grow a lot this year,” she says. Overall, Greenlight believes VR in all its permutations will be a $175-billion contributor to the global economy by 2022.

At this early stage it’s the 360° immersive viewing experience with entertainment content that is galvanizing the most consumer attention. “360° video is the gateway for consumers and for brands as well,” Macktin says. In its 2017 U.S. consumer adoption surveys, the firm found that 44 percent had seen 360° video.

No wonder, then, that the VRIF has focused its first set of specifications, released at CES, on defining an interoperable environment for producing and distributing immersive 360° content. In so doing, the group has devoted a lot of attention to enabling a practical delivery mode for such content that avoids the vertigo-inducing practices that plagued early iterations of VR.

“We don’t want to upset the consumer, make anyone ill in any way,” says Sky broadcast chief engineer Chris Johns, a VRIF vice president who serves as co-chair of the forum’s Production Task Force. “We want an entertaining, interactive experience.”

The speed at which technological advances have brought VR to the point of practical operations in the streaming services arena has taken many people by surprise. NAGRA, for example, engaged with the VRIF early on but pulled back in light of the uncertainties surrounding the technology.

“VR is an area where we’ve taken more of a follow position, mindful of the 3-D risk from a few years ago,” says Simone Trudelle, senior product marketing director at NAGRA. “But at this stage it’s clear VR has gone through all the right gates and will become an industry of its own that’s tied to traditional content distribution. The VR Forum is really drilling down on the details and optimizing VR for mass distribution. We’re currently in the process of rejoining VRIF.”

Confirmation that the Olympic VRcast could be the start of a major trend can be found in network service providers’ preparations for utilizing managed broadband networks to provide the bandwidth and super low-latency bi-directional communications essential to delivering a compelling volumetric experience. “This technology represents an extremely promising and powerful opportunity, and it is imperative that we work together to create a powerful experience for users out of the gate,” says Christian Egeler, director of XR (Extended Reality) product development at Verizon, which is a member of the VRIF.

Cisco Systems has a bird’s-eye view of these efforts, notes Sean Welch, vice president and general manager for cable access solutions in Cisco’s Service Provider Business group. “We’re engaged with four cable MSOs that are looking at providing 360° services,” Welch says.