Protocol Next Up
Defining the future of tech and entertainment with Janko Roettgers.

How your local weather forecast is using Hollywood tricks to get more extreme

Good morning, and welcome to Protocol Next Up, a weekly newsletter about the future of technology and entertainment. This week, Next Up is all about advancements in virtual TV production and the potential that AI has to democratize video editing.

Coming up on Oct. 28: TV's Tipping Point, Protocol's first online event about the future of tech and entertainment, featuring CBS News Digital GM & EVP Christy Tanner, Tubi CEO Farhad Massoudi, Cinedigm President Erick Opeka and Wurl CEO Sean Doherty. RSVP now!

The Big Story

How The Weather Channel uses mixed reality for local forecasts

On Wednesday, The Weather Channel meteorologist Stephanie Abrams went to Miami to report on a budding storm, complete with heavy downpours and blinding lightning. Only, Abrams never had to don a raincoat for the segment. Instead, she filmed it in the network's studio in Atlanta using mixed reality. "I'm glad I get to be in the weather without getting wet," Abrams said during a press briefing this week.

The Weather Channel has used mixed reality to report about extreme weather events for a little over two years now, with clips about tornadoes and storm surges clocking millions of views on YouTube. In June, the network doubled down on those efforts with a new virtual studio that has since been used to air a dozen or so mixed reality segments every day.

This week, The Weather Channel began using the technology for local forecasts as well, putting the network at the forefront of a broader push to use mixed reality and virtual production technologies in live television broadcasts. "It's a more effective way to communicate what the forecast is," Director of Weather Presentation Mike Chesterfield said.

Here's how The Weather Channel's new Virtual View local forecasts work:

  • The network went to 50 cities across the country, including New York, Los Angeles and Chicago, to shoot photos that could be used as virtual backgrounds. Photographers had to keep a close eye on view distance, exposure and lighting conditions to make sure the photos would work, explained Senior Technical Artist Warren Drones: "You can't just take any form of imagery."
  • At the same time, picking iconic locations was key. "We actually want it to look like New York," Drones said. That imagery, and the associated lighting data, was then imported into Epic's Unreal Engine, which is core to The Weather Channel's real-time production.
  • In the studio in Atlanta, Abrams and her colleagues record Virtual View segments on a 540-square-foot green screen stage. They're filmed with three cameras equipped with trackers that make it possible to reproduce every camera move in the virtual environment.
  • This green-screen footage is combined in real time with the on-location photos and visual effects, such as rain and virtual screens. On-screen meteorologists get to see the result on multiple monitors, but otherwise only have a few taped markers on the ground to orient themselves in the barren green-screen environment. "You have to have a lot of hand-eye coordination," Abrams said.
  • Viewers get to see meteorologists standing on a simulated round gray disc that is being used as a "cone of safety" — a visual aid to signal that Abrams isn't actually in danger of getting struck by lightning.
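The real-time compositing step at the heart of this pipeline can be illustrated with a minimal chroma keyer: pixels close to the key color are swapped for the on-location background plate. This is a toy NumPy sketch under obvious simplifications; The Weather Channel's actual pipeline runs inside Unreal Engine with tracked-camera data, and the function name and tolerance value here are hypothetical, chosen for illustration.

```python
# Minimal green-screen composite: pixels within a color distance of the
# key color are replaced by the corresponding background-plate pixels.
import numpy as np

def chroma_key_composite(frame, background, key=(0, 255, 0), tol=80):
    """Replace pixels within `tol` of the key color with the background."""
    dist = np.linalg.norm(frame.astype(int) - np.array(key), axis=-1)
    mask = dist < tol                      # True where the green screen shows
    out = frame.copy()
    out[mask] = background[mask]           # pull in the on-location plate
    return out

# Synthetic 4x4 test frame: all green screen except one "presenter" pixel.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[:] = (0, 255, 0)                     # green screen
frame[1, 1] = (200, 150, 120)              # skin-tone pixel survives the key
background = np.full((4, 4, 3), 90, dtype=np.uint8)

result = chroma_key_composite(frame, background)
```

A production keyer would also soften mask edges, suppress green spill, and reproject the background plate using the tracked camera pose so that parallax stays consistent as the cameras move.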

Hollywood is increasingly embracing Unreal and other virtual production environments for visual effects.

  • Lucasfilm's ILM has developed elaborate virtual production tools to shoot effects-laden scenes for blockbuster franchises like "Star Wars" and the Marvel Cinematic Universe.
  • Directors like the technology because it allows them to make visual effects an essential part of the shoot, as opposed to something that is being added to footage after the fact.

But TV, and especially unscripted daytime television, has been slower to adopt real-time production. With smaller budgets and tight turnaround schedules, the industry has largely stuck to known production workflows, and has often treated new technology as a gimmicky add-on (get ready for Election Day holograms).

However, The Weather Channel is arguably furthest ahead in embracing this type of technology for daily productions, and it demonstrates where the entire industry is likely headed in the coming years. Fittingly, that aligns with the core mission of a network that is all about forecasting.

  • "Our job is to tell the future," the company's Chief Content Officer Nora Zimmett said.

Overheard

"Quest was promising for VR software sales, but Quest 2 appears to be life changing." UploadVR editor Ian Hamilton on early feedback from VR developers on the massive impact the launch of the Quest 2 headset has had on their app sales — something my colleague Shakeel Hashim has been hearing as well.

"Think of that as a minor background effect." Netflix co-CEO Reed Hastings, during the company's Q3 2020 earnings call, about the long-term effect COVID will have on the company's business.

JOIN US NEXT WEEK

TV's Tipping Point

Join Janko Roettgers next Wednesday at noon ET to answer the question: Has TV reached a tipping point? You will hear from industry experts including Tubi founder and CEO Farhad Massoudi, Cinedigm President Erick Opeka, Wurl CEO Sean Doherty and CBS News Digital EVP and GM Christy Tanner. The event is presented by Roku.

RSVP here.

Watch Out

Adobe wants to use AI to make you a better dancer

Can't dance? You're not alone.

"Syncing up music and dancing can be hard," said Adobe Research Scientist Jimei Yang during a recent interview. Not only can keeping the beat be challenging for some people, but consumer-grade recording equipment can also introduce delays that make the result look off-beat. "It isn't that trivial," Yang said.

That's why Yang's team at Adobe developed technology that can tweak an existing video and make dance moves match the rhythm of the music. The researchers demonstrated their work, dubbed "Project On The Beat," at Adobe's annual Max conference, but Yang gave Protocol an exclusive preview last week.

This is how Adobe's dance improvement algorithms work in a nutshell:

  • First, the project uses Adobe's Sensei AI tech for skeletal tracking to identify the motions of the dancer.
  • Then, it breaks those motions down to find the strongest movements — think claps and stomps — which Adobe's researchers call "motion beats."
  • The video is then segmented around these motion beats, which are compared to the actual beats of the song.
  • Finally, the video is adjusted segment by segment to match the motion beats to the music. "We just need to retime the video, basically warp the video," Yang said.
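The steps above can be sketched as a toy retiming pass: find local peaks in per-frame motion energy (the "motion beats"), snap each one to the nearest audio beat, and derive a time-stretch factor for every segment in between. The function names, threshold, and data here are illustrative assumptions, not Adobe's actual Sensei-based implementation.

```python
# Toy version of beat-synchronized retiming: detect motion beats, snap them
# to audio beats, and compute per-segment time-stretch factors.

def motion_beats(energy, threshold=0.5):
    """Indices where per-frame motion energy peaks above a threshold."""
    return [i for i in range(1, len(energy) - 1)
            if energy[i] > threshold
            and energy[i] >= energy[i - 1]
            and energy[i] >= energy[i + 1]]

def stretch_factors(motion_times, audio_beats):
    """Stretch factor per segment so motion beats land on audio beats."""
    snapped = [min(audio_beats, key=lambda b: abs(b - t)) for t in motion_times]
    factors = []
    for (m0, m1), (s0, s1) in zip(zip(motion_times, motion_times[1:]),
                                  zip(snapped, snapped[1:])):
        factors.append((s1 - s0) / (m1 - m0))  # >1 slows the clip, <1 speeds it up
    return factors

energy = [0.1, 0.9, 0.2, 0.1, 0.8, 0.1]        # per-frame motion energy
beats = motion_beats(energy)                   # claps/stomps -> [1, 4]
factors = stretch_factors([0.6, 2.0], [0.0, 1.0, 2.0, 3.0])
```

The final warp in a real system would resample video frames segment by segment using these factors, which matches Yang's description of "retiming" rather than re-rendering the dancer.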

The technology can also be used to take clips and adjust their movements to a different song, or to take multiple clips from different sources and sync all of them to the same piece of music. This even works with videos that only show a person's upper body.

Like much of the technology shown during these Adobe Max research demos, Yang's work isn't part of any Adobe product yet. However, that could change. "We are actively working with the product team," Yang said. One could imagine it as an addition to a future version of Adobe Premiere Rush, the company's mobile video-editing app for influencers and other social media users.

Beyond any such specific use cases, the demo shows the potential for AI to automate and ultimately democratize video editing.

  • Until now, video editors have had to manually segment and adjust each part of a video to synchronize music and movement using desktop video-editing software.
  • By letting AI do the work, the same can be done faster, and on mobile devices, literally putting this kind of video editing into many more hands. "AI is really changing the video editing business, especially for amateurs," Yang said.

Fast Forward

Quibi is shutting down. Even $1.75 billion in funding wasn't enough to convince people to pay for a short-form video service that banked on mobile viewing during shelter-in-place.

Disney's international plans put Hulu on the back burner. Hulu always wanted to expand internationally. Disney will push its Indian Star brand instead, Bloomberg reports.

Sonos smart speakers are integrating with GE appliances. Embracing IoT is an interesting move for Sonos, but will the company be able to catch up with Google, Apple and Amazon?

On Protocol: Amazon wants you to use Alexa to buy what you see on TV. People have been talking about this idea forever. There's even a ghastly name for it (T-Commerce). But will it ever actually happen?

Also on Protocol: Netflix plans a 48-hour free StreamFest promotional event. The company is going to test this idea in India first, according to COO Greg Peters.

The average video subscriber will have 5.7 subscriptions by 2024, up from 4.1 today. This and lots of other interesting tidbits can be found in Activate's new Technology & Media Outlook.

Microsoft researchers have built a haptics gizmo that makes throwing a ball in VR more believable. Neat! Now they just need to do this for every other object in the world.

Amazon is improving the Alexa experience on Fire TV. With these updates, TVs work a bit more like smart displays. Next up: More TVs with far-field microphones?

Auf Wiedersehen

The first time I ever tried a Magic Leap AR headset, I was amazed by a demo that featured a portal: an AR overlay that made it look like someone had ripped a hole in the floor right in front of me, opening up our world to another dimension full of strange creatures. Then, I saw the same kind of portal in another demo. And another. It quickly became a kind of AR parlor trick, used by developers to brush over some of the other shortcomings of the medium.

However, when I recently saw a tweet from self-declared mad scientist Lucas Rizotto about building an AR portal to keep in touch with his best friend during quarantine, I realized I had it all wrong. Portals are actually great, and even better if you have an IoT sledgehammer! Go watch the full video on YouTube, and then dream up your own portal — but perhaps wait for the technology to advance a bit before you bust out the sledgehammer.

Thanks for reading — see you next week!
