How your local weather forecast is using Hollywood tricks to get more extreme

Good morning, and welcome to Protocol Next Up, a weekly newsletter about the future of technology and entertainment. This week, Next Up is all about advancements in virtual TV production and the potential that AI has to democratize video editing.
Coming up on Oct. 28: TV's Tipping Point, Protocol's first online event about the future of tech and entertainment, featuring CBS News Digital GM & EVP Christy Tanner, Tubi CEO Farhad Massoudi, Cinedigm President Erick Opeka and Wurl CEO Sean Doherty. RSVP now!
On Wednesday, The Weather Channel meteorologist Stephanie Abrams appeared to report from Miami on a budding storm, complete with heavy downpours and blinding lightning. Only, Abrams didn't have to don a raincoat for the segment. Instead, she filmed it in the network's studio in Atlanta using mixed reality. "I'm glad I get to be in the weather without getting wet," Abrams said during a press briefing this week.
The Weather Channel has used mixed reality to report about extreme weather events for a little over two years now, with clips about tornadoes and storm surges clocking millions of views on YouTube. In June, the network doubled down on those efforts with a new virtual studio that has since been used to air a dozen or so mixed reality segments every day.
This week, The Weather Channel began using the technology for local forecasts as well, putting the network at the forefront of a new trend to embrace mixed reality and virtual production technologies for live television broadcasts. "It's a more effective way to communicate what the forecast is," Director of Weather Presentation Mike Chesterfield said.
Here's how the Weather Channel's new Virtual View local forecasts work:
Hollywood is increasingly embracing Unreal and other virtual production environments for visual effects.
But TV, and especially unscripted daytime television, has been slower to adopt real-time production. With smaller budgets and tight turnaround schedules, the industry has largely stuck to familiar production workflows and has often treated new technology as a gimmicky add-on (get ready for Election Day holograms).
However, The Weather Channel is arguably the furthest ahead when it comes to embracing this type of technology for daily productions, and in turn it demonstrates where the entire industry is likely headed in the coming years. That, fittingly, aligns very well with the core mission of a network that is all about forecasting.
"Quest was promising for VR software sales, but Quest 2 appears to be life changing." UploadVR editor Ian Hamilton on early feedback from VR developers on the massive impact the launch of the Quest 2 headset has had on their app sales — something my colleague Shakeel Hashim has been hearing as well.
"Think of that as a minor background effect." Netflix co-CEO Reed Hastings, during the company's Q3 2020 earnings call, about the long-term effect COVID will have on the company's business.
Join Janko Roettgers next Wednesday at noon ET to answer the question: Has TV reached a tipping point? You will hear from industry experts including Tubi founder and CEO Farhad Massoudi, Cinedigm President Erick Opeka, Wurl CEO Sean Doherty and CBS News Digital EVP and GM Christy Tanner. The event is presented by Roku.
Can't dance? You're not alone.
"Syncing up music and dancing can be hard," said Adobe Research Scientist Jimei Yang during a recent interview. Not only can holding the beat be challenging for some people, but using consumer-grade recording equipment can also introduce additional delays that make the result look off-beat. "It isn't that trivial," Yang said.
That's why Yang's team at Adobe developed technology that can tweak an existing video and make dance moves match the rhythm of the music. The researchers demonstrated their work, dubbed "Project On The Beat," at Adobe's annual Max conference, but Yang gave Protocol an exclusive preview last week.
This is how Adobe's dance improvement algorithms work in a nutshell:
The technology can also be used to take clips and adjust movements to a different song, or to take multiple clips from different sources and adjust all of them to the same piece of music. This even works with videos that only show a person's upper body.
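Adobe hasn't published the details of Project On The Beat, but the general idea Yang describes, nudging the picture so the dancer's moves land on the music's beats, can be sketched with off-the-shelf tools. The Python sketch below is purely illustrative and not Adobe's pipeline: it uses librosa's beat tracker on the audio, a crude frame-difference "motion energy" signal in place of real pose tracking, and a piecewise-linear time warp to retime the video. The file names and thresholds are placeholders.

```python
# Illustrative sketch only: retime a dance clip so motion peaks land on detected
# beats. This is NOT Adobe's method, just one plausible version of the idea.
import cv2                      # pip install opencv-python
import librosa                  # pip install librosa
import numpy as np
from scipy.signal import find_peaks

VIDEO_PATH = "dance_clip.mp4"   # hypothetical input clip
AUDIO_PATH = "dance_clip.wav"   # its audio track, extracted beforehand

# 1. Find the beats in the music.
audio, sr = librosa.load(AUDIO_PATH)
_, beat_frames = librosa.beat.beat_track(y=audio, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

# 2. Estimate when the dancer "hits" a move, using frame-to-frame motion
#    energy as a cheap stand-in for real pose tracking.
cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS)
frames, energy, prev = [], [], None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(cv2.resize(frame, (160, 90)), cv2.COLOR_BGR2GRAY).astype(np.float32)
    if prev is not None:
        energy.append(np.abs(gray - prev).mean())
    frames.append(frame)
    prev = gray
cap.release()

energy = np.array(energy)
move_peaks, _ = find_peaks(energy, distance=int(fps / 4))  # candidate "hits"
move_times = move_peaks / fps

# 3. Pair each motion peak with the nearest beat and build a monotonic,
#    piecewise-linear time remap (source time -> output time).
src_anchors, dst_anchors = [0.0], [0.0]
for t in move_times:
    nearest_beat = float(beat_times[np.argmin(np.abs(beat_times - t))])
    if t > src_anchors[-1] and nearest_beat > dst_anchors[-1]:
        src_anchors.append(float(t))
        dst_anchors.append(nearest_beat)
duration = len(frames) / fps
if duration > src_anchors[-1] and duration > dst_anchors[-1]:
    src_anchors.append(duration)
    dst_anchors.append(duration)

# 4. Write the warped video: for each output timestamp, look up which source
#    frame to show. (Muxing the original audio back in is left out here.)
h, w = frames[0].shape[:2]
out = cv2.VideoWriter("on_the_beat.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
out_times = np.arange(len(frames)) / fps
src_times = np.interp(out_times, dst_anchors, src_anchors)
for t in src_times:
    out.write(frames[min(int(round(t * fps)), len(frames) - 1)])
out.release()
```

A production tool would presumably track body joints rather than raw pixel changes and synthesize in-between frames instead of duplicating them, but the underlying retiming idea is the same.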
Like much of the technology shown during these Adobe Max research demos, Yang's work isn't part of any Adobe product yet. However, that could change. "We are actively working with the product team," Yang said. One could imagine it as an addition to a future version of Adobe Premiere Rush, the company's mobile video-editing app for influencers and other social media users.
Beyond any such specific use cases, the demo shows the potential for AI to automate and ultimately democratize video editing.
Quibi is shutting down. Even $1.75 billion in funding wasn't enough to sell people on a short-form video service that banked on mobile viewing during shelter-in-place.
Disney's international plans put Hulu on the back burner. Hulu always wanted to expand internationally. Disney will push its Indian Star brand instead, Bloomberg reports.
Sonos smart speakers are integrating with GE appliances. Embracing IoT is an interesting move for Sonos, but will the company be able to catch up with Google, Apple and Amazon?
On Protocol: Amazon wants you to use Alexa to buy what you see on TV. People have been talking about this idea forever. There's even a ghastly name for it (T-Commerce). But will it ever actually happen?
Also on Protocol: Netflix plans a 48-hour free StreamFest promotional event. The company is going to test this idea in India first, according to COO Greg Peters.
The average video subscriber will have 5.7 subscriptions by 2024, up from 4.1 today. This and lots of other interesting tidbits can be found in Activate's new Technology & Media Outlook.
Microsoft researchers have built a haptics gizmo that makes throwing a ball in VR more believable. Neat! Now they just need to do this for every other object in the world.
Amazon is improving the Alexa experience on Fire TV. With these updates, TVs work a bit more like smart displays. Next up: More TVs with far-field microphones?
The first time I ever tried a Magic Leap AR headset, I was amazed by a demo that featured a portal: an AR overlay that made it look like someone had ripped a hole in the floor right in front of me, opening up our world to another dimension full of strange creatures. Then, I saw the same kind of portal in another demo. And another. It quickly became a kind of AR parlor trick, used by developers to gloss over some of the other shortcomings of the medium.
However, when I recently saw a tweet from self-declared mad scientist Lucas Rizzotto about building an AR portal to keep in touch with his best friend during quarantine, I realized I had it all wrong. Portals are actually great, and even better if you have an IoT sledgehammer! Go watch the full video on YouTube, and then dream up your own portal — but perhaps wait for the technology to advance a bit before you bust out the sledgehammer.
Thanks for reading — see you next week!