
Adobe wants to use AI to make you a better dancer

The company's new tech demo could improve everyone's TikTok videos — and hints at the potential for AI to democratize video editing and visual effects.

Image: Adobe

Can't dance? You're not alone.

"Syncing up music and dancing can be hard," said Adobe Research Scientist Jimei Yang during a recent interview. Not only can holding the beat be challenging for some people, but using consumer-grade recording equipment can also introduce additional delays that make the result look off-beat. "It isn't that trivial," Yang said.

That's why Yang's team at Adobe developed technology that can tweak an existing video and make dance moves match the rhythm of the music. The researchers are set to demonstrate their work, dubbed "Project On The Beat," at Adobe's annual Max conference Wednesday, but Yang gave Protocol an exclusive preview last week.


Here's how Adobe's dance-improvement algorithm works, in a nutshell:

  • First, the project uses Adobe's Sensei AI tech for skeletal tracking to identify the motions of the dancer.
  • Then, it breaks those motions down to find the strongest movements — think claps and stomps — which Adobe's researchers call "motion beats."
  • The video is segmented at these key poses, and the motion beats are compared to the actual beats of the song.
  • Finally, the video is adjusted segment by segment to match the motion beats to the music. "We just need to retime the video, basically warp the video," Yang said.
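The retiming step described above can be sketched in a few lines. This is a minimal illustration, not Adobe's code: it assumes the motion beats (from skeletal tracking) and the song's beats have already been detected and are given as timestamps in seconds; the function names are hypothetical.

```python
# Sketch of segment-by-segment retiming: warp each stretch of video between
# consecutive motion beats so that every motion beat lands on a music beat.

def warp_factors(motion_beats, music_beats):
    """Per-segment speed factor that maps each motion segment onto the
    corresponding music segment (>1 means the video is sped up)."""
    assert len(motion_beats) == len(music_beats)
    factors = []
    for i in range(len(motion_beats) - 1):
        motion_len = motion_beats[i + 1] - motion_beats[i]
        music_len = music_beats[i + 1] - music_beats[i]
        factors.append(motion_len / music_len)
    return factors

def retime(t, motion_beats, music_beats):
    """Map a timestamp t in the original video to its position in the
    warped video, interpolating linearly within each segment."""
    for i in range(len(motion_beats) - 1):
        if motion_beats[i] <= t <= motion_beats[i + 1]:
            frac = (t - motion_beats[i]) / (motion_beats[i + 1] - motion_beats[i])
            return music_beats[i] + frac * (music_beats[i + 1] - music_beats[i])
    raise ValueError("timestamp outside tracked beats")

# A dancer claps at 0.0s, 1.1s and 2.3s; the song's beats fall at 0.0s, 1.0s, 2.0s.
print(retime(1.1, [0.0, 1.1, 2.3], [0.0, 1.0, 2.0]))  # 1.0 — the clap now lands on the beat
```

A production version would resample frames (or adjust presentation timestamps) according to these per-segment factors rather than remapping single timestamps, but the warping arithmetic is the same.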

The technology can also be used to adjust a clip's movements to a different song, or to take multiple clips from different sources and sync all of them to the same piece of music. It even works with videos that only show a person's upper body.

Like much of the technology shown during these Adobe Max research demos, Yang's work isn't part of any Adobe product yet. However, that could change. "We are actively working with the product team," Yang said. One could imagine it as an addition to a future version of Adobe Premiere Rush, the company's mobile video-editing app for influencers and other social media users.

Beyond any such specific use cases, the demo shows the potential for AI to automate and ultimately democratize video editing. Until now, video editors have had to manually segment and adjust each part of a video in desktop editing software to synchronize music and movement. With AI doing that work, the same edits can be made faster, and on mobile devices, literally putting this kind of video editing into many more hands. "AI is really changing the video editing business, especially for amateurs," Yang said.

A version of this story will appear in our entertainment newsletter, Next Up. Sign up here to get it in your inbox every Thursday.
