Enterprise

Intel calls its AI that detects student emotions a teaching tool. Others call it 'morally reprehensible.'

Virtual school software startup Classroom Technologies will test the controversial “emotion AI” technology.


The system can detect whether students are bored, distracted or confused.

Illustration: Christopher T. Fong/Protocol

When college instructor Angela Dancey wants to decipher whether her first-year English students comprehend what she’s trying to get across in class, their facial expressions and body language don’t reveal much.

"Even in an in-person class, students can be difficult to read. Typically, undergraduates don't communicate much through their faces, especially a lack of understanding,” said Dancey, a senior lecturer at the University of Illinois Chicago.

Dancey uses tried-and-true methods such as asking students to identify their “muddiest point” — a concept or idea they still find unclear — following a lecture or discussion. “I ask them to write it down, share it and we address it as a class for everyone’s benefit,” she said.

But Intel and Classroom Technologies, which sells virtual school software called Class, think there might be a better way. The companies have partnered to integrate an AI-based technology developed by Intel with Class, which runs on top of Zoom. Intel claims its system can detect whether students are bored, distracted or confused by assessing their facial expressions and how they’re interacting with educational content.

“We can give the teacher additional insights to allow them to better communicate,” said Michael Chasen, co-founder and CEO of Classroom Technologies, who said teachers have had trouble engaging with students in virtual classroom environments throughout the pandemic.

His company plans to test Intel’s student engagement analytics technology, which uses a computer’s camera and computer vision to capture images of students’ faces, then combines them with contextual information about what a student is working on at that moment to assess the student’s state of understanding. Intel’s partnership with Class is a research proof-of-concept undertaking, said Sinem Aslan, a research scientist at Intel who helped develop the technology.

“We are trying to enable one-on-one tutoring at scale,” said Aslan, adding that the system is intended to help teachers recognize when students need help and to inform how they might alter educational materials based on how students interact with the educational content. “High levels of boredom will lead [students to] completely zone out of educational content,” said Aslan.
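As described, the system fuses a face-derived state with context about what the student is doing to produce a teacher-facing signal. A minimal sketch of that kind of fusion logic is below; every state name, threshold and function here is invented for illustration, since Intel has not published its actual logic:

```python
def engagement_hint(expression_state: str, seconds_since_interaction: float) -> str:
    """Combine a face-derived state with interaction context into a
    coarse, teacher-facing hint. All labels and thresholds are
    hypothetical, chosen only to illustrate the idea of fusing
    two signals rather than relying on facial expression alone."""
    if expression_state == "confused":
        return "may need help with current material"
    if expression_state == "bored" and seconds_since_interaction > 120:
        # Boredom alone is ambiguous; pair it with a long gap in
        # interaction before surfacing anything to the teacher.
        return "may have disengaged from the content"
    return "no action suggested"
```

For example, a student whose expression reads as “bored” but who interacted with the material ten seconds ago would trigger no hint under this sketch.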

But critics argue that it is not possible to accurately determine whether someone is feeling bored, confused, happy or sad based on their facial expressions or other external signals.

Some researchers have found that because people express themselves through tens or hundreds of subtle and complex facial expressions, bodily gestures or physiological signals, categorizing their state with a single label is an ill-suited approach. Other research indicates that people communicate emotions such as anger, fear and surprise in ways that vary across cultures and situations, and how they express emotion can fluctuate on an individual level.

“Students have different ways of presenting what’s going on inside of them,” said Todd Richmond, a longtime educator and the director of the Tech and Narrative Lab and a professor at the Pardee RAND Graduate School. “That student being distracted at that moment in time may be the appropriate and necessary state for them in that moment in their life,” he said, if they’re dealing with personal issues, for example.

Controversial emotion AI seeps into everyday tech

The classroom is just one arena where controversial “emotion AI” is finding its way into everyday tech products and generating investor interest. It’s also seeping into delivery and passenger vehicles and virtual sales and customer service software. After Protocol's report last week on the use of this technology on sales calls, Fight for the Future launched a campaign urging Zoom not to adopt the technology in its near-ubiquitous video-conferencing software.

At this early stage, it’s not clear how Intel’s technology will be integrated with the Class software, said Chasen, who expects the company to partner with one of the colleges it already works with to evaluate the Intel system. Chasen told Protocol that Classroom Technologies is not paying Intel to test the technology. Class is backed by investors including NFL quarterback Tom Brady, AOL co-founder Steve Case and Salesforce Ventures.

Intel has established partnerships to help distribute other nascent forms of AI it has built. For example, the company has partnered with Purdue University and the soccer scouting app AiScout in hopes of productizing a system that turns data on joint and skeletal movements into analytics for monitoring and improving athletic performance.



Educators and advocacy groups have raised alarms regarding excessive student surveillance and privacy invasions associated with facial recognition deployed in schools for identification and security purposes. Those concerns have accelerated as AI-based software has been used more than ever during the pandemic, including technologies that monitor student behavior in hopes of preventing cheating during virtual testing and systems that track the content students view on their laptops in an effort to detect whether they are at risk of self-harm.

Class already tracks how often students raise their hands during a session, and offers a “proctor view” feature that lets teachers monitor what students are viewing on their computers if the students agree to share their desktop screen with instructors.

“I think we have to be very sensitive about people’s personal rights and not being overly intrusive with these systems,” said Chasen.

Cameras as a social-justice issue

As virtual class became the norm over the past couple of years, a debate emerged among educators over whether to require students to turn on their cameras during class. Today in Dancey’s English program, cameras are optional, in part because in virtual settings students can communicate with instructors through their microphones or via chat.

But in order to capture students’ facial expressions, Intel’s technology would need those cameras turned on.

“The thing about turning cameras on, it became almost like a social-justice issue,” Dancey said. Not only are some students concerned about others seeing where or how they live, but enabling the cameras drains power, which can be a problem for students using a mobile hotspot to connect for class, she said.

“It’s kind of an invasion of privacy, and there are accessibility issues, because having your camera on uses up a huge amount of bandwidth. That could literally be costing them money to do that,” Dancey said.


“Students shouldn’t have to police how they look in the classroom,” said Nandita Sampath, a policy analyst with Consumer Reports focused on algorithmic bias and accountability issues, who said she wondered whether students would have the ability to contest inaccurate results if Intel’s system leads to negative consequences. “What cognitive and emotional states do these companies claim they are able to assess or predict, and what is the accountability?” she said.

Aslan said the goal of Intel’s technology is not to surveil or penalize students, but rather to coach teachers and provide additional information so they can better understand when students need help. “We did not start this technology as a surveillance system. In fact, we don’t want this technology to be a surveillance system,” Aslan said.

Sampath said Intel’s technology could be used to judge or penalize students even if that is not the intent. “Maybe they might not intend for this to be the ultimate decision-maker, but this doesn’t mean the teacher or administrator can’t use it in that way,” she said.

Dancey said teachers worry about surveillance being used against them, too. “Often surveillance is used against instructors really unfairly,” she said. “I don’t think it would be paranoid to say, especially if it’s going to measure ‘student engagement’ — TM, in quotes — that if I go up for promotion or tenure, is that going to be part of my evaluation? Could they say, ‘So-and-so had a low comprehension quotient?’”

When Intel tested the system in a physical classroom setting, some teachers who participated in the study suggested it provided useful information. “I was able to witness how I could catch some emotional challenges of the students that I could not have anticipated [before],” said one teacher, according to a document provided by Intel.

But while some teachers may have found it helpful, Dancey said she would not want to use the Intel system. “I think most teachers, especially at the university level, would find this technology morally reprehensible, like the panopticon. Frankly, if my institution offered it to me, I would reject it, and if we were required to use it, I would think twice about continuing to work here,” she said.

AI data prep by psychologists


Four days after this story was published, Intel said it wanted to offer an additional statement about its emotion AI work and partnership with Classroom Technologies.

"Intel’s partnership to test the technology in the Class software at this stage is a research proof of concept. We have no near-term plans to productize the technology and are currently collaborating with Class to conduct further research to identify any socio-technical challenges or outcomes to refine and iterate on this proof of concept," said Mindy Nelson, an Intel representative, in an email.

At this early stage, Intel aims to find the best ways to implement the technology so it is most useful for teachers, Aslan said: “How do we make it in a way that it is aligned with what the teacher does on a daily basis?”


Intel developed its adaptive learning analytics system by incorporating data gathered from students in real-life classroom sessions using laptops with 3D cameras. To label the ground truth data used to train its algorithmic models, the researchers hired psychologists who viewed videos of the students and categorized the emotions they detected in their expressions.

“We don’t want to start with any assumptions. That’s why we hired the subject matter experts to label the data,” said Nese Alyuz Civitci, a machine-learning researcher at Intel. The researchers only used data when at least two of three labelers agreed how a student’s expressions should be categorized.
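The agreement filter Civitci describes — keeping a training sample only when at least two of the three psychologists assigned it the same label — can be sketched as follows. This is an illustrative reconstruction, not Intel’s actual pipeline, and the label names are invented:

```python
from collections import Counter

def majority_label(labels, min_agreement=2):
    """Return the most common label if at least `min_agreement`
    annotators chose it; otherwise None, and the sample is discarded."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count >= min_agreement else None

# Each sample carries one label from each of three annotators.
samples = [
    {"id": 1, "labels": ["bored", "bored", "confused"]},
    {"id": 2, "labels": ["bored", "engaged", "confused"]},  # no consensus
    {"id": 3, "labels": ["confused", "confused", "confused"]},
]

# Only samples with a two-of-three consensus survive into training data.
training_set = [
    (s["id"], majority_label(s["labels"]))
    for s in samples
    if majority_label(s["labels"]) is not None
]
# → keeps samples 1 and 3; sample 2 is dropped
```

Under this rule, the second sample above, where each annotator saw a different emotion, would never reach the model — a direct way of encoding how subtle and contestable these judgments are.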

“It was really interesting to see those emotions — the states are really subtle, they are really tiny differences,” Civitci said. “It was really hard for me to identify those differences.”

Rather than assessing Intel’s AI models on whether they accurately reflected the actual emotions of students, the researchers “positioned it as how instrumental or how much a teacher can trust the models,” Aslan said.

“I don’t think it’s tech that’s fully reached its maturity yet,” Chasen said regarding Intel’s system. “We need to see if the results are relevant to the performance of the students and see if we can’t get useful data for the instructors out of it. This is what we’re testing to find out.”

Ultimately, he said the Intel system will provide one piece of data that Classroom Technologies and its customers will combine with other signals to form a holistic assessment of students.

“There’s never one piece of data,” he said. He also suggested that the information revealed by the Intel technology should not be used on its own without context to judge a student’s performance, such as, “if the AI says they’re not paying attention and they have all straight As.”

This story was updated to clarify the status of Intel’s work with Classroom Technologies and to include an additional statement.
