Workplace

The rise of tech ethicists shows how the industry is changing

Though the job titles are new, the ways to attract new talent are virtually the same.

Rebekah Tweed first became interested in tech ethics as a music reporter. She was covering a Taylor Swift concert, and the artist had inadvertently sparked controversy by using facial recognition technology to scan crowds for stalkers.

“It struck me that they weren’t fully thinking through the implications of some of these technologies,” she explained. “So I started digging.”

What she found was that she was a few years behind the trend. Researchers and academics had been analyzing technology through an ethics lens for decades, and they began reaching beyond their siloed fields in a significant way after the 2016 presidential election.

In 2022, “responsible tech” is a career path. Job titles range from “trust and safety officer” to “policy lead.” And several organizations and academic institutions are engaged in ecosystem-mapping projects to define which academic programs best prepare students to work in the field, how the jobs are described and what companies are pursuing ethical tech in earnest.

“There's a lot of appetite for this, especially as the public has become very aware of highly publicized problems with technology,” said Tweed, now the program director for All Tech is Human. “I see that continuing to grow for the foreseeable future.”

Agreeing upon a common nomenclature has been one of the first hurdles for experts defining the field. “Responsible tech” is meant to emphasize that technology should be responsive to the needs of users. New America calls the field “public interest technology.” Cal Poly San Luis Obispo uses the phrase “tech ethics” in the context of academic programming it calls “The Ethical Tech Project” (no, not as a callout to that 2019 “Silicon Valley” episode).

“There’s a lot of similar people interested in similar types of work but in different sectors,” said Tweed. “There just wasn’t really a common nomenclature.”

One of the core organizations defining the space is New America and its Public Interest Technology University Network, a collaboration among 48 institutions with academic programming relevant to the subject. Through the network, New America has invested more than $11.6 million in projects that help build clear career pipelines. The goal is to organize the most urgent ethical problems in tech and identify the skills needed to solve them. To that end, Stanford launched a “Public Interest Technology Career Taxonomy Project,” in which student researchers index the words used in job descriptions for use by LinkedIn and hiring managers.

The Stanford project is based partly on previous research that defined and developed a pipeline for “green” jobs by finding the keywords job searchers would use. The project is part of a larger, student-run university research program called the PIT Lab, which helps prepare students for careers in the field by hosting panels and discussions and by connecting students with relevant fellowship opportunities.

Andreen Soley, director of the PIT program at New America, said the member network demonstrates how a defining element of the field is a focus on interdisciplinary expertise. “This is not just the purview of computer scientists,” she said. “One part of our conceptualization is to say you might need a sociologist to be a part of this conversation, or someone from another field.”

Notably, New America doesn’t have a perfect record when it comes to working with tech. The foundation has received at least $21 million from Google since its inception, and Eric Schmidt, who was chairman of New America until 2016, simultaneously served as executive chairman of Alphabet, a role he held from 2015 to 2018. Google has allegedly used this arrangement to exert influence over the organization, according to The New York Times. Both the company and New America have denied that there is an improper relationship.

Ethical Tech @ Cal Poly, led by professors Deb Donig and Matthew Harsh, also intends to map the responsible tech ecosystem, but specifically through the frame of linking academia to practice. The university as a whole follows an ethos of “learning by doing,” Donig explained, and practically speaking, there isn’t just a forthcoming generation of new tech ethicists: There’s an entirely new industry on the horizon.

“We believe this is a new profession, not a new workforce,” she said. “There should be students who are trained, as an outcome of their graduation, to be ethical technologists.”

One element of the program is a course Donig teaches, called “Technically Human.” The course is listed as part of the English Department, but students of all majors are invited to participate. It focuses on the stories people tell about technology, whether nonfiction narratives like the rise and fall of ex-Uber CEO Travis Kalanick and ex-WeWork CEO Adam Neumann or science-fiction stories like “2001: A Space Odyssey.” These stories, she said, guide students' understanding of where the ethical issues in technology lie, who is responsible for them and how technologists can find creative ways to build better tech — all skills she believes are necessary to pursue a career in ethical tech.

Though the job titles are new, the ways to attract new talent haven’t changed. New America sponsored a career fair as part of a larger online convention last October, with the help of Tweed, professors Mona Sloane and Matthew Statler at NYU, and several other academic, governmental and nonprofit groups. The career fair portion was decentralized and used the job-search platform Handshake to connect students across the country with employers actively recruiting on their campuses. Tweed is also working on another career fair, taking place this May through a collaboration with Stanford, Pepperdine, the University of Washington and New America.

Ethical tech jobs currently exist at small startups dedicated to the field and at giant tech companies that build the tools we use every day. Deciding which companies are serious about ethics, though, is easier said than done. Many of the biggest tech companies have dedicated research teams focused on the impacts of their products, yet are also responsible for many of the scandals that most concern tech ethicists. Last summer, Google announced it would expand its AI ethics research department to 200 people — after messily firing two of its most esteemed researchers for publishing a paper that found major flaws in the company’s language processing models.

Whether young technologists should take jobs like those weighs heavily on the minds of some experts in the space. “I am very concerned to send my ethically educated, diverse, extremely energetic students into abusive and oppressive workplaces,” said Sloane, one of the two NYU professors who helped organize the October job fair.

Sloane and Statler said more students than ever are interested in entering the burgeoning field. And though students with debt may have less flexibility in choosing a job than those with more financial freedom, fewer students are having to make that choice. “Top talent will go to places that take these issues seriously,” she said. “This is a talent attraction and retention issue, bottom line.”

Through her work with All Tech is Human, Tweed has created a running Responsible Tech Job Board, which lists positions at companies as large and well-known as Google and as niche as senior research roles at issue-specific think tanks. It’s a simple, continuously updated spreadsheet with more than 300 roles currently listed.

With the help of All Tech is Human’s founder David Ryan Polgar, Tweed has settled on three core criteria for positions she lists on the job board: The roles focus on reducing the harms of technology, diversifying the tech pipeline or aligning new tech with the public interest. Still, she said, some companies play tricks. She declined to name names, but said it’s important to note a company’s historical record in the field and to watch out for job descriptions where responsible tech language looks misused or doesn’t quite match the role’s core responsibilities.

“Let me just say that some companies are catching on to the fact that there is a large community of people who care about a responsible tech focus, and care about building products that are responsible and beneficial to society,” said Tweed. “I’ve noticed that, over time, I’m having to be a little more judicious in determining what a role actually does or researching whether this company actually cares about the issues.” Often, she said, the answers aren’t black and white.

There are already many students and young professionals eager to pursue a career in public interest tech, and plenty of research institutes and academic programs building a bank of knowledge for the sector. But the biggest gap, Tweed said, is in connecting those aspiring professionals, and the information they’ve learned, to applicable use cases in the real world.

When she decided to pivot into the responsible tech industry in 2018, she “had trouble even tracking down these roles, much less finding a good fit,” she said. “Now there’s so many jobs that could fit on the job board that I’m not even able to include everything.”
