Three Universities, National Science Foundation Take on AI’s Trustworthiness

The Institute for Trustworthy AI in Law & Society will address trustworthiness and biases in AI by involving diverse groups of people in every step of creation and execution, a concept called participatory design. The effort is led by the University of Maryland, The George Washington University, and Morgan State University.
By Evan Castillo, Editor & Writer
Published on May 24, 2023
Edited by Darlene Earnest

  • The National Science Foundation granted the institute $20 million as one of seven new AI institutes.
  • TRAILS' mission encourages trustworthy AI systems by involving diverse stakeholders from creation to governance.
  • Lead director Hal Daumé said the institute plans to create and revitalize courses, expand undergraduate and graduate research opportunities, and offer boot camps and programs for regional high school students.

As artificial intelligence (AI) becomes more prevalent in our lives and in higher education, one institute is addressing its trustworthiness by involving diverse stakeholders from ideation to execution. Students at schools in the Institute for Trustworthy AI in Law & Society (TRAILS) can research and learn about participatory design in AI.

The National Science Foundation (NSF) announced seven new AI research institutes nationwide on May 4 — led by different higher education institutions — to harness opportunities and address risks with AI.

The NSF gave $20 million to fund TRAILS, whose mission is "... to encourage trustworthy artificial intelligence systems — iterations of AI that users, developers, and deployers see as accountable, responsible, and unbiased. However, the researchers at TRAILS believe that there is no trust or accountability in AI systems without participation of diverse stakeholders."

The lead institutions are the University of Maryland (UMD), The George Washington University (GW), and Morgan State University, one of the country's historically Black colleges and universities.

BestColleges spoke with Hal Daumé III, the lead director and principal investigator for TRAILS and a computer science professor at the University of Maryland, to find out what research and educational opportunities the institute will provide students.

What's Changing in AI

Daumé said that over the last 10 years, AI has increasingly, and often invisibly, affected our lives — from Spotify playlist algorithms and map directions to the healthcare and judicial systems.

He said that in the past year or two, there has been a surprising surge of public interest in direct interaction between people and AI systems.

"There's much more of a direct interaction between people and AI systems that's less mediated by some fancy Netflix interface that shows you the recommendations in some pretty world."

He said this direct connection raises questions people haven't thought about much.

TRAILS seeks to address whether AI systems deserve people's trust. Daumé said the belief is that people need to be involved at every stage of machine learning — from the initial idea to building the technology, evaluating its effectiveness, deploying it, and governing it.

The institute has come up with a set of methodologies and technologies that lets organizations enable participatory design in their own technology without the institute's help, spanning three main types of applications:

  • Dissemination-Type Systems: Systems like your social media feed, ChatGPT, or the AI used in language translation apps
  • Physical Systems: Robots like civilian quadcopter drones, with less focus on humanoid robots
  • Larger Physical Infrastructure: Energy grids mediated by AI to decide when to conserve energy and how to get power back up as quickly as possible in an emergency

Changing How to Teach AI

Daumé wants to start implementing TRAILS' mission in the classroom by having guest lecturers from the institute teach modules using TRAILS principles. But he doesn't just want to add to the universities' curricula; he wants to rebuild them.

Daumé teaches an undergrad AI course at UMD based on a course syllabus from his friends at the University of California, Berkeley.

"It's really great, but it was also made in 2007," he told BestColleges. "And I mean, I was giving my students practice exams, and I was giving them practice exams back from 2010. And to be honest, not much had changed in the course material."

Daumé said TRAILS allows him to modernize introductory undergraduate courses, particularly in AI and machine learning. Those courses have included minimal discussion of AI's impact on society or fairness in automated systems; those topics are mostly relegated to classes like a computer science ethics course.

"My view is that the right way to do this is to integrate these questions into the basic material rather than having it as an add-on," he said.

Once he revamps the curriculum with technology and ethics updates, he wants to share the course worldwide as Berkeley did with him.

There will also be plenty of undergraduate research opportunities across the partner institutions, targeted at students who want to start earlier in their academic careers.

Daumé said, from his experience, undergrads don't get involved in research until the summer before their senior year. While that's not too late, if students get involved earlier, they can determine if research is right for them and build a stronger portfolio.

The TRAILS team plans to have a shared research program between UMD, Morgan State, and GW that they hope to broaden nationally after the first few years.

The plan at the graduate level is "to try to train students to be good interdisciplinary researchers."

Graduate students can soon apply for fellowships supervised by faculty across multiple disciplines. Daumé gave a few examples: one could be an AI expert, one a governance expert, one a social scientist, and one a computer science expert.

The Pillars of Participatory Design

Daumé gave four ways TRAILS engages in participatory design.

Education and Workforce Development

Daumé said many people who have worked in AI and data science over the past decade weren't trained as computer scientists or statisticians as undergrads. TRAILS plans to reskill those workers through boot camps while also addressing AI and data science education at the undergraduate level.

Broadening Participation

TRAILS wants to put more people in the field who don't look like everyone already there.

"One of our big hopes there is that by doing things at a participatory level, we can excite a lot of people about AI and involve a lot of people about AI that have been historically excluded from AI design development and so on."

Outreach

TRAILS is partnering with the Planet Word Museum in Washington, D.C., to broaden computing outreach and create summer research boot camps, AI summer programs for regional high school students, and student hackathons.

How the NSF Chose UMD, GW, and Morgan State

UMD proposed an idea to the NSF a few years ago, but the foundation turned it down. The following year, reapplying was too much work for Daumé to take on again.

By the third year, he was on the fence about whether to apply. But David Broniatowski, the TRAILS site lead at GW, reached out to propose a joint application with Morgan State.

First, they had to figure out their story, which became participation in AI. They then advanced to a "reverse site visit," a daylong interview with questions sent ahead to the interviewees.

"That was one of the more intense four or five hours of my life," Daumé told BestColleges. "And at the end of the day, it's like, OK, you have four hours to convince these people to give you 20 million, and it feels like everything you say is super-consequential."

Daumé believes the NSF chose to grant them $20 million, rather than giving individual faculty or programs smaller grants, because the foundation sees the institute as a whole that is greater than the sum of its parts.

He believes the NSF is looking for the institute to grow beyond the five years of funding and be a collaborative effort instead of a series of individual projects.

"Now we just have to figure out how to make it actually work," he said. "So everything's a learning experience, but I think that's why they care so much that it looks like we can work well together. Because I think they know that not everything is going to work as planned.

"When things don't occur according to plan, you can recover from that in a way that everyone will be on board with."