When Katrina Ingram talks to non-experts about artificial intelligence, she often has to clear up one big misconception—call it the Terminator fallacy.
“Hollywood’s done a fantastic job of creating this idea that AI is almost human-like and sentient,” she says. And, she adds, “often with an evil agenda.”
The so-called “narrow AI” at work in the world today is, frankly, more boring than Hollywood’s killer robots. It takes the form of algorithms that decide which of your email messages are spam or sift through job applicants to choose promising candidates. These systems typically use huge datasets to “learn” how to make choices, and over time improve their performance by incorporating more data. But as these systems become more sophisticated—and as more businesses, governments and other organizations rely on AI for more and more tasks—the ethical quandaries surrounding the technology are piling up. “A lot of AI systems are being used to make predictions about people, and that can cause harm in all kinds of ways,” says Ms. Ingram.
As founder and CEO of Edmonton-based company Ethically Aligned AI, Ms. Ingram spends much of her time consulting with organizations on how to foreground ethical considerations when designing and deploying artificial intelligence. It’s also why she helped design Canada’s first university microcredential focused on AI ethics—a four-course certificate program offered by Athabasca University.
According to Ms. Ingram, this kind of training should be foundational for any professional working in digital systems. Yet the Athabasca program is the first of its kind in Canada, and one of relatively few worldwide.
“People doing computer science degrees, of any sort, need ethical training,” she says. “That’s one audience that needs to be served… the other big audience is people who are already working professionally, all the people working in companies, designing these systems. They need training too, and the microcredential program is flexible, not too time-consuming and works for them.”
The courses were co-designed with Trystan Goetze, a philosopher and ethicist currently completing his postdoctoral fellowship in computer ethics at Harvard University.
“The technology has gotten ahead of the policy thinking, and it’s gotten ahead of the humanistic thinking on the subject,” says Dr. Goetze. “Computers are not like other kinds of technology, where ethical issues that come up are very narrow in scope. We don’t talk about automobile ethics, for example. But we do talk about computer or AI ethics, because this technology can be applied in almost any aspect of society, business, you name it.”
What kinds of ethical issues are at stake? Bias is a major one, says Dr. Goetze. AI can reaffirm and even exacerbate existing prejudices. Consider an AI system tasked with choosing promising job applicants, which bases its decisions on a dataset of previous hires. If historic hiring practices were flawed or biased against particular candidates, those biases will be baked into the AI as well.
Another consideration is how data used by AI systems is collected – i.e., is it scraped from social media profiles or other online sources in a consensual and privacy-compliant way? And then there’s “robo-ethics,” including concerns around misuse of facial recognition technology, or safety issues caused by the testing of AI-powered autonomous vehicles on public streets.
Alongside academic thinkers, the courses will include interviews and contributions from activists, professionals and business leaders, who can bring different facets of the subject to light.
“[These issues are] nothing new for technologists,” says Dr. Goetze. “But for a business leader this could be completely new information.”
Beyond the content of the courses, Ms. Ingram and Dr. Goetze are both pleased with another aspect of them: they look good.
“Athabasca has put a great deal of creative effort into these courses,” says Dr. Goetze. “They’re visually designed in a beautiful way, with video and animations. It’s not something that looks like it was slapped together in haste during the last lockdown, which I think is a testament to the fact that when we have the time and resources, we can produce online education experiences that are truly special.”