Artificial intelligence is truly all around us. And we’ve become so accustomed to using it that we barely even notice it anymore. It’s there when we ask our smart speakers a question, it’s how we get recommendations on the next song to play, and it’s what helps filter spam from our inboxes. But as AI plays a larger role in how we live and work, so do its inherent biases.
A recent survey by Borealis AI, RBC’s AI research institute, found that 88% of companies believe bias exists in their organization. Yet almost half (44%) don’t understand the ethical challenges that bias presents in AI.
A surge in data use and biometrics during the COVID-19 pandemic has only heightened the need to examine these questions.
“AI amplifies the power imbalance in the people and organizations that are producing data on the population versus the population’s ability to understand that they’re being watched and surveilled,” said Ruha Benjamin, a sociologist and associate professor of African American studies at Princeton University, who was a guest on a recent Disruptors podcast about the ethics of AI.
So what do we need to think about, as we grow more dependent on artificial intelligence?
AI amplifies bias
There is a significant human element to AI. The technology changes the ways we live and work, but we also shape it. It is humans who write the algorithms, after all, and AI-generated outcomes are only as good as the data that goes in. For instance, some of the original facial recognition systems and algorithms contained ethnic bias. They performed far less accurately on non-white faces, due largely to input data that skewed toward white males. According to a recent BBC investigation, photos of women with the darkest skin were four times more likely to be graded “poor quality” than those of women with the lightest skin.
“AI makes bad things really bad very quickly. And that’s the risk that we have to manage and mediate,” said Saadia Muzzafar, a Canadian entrepreneur, author and the founder of Tech Girls Canada, a nonprofit created to promote women in science, technology, engineering and math.
The outcomes raise questions about what it means to be human and what our relationship with these machines should be. The algorithms don’t care who we are: when done properly, AI can help capture a wide range of global perspectives, but only if quality and equity are built into the data.
Responsible AI can exist
Many Canadian companies are uncertain about how to use AI responsibly. A recent report from RBC Thought Leadership on the use of facial recognition technology found that six in 10 Canadian businesses feel AI is mostly for larger organizations.
But AI’s uses are increasingly widespread. Regardless of the size of the task, businesses developing AI should avoid working in isolation. Canada is home to world leaders in developing ethical AI. It’s here that the Montreal Declaration for the Responsible Development of AI was signed, the Privacy by Design certification was developed, and CIFAR’s AI & Society program was born. In this spirit, RBC and Borealis AI have launched RESPECT AI, a hub for firms to gain practical solutions for the responsible adoption of AI.
“In Canada we can set the bar higher – and hold ourselves to a higher standard. We need to look at this as the long game,” said Muzzafar.
We humans do, in fact, know how to control this technology – AI is working as it’s designed. We just need to design it better.
This article is intended as general information only and is not to be relied upon as constituting legal, financial or other professional advice. A professional advisor should be consulted regarding your specific situation. Information presented is believed to be factual and up-to-date but we do not guarantee its accuracy and it should not be regarded as a complete analysis of the subjects discussed. All expressions of opinion reflect the judgment of the authors as of the date of publication and are subject to change. No endorsement of any third parties or their advice, opinions, information, products or services is expressly given or implied by Royal Bank of Canada or any of its affiliates.