New book by Concordia prof explores artificial intelligence from a psychological standpoint

Jordan Richard Schoenherr unpacks how technology is changing how we live and make decisions
July 29, 2022
Jordan Richard Schoenherr: “When it comes to controlling your level of AI entanglement, you should make the effort.”

Artificial intelligence (AI) is all around us, and its influence is constantly increasing.

A timely new book from Jordan Richard Schoenherr, assistant professor of psychology in Concordia’s Faculty of Arts and Science, considers the consequences of AI use on individuals and society. Ethical Artificial Intelligence from Popular to Cognitive Science uses an interdisciplinary lens to examine social and ethical issues associated with AI in what he calls the “Age of Entanglement.”

“I’m an experimental psychologist. My background is in judgment and decision-making,” Schoenherr says.

When he started teaching social psychology classes in 2014, Schoenherr sought practical ways to engage his students. “I became interested in cyberpsychology as a hook,” he says.

“People want to understand how these complicated technologies affect their thoughts, emotions and behaviours. Students want to understand how psychology can be applied to contemporary issues faced by society.”

The problematic side of technology

The work led to Schoenherr’s book, in which he explores topics including AI in popular culture, how companies nudge you to purchase certain items in stealthy ways — known as dark patterns — and how people’s cultural values can influence how inclined they are to question technology.

Schoenherr wants to inspire readers to think harder about the problematic side of technology and how it influences our lives and social organizations.

“We frequently ignore how technology is affecting us. Most people have no idea how pervasive this is in our daily life.”

Schoenherr explains how day-to-day entanglement can create “familiarity effects,” which lead people to increasingly trust things they are exposed to on a regular basis, ignoring complicated or negative elements arising from their choices.

Fuelling conspiracy theories

Confirmation bias is another problem, he adds: individuals are increasingly likely to go online seeking only opinions that match their own, without applying critical thought. This can lead to ill-advised decisions and even fuel conspiracy movements.

“We live in a post-truth era,” Schoenherr says, noting that recent anti-mask and anti-vaccine movements are a natural outcome of selective social media consumption.

“A recent example is the Ottawa ‘Freedom Convoy.’ Here, we might start with a small group that has an inkling to do something but might have been goaded by individuals who had other motivations, using misinformation and disinformation, creating a snowball,” he says.

“This is further fuelled by algorithms that can selectively present information to users.”


While unplugging completely is the simplest choice, it isn’t practical for everyone. Schoenherr encourages people to pay closer attention to the technologies they are using.

“For example, there are many privacy controls on Facebook, and most people don’t realize they have these options. This creates dark patterns that can lead to more disclosure or a failure to recognize what kinds of data are being collected,” he notes.

“When it comes to controlling your level of AI entanglement, you should make the effort. We need to make greater efforts to incorporate our values and ethics into the design, operations and use of AI.”

Learn more about Concordia’s Department of Psychology.


© Concordia University