
Study Finds Consumers Conflicted on Use of Artificial Intelligence

Research highlights confidence in AI to improve customer experience even as broader trust issues linger

LAS VEGAS – June 13, 2023 – Consumers have confidence in artificial intelligence (AI) as a tool for transforming their customer experiences but have growing concerns over AI's increased prevalence in other disciplines, according to new research from Pegasystems Inc. (NASDAQ: PEGA), the low-code platform provider empowering the world's leading enterprises to Build for Change®. The study, conducted by research firm Savanta and unveiled at PegaWorld® iNspire, the company's annual conference in Las Vegas, surveyed 5,000 consumers worldwide on their views around AI, its continued evolution, and the ways in which they interact with the technology.

It found a general acceptance of AI in areas relating to customer experience, with two thirds (67%) of respondents agreeing AI has the potential to improve the customer service of businesses they interact with, and more than half (54%) saying companies using AI will be more likely to offer better benefits to customers than businesses that do not. Meanwhile, nearly half of respondents (47%) indicated they are comfortable interacting with well-tested AI services from businesses, and nearly two thirds (64%) said they expect most major departments within organizations will be run using AI and automation within the next ten years.

Despite this, the research also highlighted a major lack of trust in AI in several areas, including:

  • A preference for people: Despite demonstrating an appetite for the use of AI in customer engagement, 71% of respondents said they still prefer to interact with a human being rather than with AI. A majority (68%) also said they would trust a human bank employee more than an AI solution to make an objective, unbiased decision about whether to grant a bank loan, while nearly three quarters (74%) admitted they would rather trust a medical diagnosis from a human doctor than one made by an AI with a better track record of being right but which could not demonstrate or explain how it arrived at its decision. Meanwhile, despite 51% saying they think an autonomous car is capable of making a more ethical decision than a human driver might when trying to avoid a crash, 65% agreed that AI should not be allowed to overrule a human driver in such a situation.
  • The rise of the machines: The vast majority (86%) of respondents said they feel AI is capable of evolving itself to behave amorally, with more than a quarter (27%) saying they think this has already happened. Almost half (48%) said it was likely that generative AI will eventually become sentient or self-conscious. Almost a third (30%) said they were concerned about AI enslaving humanity – a small increase from the 27% who said the same in a similar study conducted in 2019. Only 16% said they had no concerns over AI whatsoever.
  • Reality check: While the study points to an increased general awareness of AI as a tool for everyday use – more than half of respondents said they think the technology is now responsible for producing more than half of all photos (55%) and videos (55%) they consume – concerns are building over how challenging it is to tell what’s real from what’s fake, with the majority indicating some level of difficulty in determining whether content has been generated by humans or AI. Nearly two-thirds (63%) indicated they couldn’t tell whether a long-form article had been generated by AI or a human, while a similar number said the same about photos (59%) and videos (58%). More than half (56%) said it was difficult to tell if AI had generated TV reports they consume.

Quotes & Commentary:

“As applications like Midjourney and ChatGPT bring AI to the masses, it’s no surprise that we’re seeing a degree of conflict. Let’s not forget, many people are already accepting the benefits this technology can bring; after all, asking Alexa or Siri a question is nothing new for most consumers,” said Dr. Rob Walker, general manager, 1:1 customer engagement, Pega. “However, it’s also perhaps inevitable that as the spotlight on this technology intensifies, so does the level of fear and uncertainty around some of the more science-fiction influenced ‘doomsday scenarios’ surrounding it. As these concerns grow, the need for organizations to demonstrate greater transparency in the outcomes their AI systems produce, and to perform ethical bias tests to check how the AI ‘behaves’ at all times, becomes clear.

“What we’re seeing is that while people seem more comfortable than ever using AI, they’d rather hold it at arm’s length when it comes to dealing with big, impactful decisions. Consumers are still expressing a strong desire to retain human interactions as a key part of the way they interact with organizations. What this tells us is that people are important, and that consumers want human beings in the loop at all times. The best way to embrace technologies like AI is to use them to supplement and augment existing human skills. Businesses that can do this effectively will be able to reap the benefits, keep their customers happy, and maximize their productivity.”

Notes

Pega surveyed 5,000 global consumers on their views on artificial intelligence. The results included responses from the United States, the United Kingdom, France, Australia, and Japan.



Lisa Pintchman
VP, Corporate Communications
[email protected]
+1 617-866-6022

North America

Sean Audet
Director, Corporate Communications
[email protected]
+1 617-528-5230

Ilena Ryan
Sr. Manager, Public Relations
[email protected]
+1 617-866-6722

Europe

Joanna Richardson
Director, Corporate Communications
[email protected]
+44 (0) 118 9651 660

Jon Brigden
PR & Communications Manager
[email protected]
+44 (0) 118 9398 584
