There is significantly more concern about artificial intelligence in 2023 than there was in 2021, before the emergence of ChatGPT, according to a new poll.
According to nationally representative surveys conducted by the nonprofit research organisation Sentience Institute, only 23% of people trust AI companies to put safety over profits, and only 27% trust the creators of an AI to maintain control of current and future versions.
This distrust translates into widespread support for slowdowns and regulation, including 63% support for banning artificial general intelligence that is smarter than humans.
People also expect advanced AI to arrive very soon: the median estimate for when AI will have general intelligence is only two years from now, and just five years for human-level AI, sentient AI, and superintelligence.
The prospect of sentient AI is particularly daunting: 20% of people think that some AIs are already sentient; 10% think ChatGPT is sentient; and 69% support a ban on the development of sentient AIs. If AIs do become sentient, a surprisingly large number of people think we should take at least some steps to protect their welfare: 71% agree that sentient AIs deserve to be treated with respect, and 38% are in favour of legal rights.
"Based on preregistered predictions for multi-item measures in the survey, we found surprisingly high moral concern for sentient AI and a surprisingly high view of them as having a mind (i.e., mind perception)," Sentience Institute says.
"We also found significant increases from 2021 to 2023 in moral concern, mind perception, perceived threat, and support for banning sentience-related AI technologies. Two single-item measures also showed significantly shorter timelines for sentient AI from 2021 to 2023."
This provides landmark public opinion data from before to after 2022, a major year for AI in which people like Google engineer Blake Lemoine raised the possibility that current AIs may be sentient, and groundbreaking AI systems were launched such as Stable Diffusion and ChatGPT.
Additional results for the most recent 2023 data include:
- 71% support government regulation that slows AI development.
- 39% support a bill of rights that protects the well-being of sentient robots/AIs.
- 68% agree that we must not cause unnecessary suffering to large language models (LLMs), such as ChatGPT or Bard, if they develop the capacity to suffer.
- 20% of people think that some AIs are already sentient; 37% are not sure; and 43% say they are not.
- 10% of people say ChatGPT is sentient; 37% are not sure; and 53% say it is not.
- 23% trust AI companies to put safety over profits; 29% are not sure; and 49% do not.
- 27% trust the creators of an AI to maintain control of current and future versions; 27% are not sure; and 26% do not.
- 49% of people say the pace of AI development is too fast; 30% say it's fine; 19% say they're not sure; only 2% say it's too slow.
"Our unique time series data on what the general public thinks about AI rights and the risks that AI entails gives us a new tool to address their growing concerns," says Sentience Institute research fellow Janet Pauketat, who led the survey project.
"The public wants more safety and caution than we currently see in the AI industry, and they are not indifferent to a future that incorporates AI interests alongside human interests," she says.
Jacy Reese Anthis, co-founder of the Sentience Institute, adds, "We were very surprised by the moral concern people had for AIs and the extent to which they see them as having a mind.
"Whether or not any AIs today are actually sentient, the mere act of considering the possibility is already changing how we interact with machines," Anthis says.
"They are no longer just technological tools, but digital minds who are becoming part of our social fabric. We're conducting research to lay a foundation for understanding that new world and improving the future for all sentient life."
The data was collected in three nationally representative survey waves. A set of 86 questions was asked of 1,232 U.S. adults from November to December 2021 and of an independent sample of 1,169 adults from April to June 2023. A further 1,099 adults were asked 111 related questions in a supplemental survey from June to July 2023. Margins of error were approximately +/- 3 percentage points.