A two-hour conversation with an AI model may be enough to replicate a person's personality, yet such advances raise serious ethical questions and concerns about potential misuse.
Researchers from Google and Stanford University have demonstrated that just a two-hour conversation with an artificial intelligence model can create a strikingly accurate replica of a person's personality. Published on November 15 in the arXiv preprint database, the study presents "simulation agents" - AI models designed to mimic human behavior with remarkable accuracy.
Led by Joon Sung Park, a PhD student in computer science at Stanford, the research involved in-depth interviews with 1,052 participants. The interviews covered personal stories, values, and opinions on societal issues, forming the dataset used to train the generative AI models. The participant group was intentionally diverse in age, gender, race, region, education, and political ideology, ensuring broad representation of human experience.
To assess accuracy, participants completed a battery of personality tests, social surveys, and logic games, then repeated the same battery after a two-week break. The AI replicas went through the identical tests, mirroring the responses of their human counterparts with an astonishing 85 percent accuracy rate.
"If you can get a bunch of little 'you's' moving around and actually making the decisions that you would make - that, I think, is ultimately the future," Park told MIT Technology Review.
The researchers envision these artificial intelligence models revolutionizing scientific research by simulating human behavior in controlled environments. Applications could range from evaluating public health policies to assessing reactions to public events or product launches. They argue that such simulations offer a way to test interventions and theories without the ethical and logistical difficulties associated with using human participants.
However, these findings should be approached with a great deal of skepticism. Although the AI clones are excellent at replicating responses to personality surveys and social-attitude questions, they are significantly less accurate at predicting behavior in interactive economic decision-making games. The discrepancy highlights the challenges AI still faces in tasks that require understanding complex social dynamics and contextual nuance.
The evaluation methods used to verify the accuracy of the AI agents are also relatively rudimentary. Instruments such as the General Social Survey and assessments of the Big Five personality traits, while standard in social science research, cannot fully capture the complex layers of human personality and behavior.
Ethical issues further complicate the technology's implications. In an era when AI and deepfake technologies are already being used to manipulate and deceive, the introduction of highly personalized AI replicas is cause for concern. Such tools could be weaponized, heightening risks to privacy and trust.
Despite these reservations, the study opens compelling opportunities for future research, notes John Horton, an associate professor at the MIT Sloan School of Management.
"This paper shows how one can do a kind of hybrid: use real people to generate personalities that can then be used programmatically/simulatively in ways that could not be used with real people," he says.
The effectiveness of the interviewing process in capturing individual nuance is particularly striking. Park emphasized the depth of insights that a two-hour conversation can provide, citing his experience with podcast interviews.
"Imagine someone just had cancer, but last year was finally cured. That's very unique information for you that says a lot about how you might behave and think about things," he said.
This innovation has piqued the interest of companies already developing digital twin technology. Hassan Raza, CEO of Tavus - a company that specializes in creating AI replicas from customer data - expressed enthusiasm for this streamlined approach.
"How about just talking to an AI interviewer for 30 minutes today, 30 minutes tomorrow? And then we'll use that to create your digital twin." he says. | BGNES