Chatbots Can Be Trained to Represent Humans after Death: Interview with ChatGPT

16 Pages · Posted: 7 Apr 2023

Kadir Uludag

Shanghai Jiao Tong University (SJTU) - Shanghai Mental Health Center

Date Written: March 25, 2023

Abstract

Background: Chatbots such as ChatGPT simulate conversations with human users. Previously, chess bots have been trained to represent individual chess players, and with advances in natural language processing, chatbots have become capable of replicating human-like conversation. One area of interest is training a chatbot to represent a particular person by mimicking that person's communication. Such a chatbot could be used to ease people's sorrow after that person's death.
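The person-specific mimicry described above could, in principle, be approximated by fine-tuning a small causal language model on a person's message history. The sketch below is a minimal, hypothetical illustration using the Hugging Face transformers and datasets libraries; the file messages.txt, the gpt2 base model, and all hyperparameters are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch: fine-tune a small causal language model on one
# person's message history so its replies echo that person's style.
# Assumes a plain-text file "messages.txt" (one message per line);
# the file, base model, and hyperparameters are illustrative only.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Load the message history and tokenize each line as a training example.
dataset = load_dataset("text", data_files={"train": "messages.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="persona-model",
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    # mlm=False gives standard next-token (causal) language modeling.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("persona-model")
```

A model trained this way imitates surface style only; whether the result meaningfully "represents" the person is precisely the ethical question the paper raises.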

Methods: An interview was conducted with ChatGPT to investigate the topic of training chatbots to represent humans. We then provided ChatGPT with a personal message so that it could respond to it accordingly.
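As a rough sketch of how such an interview could be run programmatically, the following example sends a fixed list of questions to an OpenAI chat model while keeping the running conversation, so each answer sees the prior exchange. The model name, system prompt, and questions are illustrative assumptions; the paper does not state how its interview was conducted.

```python
# Hypothetical sketch: conduct a structured interview with a chat model
# via the official OpenAI Python client (v1 API). The model name and
# questions are illustrative assumptions, not the paper's actual setup.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

questions = [
    "Can chatbots be trained to represent a specific person after death?",
    "What ethical concerns arise from using personal communication data?",
]

# Keep the full conversation so each answer has the prior context.
messages = [{"role": "system",
             "content": "You are being interviewed for a research paper."}]
for q in questions:
    messages.append({"role": "user", "content": q})
    reply = client.chat.completions.create(model="gpt-3.5-turbo",
                                           messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    print(f"Q: {q}\nA: {answer}\n")
```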

Results: ChatGPT provided insights into the potential of training chatbots to represent humans. However, it also raised critical ethical considerations like privacy, consent, and deception.

Conclusion: The interview suggests that chatbots could be trained to represent specific individuals. Evaluating the ethical implications of using personal communication data to train them is vital.

Keywords: ChatGPT, human psychology, mimicking humans, mimicking writing

Suggested Citation

Uludag, Kadir, Chatbots Can Be Trained to Represent Humans after Death: Interview with ChatGPT (March 25, 2023). Available at SSRN: https://ssrn.com/abstract=4399732 or http://dx.doi.org/10.2139/ssrn.4399732

Kadir Uludag (Contact Author)

Shanghai Jiao Tong University (SJTU) - Shanghai Mental Health Center
