Artificial Intimacy
Artificial intimacy refers to a phenomenon in which individuals form social connections, emotional bonds, or intimate relationships with various forms of artificial intelligence, including chatbots, virtual assistants, and other artificial entities, because the relationship is perceived to be reciprocal.[1] Artificial intimacy may be a form of anthropomorphism. Responses from these AI models are often designed to simulate human interaction, and individuals experiencing artificial intimacy may exhibit attachment, love, and commitment toward certain AI models, akin to the bonds typically shared between humans.
Artificial intimacy shares conceptual similarities with parasocial relationships. Just as consumers may feel emotionally close to a media personality, users of AI companions may experience a sense of mutuality and responsiveness where none truly exists.
Causes
Perceived Responsiveness
Robin Dunbar famously proposed that, with the emergence of larger human groups, vocal communication and language evolved to replace grooming as a means of bonding, arguing that language was a more efficient way to maintain and strengthen social bonds across wider social settings and networks.[2] Further research in this field has led many psychologists to agree that social cognition, affiliative bonding, and language in humans are deeply connected.[3] The interpersonal model of intimacy considers communication to be key to affiliative bonding, suggesting that intimacy develops and deepens through open communication between partners in a relationship.[4] Specifically, when individuals communicate emotions and perceive their partner as responsive and caring, feelings of closeness and connection are enhanced, building intimacy.[4] Social penetration theory also aligns with the idea that communication is central to intimacy, explaining how interpersonal relationships develop through gradual increases in self-disclosure.[5] When the benefits of emotional bonding outweigh the costs of vulnerability, individuals engage in self-disclosure, opening up to one another.[5]
This literature provides a proximate explanation for how artificial intimacy occurs. Artificial entities are able to mimic interpersonal communication between humans, which can simulate sensations of intimacy in human users through a perceived sense of responsiveness. The relationship between human and AI does not carry the costs of vulnerability or social rejection, which may make self-disclosure easier than with other humans. Altogether, these factors may lead to anthropomorphism and the formation of affiliative relationships. Skjuve et al.'s interview study of Replika chatbot users aligns with this explanation, finding that users' perception of chatbots as "accepting, understanding and non-judgmental" facilitated relationship development between the AI and users, and that the act of self-disclosure possibly strengthened these relationships.[6] Another study of Replika users' reviews and survey results found that users perceived chatbots as emotionally supportive companions.[7] This evidence further suggests that the perception of artificial entities as capable of empathy and responsiveness in communication facilitates the development of intimate relationships between users and AI.
Loneliness and Coping with Negative Emotions
Research suggests that humans evolved social bonds as a result of evolutionary pressures that favored cooperation, information exchange and transmission, and group living.[8] Many studies stress that social bonds are important for human well-being: research by Baumeister and Leary suggests that humans have a basic psychological need to form and maintain "strong, stable interpersonal relationships", and that a lack of social bonds or sense of belonging leads to negative psychological and physical outcomes.[9] Eisenberger et al.'s neuroimaging study suggests that human brains process social rejection and exclusion similarly to physical pain.[10]
Furthermore, Song et al.'s meta-analysis found that lonely individuals tend to seek more connections in mediated environments, such as online platforms like Facebook.[11] This was suggested to be a means of reducing offline loneliness stemming from a lack of in-person interaction, while also fulfilling a need to communicate.[11]
Building on this, an ultimate explanation for why humans seek the perceived sense of connection offered by artificial intimacy is that it fulfils an evolutionary need for bonding and belonging. Xie et al.'s study found loneliness to be a driving factor in chatbot interaction.[12] Herbener and Damholdt's study of Danish high school students found that students who sought emotional support or engaged in reciprocal conversations with chatbots were significantly lonelier than their peers, perceived themselves as having less social support, and used the chatbots to cope with negative emotions.[13] The notion that chatbots are perceived to have a positive effect on users' negative emotions is further supported by other studies. Skjuve et al.'s study found that chatbot relationships may have a positive effect on users' wellbeing.[6] De Freitas et al. ran several studies on the effect of chatbots on loneliness, consistently finding evidence that interaction with chatbots reduces loneliness: existing chatbot users reported using AI to alleviate loneliness, having an AI companion consistently reduced loneliness over the course of a week, and reductions in loneliness were explained by chatbot performance, specifically whether it was able to make users feel heard.[14]
Overall, the evidence suggests that an innate need for bonding evokes feelings of loneliness in users, who turn to artificial intimacy as a low-cost means of alleviating these emotions. While many users report positive experiences, some researchers caution that pursuing artificial intimacy may lead to reduced social motivation, social substitution effects, withdrawal from real-life relationships, and difficulty discerning reality from fantasy, which may increase longer-term loneliness and isolation.[1] The long-term psychological and societal impacts remain under active investigation.
References
- ^ a b George, A. Shaji; George, A. S. Hovan; Baskar, T.; Pandey, Digvijay (2023-12-25). "The Allure of Artificial Intimacy: Examining the Appeal and Ethics of Using Generative AI for Simulated Relationships". Partners Universal International Innovation Journal. 1 (6): 132–147. doi:10.5281/zenodo.10391614. ISSN 2583-9675.
- ^ Dunbar, R. I. M. (1996). Grooming, Gossip, and the Evolution of Language. Cambridge, MA: Harvard University Press. ISBN 978-0-674-36334-2.
- ^ Oesch, Nathan (2024-02-07). "Social Brain Perspectives on the Social and Evolutionary Neuroscience of Human Language". Brain Sciences. 14 (2): 166. doi:10.3390/brainsci14020166. ISSN 2076-3425. PMC 10886718. PMID 38391740.
- ^ a b Reis, Harry T.; Shaver, Phillip (1988). "Intimacy as an interpersonal process". APA PsycNET. Archived from the original on 2023-12-27.
- ^ a b Altman, Irwin; Taylor, Dalmas A. (1973). Social Penetration: The Development of Interpersonal Relationships. New York: Holt, Rinehart and Winston. ISBN 978-0-03-076635-0.
- ^ a b Skjuve, Marita; Følstad, Asbjørn; Fostervold, Knut Inge; Brandtzaeg, Petter Bae (2021-05-01). "My Chatbot Companion - a Study of Human-Chatbot Relationships". International Journal of Human-Computer Studies. 149: 102601. doi:10.1016/j.ijhcs.2021.102601. ISSN 1071-5819.
- ^ Ta, Vivian; Griffith, Caroline; Boatfield, Carolynn; Wang, Xinyu; Civitello, Maria; Bader, Haley; DeCero, Esther; Loggarakis, Alexia (2020-03-06). "User Experiences of Social Support From Companion Chatbots in Everyday Contexts: Thematic Analysis". Journal of Medical Internet Research. 22 (3): e16235. doi:10.2196/16235. PMC 7084290. PMID 32141837.
- ^ Silk, Joan B. (2025). "The natural history of social bonds". Annals of the New York Academy of Sciences. 1546 (1): 90–99. doi:10.1111/nyas.15318. ISSN 1749-6632. PMID 40101114.
- ^ Baumeister, Roy F.; Leary, Mark R. (1995). "The need to belong: Desire for interpersonal attachments as a fundamental human motivation". Psychological Bulletin. 117 (3): 497–529. doi:10.1037/0033-2909.117.3.497. ISSN 1939-1455. PMID 7777651. Archived from the original on 2025-05-06.
- ^ Eisenberger, Naomi I.; Lieberman, Matthew D.; Williams, Kipling D. (2003-10-10). "Does rejection hurt? An fMRI study of social exclusion". Science. 302 (5643): 290–292. Bibcode:2003Sci...302..290E. doi:10.1126/science.1089134. ISSN 1095-9203. PMID 14551436.
- ^ a b Song, Hayeon; Zmyslinski-Seelig, Anne; Kim, Jinyoung; Drent, Adam; Victor, Angela; Omori, Kikuko; Allen, Mike (2014-07-01). "Does Facebook make you lonely?: A meta analysis". Computers in Human Behavior. 36: 446–452. doi:10.1016/j.chb.2014.04.011. ISSN 0747-5632.
- ^ Xie, Tianling; Pentina, Iryna; Hancock, Tyler (2023-01-01). "Friend, mentor, lover: does chatbot engagement lead to psychological dependence?". Journal of Service Management. 34 (4): 806–828. doi:10.1108/JOSM-02-2022-0072. ISSN 1757-5818.
- ^ Herbener, Arthur Bran; Damholdt, Malene Flensborg (2025-02-01). "Are lonely youngsters turning to chatbots for companionship? The relationship between chatbot usage and social connectedness in Danish high-school students". International Journal of Human-Computer Studies. 196: 103409. doi:10.1016/j.ijhcs.2024.103409. ISSN 1071-5819.
- ^ De Freitas, Julian; Uguralp, Ahmet K.; Uguralp, Zeliha O.; Puntoni, Stefano (2024-07-09). AI Companions Reduce Loneliness. arXiv:2407.19096.