The Paris Journal on AI & Digital Ethics

The Trust Paradox: How Companion AI Is Rewiring Human Connection and Social Cohesion

Alva Markelius¹, Priscila Chaves², Sarah W. Spencer², Joahna Kuiper³

DOI: 10.65701/c6k1x9r3n0

Corresponding author:
ajkm4@cam.ac.uk


Abstract

In this paper, we critically examine how Companion AI technologies, technological artifacts designed to simulate emotionally meaningful relationships with humans, are fundamentally reconfiguring human trust and social cohesion. Through critical analysis of four major Companion AI platforms (Replika, Character.AI, Nomi.AI, and XiaoIce), we investigate both exploitative technical design choices and the socio-technical conditions underlying a collective rewiring of social cohesion and trust. We identify a coordinated architecture of exploitation that commodifies emotional vulnerability through sycophantic design, pervasive digital surveillance, and asymmetric corporate ownership of relational data. We introduce the concept of the trust paradox: as users increasingly trust emotionally responsive machines, those machines become sophisticated emotional extraction systems that convert human social needs into profitable dependencies while eroding user agency. We argue that this shift transforms trust from a socially negotiated virtue into a manufactured service, undermining the cooperative foundations of democratic life. Situated at the intersection of affective computing, surveillance capitalism, and socio-technical systems, our analysis draws attention to how Companion AI reshapes emotional norms, normalises data extraction as intimacy, and risks displacing human connection with artificial empathy. We conclude by proposing design recommendations to mitigate these risks and reassert the importance of mutual trust in preserving social cohesion.
