The Paris Conference
The Paris Conference on AI & Digital Ethics is a cross-disciplinary and cross-sectoral meeting platform welcoming academics from various disciplines and stakeholders in the development of digital technologies from industry, civil society, and politics. Specialists from the humanities, social sciences, and computational sciences are invited to combine their methods to examine the changes taking place in our societies and to steer the development of technologies towards the common good.
The Paris Conference pursues two goals. As a research conference, it aims to advance the state of the art of research on the ethical, societal, and political implications of AI and digital technologies. As a meeting platform, it offers a space for the international community of experts to gather, foster an open dialogue on major issues underlying the development of sociotechnical systems, and collaborate to address these issues.
Important information
The conference will take place at the International Conference Centre of Sorbonne University, Paris, on June 16th and 17th, 2025. Submissions are welcome until March 15th, 2025, and authors whose papers are selected will be notified by April 15th, 2025. They will be invited to present their paper at the conference and to publish it in the conference’s academic journal.
The conference particularly welcomes transdisciplinary approaches, hybrid methods, and cross-sectoral collaborations. At a minimum, the main author should hold a PhD degree or be a PhD candidate, and have a research function in either an academic or a private organisation.
When submitting an abstract, all contributors acknowledge that they are expected to physically attend the conference to present their paper, should it be selected, and agree to provide the organisers with a full version of their paper by July 31st, 2025.
Scope
Academic and industry researchers are invited to submit a contribution in one of the following disciplines or a related field.
• Computational philosophy and social sciences
• Ethics and moral philosophy
• Political theory and political science
• Statistics and computer science
• International relations
• Law
The focus of this 3rd edition is on the threats to political systems and emerging solutions to rebuild trust in more resilient societies. Both fundamental research with foreseeable implications and more applied research with direct impacts on public and private organisations are welcome. Proposals should address topics related to this theme and be explicitly linked to one of the four following tracks.
1. Controlling cyber-influence and manipulation strategies: The digitalisation of societies, organisations and social interactions has encouraged the development of a wide range of strategies to persuade, manipulate and influence individuals and populations. This track welcomes contributions that investigate all kinds of techniques (e.g., persuasive technologies/captology, nudging, psychological conditioning, subliminal stimuli, chatbots) that exploit human psychology and cognitive vulnerabilities to modify beliefs, behaviours and attitudes at the individual or collective level. Typical contributions may examine the ways social influencers grow and use their influence on social media, the design features (e.g., affordances) and psychological mechanisms (e.g., attention capture and distraction) that digital platforms leverage to influence users’ behaviour, cyber-fighters’ strategies to bypass security protocols by playing on human vulnerabilities, and sophisticated strategies to influence public debates and political elections. Papers examining practical use cases, documenting emerging techniques or exploring futuristic and high-potential strategies are welcome, as are those proposing methodologies and metrics to assess the performance of such techniques, and ethical frameworks to govern them.
2. Countering information manipulation techniques: In the context of rising tensions between populations, information has become a key instrument of power to discredit political, ideological and institutional opponents. This track welcomes contributions that investigate the challenge of informing people in this new era, examine the difficulties of moderating online misinformation, and explore novel solutions to rebuild social trust around common references. Typical contributions may document emerging techniques and coordinated strategies to spread disinformation on social media, including generative AI solutions, examine the psychology of the key actors involved in misinformation (e.g., producers, spreaders, believers, gatekeepers), assess the impact of disinformation campaigns on political polarisation or on support for political decisions, suggest original moderation frameworks for social media platforms, such as community review systems, and examine the mutations of the information-provider market, questioning the future of journalism.
3. Exploring the potential of blockchain to renew civic trust: The continuing erosion of people’s trust in their institutions and governments is one of the greatest challenges of our era, and many see blockchain architectures either as a way to renew such trust or as a way to transition towards a society where the need for trust disappears. This track welcomes contributions that investigate how blockchain solutions could help address the current challenges faced by the AI ethics community in ensuring reliance on AI systems and renewing civic trust in institutions. Typical contributions may include practical use cases and prospective reflections on the potential of blockchain protocols and distributed governance models, including relevant incentives and decision-making processes, to increase transparency in the development and auditing of AI systems; to organise collective work, reward contributors and distribute intellectual property; and to enable deliberative processes and voting architectures capable of contributing to the renewal of trust in our societies.
4. Rebuilding social cohesion in AI-powered societies: In addition to suffering from a crisis of civic trust, societies are increasingly divided into populations between which communication erodes and violence increases. This track welcomes contributions that investigate the civilizational causes of these issues, the impact of digital technologies, and the potential of original technologically mediated solutions to reverse these trends. Typical contributions may include theoretical work on the anthropological roots of distrust and escalating violence in modern societies, original communication paradigms enabled by digital and AI solutions to encourage more respectful and empathetic interactions on social media, evidence-based techniques to reduce hate speech and de-escalate conflict in social environments, as well as alternative algorithmic recommendation systems to orient people’s attention towards meaningful content in digital environments.
We invite researchers to submit a 500-word abstract in English, followed by a short bibliography (5 references maximum), by March 15, 2025, at 11:59 p.m. CET. Abstracts should be submitted on the conference’s website (here) and comply with this Word template.
Abstracts submitted via a different channel or failing to comply with the template’s format will not be considered.
Authors of the selected abstracts will be responsible for submitting the full version of their paper (5,000 words, ± 10%), incorporating feedback from the discussions at the conference. Final papers must be submitted on the conference’s website by July 31st, 2025, at 11:59 p.m. CET.