"We don't need new neurorights, we should develop the ones we have"
Jan Christoph Bublitz, jurist and keynote speaker at the IDP Congress

Jan Christoph Bublitz is a legal scholar at the University of Hamburg who specializes in criminal and human rights law, legal theory, and the intersections of law, philosophy, and cognitive science. He currently leads two interdisciplinary research projects: Hybrid Minds, which examines the legal and ethical implications of merging artificial and organic intelligence through neurotechnologies, and another focused on the return of psychedelics to medicine. He also co-directs the Freedom of Thought Research Network.
On 3 July, Bublitz will deliver the keynote lecture "Neurotechnology and International Human Rights" at the IDP Congress on Neurotechnology and AI: Limits and Frontiers, organized by the Universitat Oberta de Catalunya (UOC).
What are neurotechnologies and why have they become a pressing issue for legal scholars?
Neurotechnologies are devices that measure or alter brain and nervous system activity. Examples range from brain imaging scanners for diagnosing disorders to deep brain stimulators permanently implanted inside the brain. The latest developments include brain-computer interfaces that connect brains to machines, for instance, to enable paralyzed people to communicate.
Neurotechnology is not new, but recent AI-driven advances have significantly expanded its potential. Devices are increasingly deployed in medical and psychiatric contexts, and some are even sold to consumers for non-medical purposes such as gaming, wellness and cognitive enhancement. The consumer market is expected to grow substantially in the coming years.
What makes neurotechnology philosophically and legally interesting is its potential to breach the "inner citadel" of the mind. It allows technological access to mental states and offers interventions which promise to be more precise than drugs. This may vastly expand the currently limited powers over people's minds. This raises concerns, especially because some companies in the field have questionable records on privacy in other digital technologies. The ambitions of firms like Elon Musk's Neuralink to "unlock human potential" even in healthy people demand ethical and legal scrutiny.
What human rights challenges do neurotechnologies pose? Are today's legal frameworks sufficient?
Historically, those in power have often tried to influence people's minds, from religious inquisitions to totalitarian regimes. Today, neurotools are already used in criminal interrogations and even involuntary treatments in some countries. Such uses, while still limited, are technically feasible and underscore the need for protective regulation.
Similar risks also arise from private companies. Picture this already technically feasible scenario: your brain activity is monitored while you play a game or browse online, subtly testing your reactions to specific stimuli (say, red cars). This may reveal preferences and desires that you are not fully aware of yourself, which can of course be used for personalized advertising and other forms of influence. While human rights law doesn't bind companies directly, governments must protect individuals from such practices.
Yet neurotechnologies also hold promise for people with physical or mental disabilities. The Convention on the Rights of Persons with Disabilities requires governments to promote accessible assistive technologies. Ensuring access to medical neurotechnologies is a human rights issue in itself.
Do we need new rights like "neurorights" or can existing ones be adapted?
The idea of "neurorights" is gaining traction, but I find it misguided. It implies that existing rights are insufficient, which I believe is incorrect. Rights like freedom of thought and mental integrity already exist and should be strengthened, not replaced.
Freedom of thought, formulated after WWII, encapsulates the idea that certain parts of the human mind must in principle remain off limits. Mental integrity, introduced in the Americas in the 1960s and adopted by the EU in 2000, similarly provides holistic protection to people.
The neurorights discourse dangerously ignores the existence of these rights and creates problems where there are none while distracting from the real challenges. Rather than creating new rights with potentially weaker protections, we should focus on interpreting and enforcing existing ones in the context of neurotechnology. This is the strategy that international bodies like the UN are pursuing, and I think they are right to do so.
How can we legally distinguish between influence, manipulation, and coercion in brain-computer interfaces and enhancement tech?
Drawing clear lines between persuasion and undue influence is notoriously difficult. The EU struggled with this in its AI Act. Definitions will always be context-dependent. Relevant criteria include how much control over and awareness of an intervention users have, whether it bypasses rational thinking, and more generally, whether it respects them as self-controlled beings or exploits their weaknesses. The aims of the interventions are also relevant. While many grey areas exist, I argue for a default prohibition on non-consensual neurointerventions into minds, allowing exceptions only under extraordinary circumstances.
Why is freedom of thought gaining renewed importance in the neurotech era?
Freedom of thought is a paradigmatic Enlightenment ideal: thinking for oneself, independently of authorities such as church, state and public opinion. Today, this right should set boundaries against interventions into thought, including the unauthorized revelation or alteration of thoughts and thinking. Neurotechnologies that could read or manipulate unexpressed thoughts must be strictly regulated. The right to freedom of thought, properly interpreted, could serve as a powerful safeguard against such invasions. This is the promise and the potential of the right, and I hope that human rights bodies and courts will adopt this interpretation.
What role should international organizations and courts play in regulating neurotechnologies?
UN bodies can shape how rights are interpreted, which is especially important for the currently underdetermined rights to mental integrity and freedom of thought. Recent efforts by the Advisory Committee of the UN Human Rights Council, UNESCO, and the OECD are positive examples. While courts ultimately define rights, it may take years for cases to reach them.
In the meantime, international organizations can fill gaps by issuing soft law guidance, interpretive frameworks, and best practices. Though non-binding, these tools can influence national laws and shape stakeholder behaviour.
How does the return of psychedelics to medicine intersect with cognitive liberty and regulation?
Psychedelics like psilocybin and MDMA may soon be used to treat depression or PTSD. Countries like Switzerland and the Czech Republic already allow their medical use. While ethical concerns exist, they are manageable. Fears of biopolitical control seem misplaced since these therapies are voluntary and target severe conditions.
The shift in perspective, from strictly scheduled drugs to sources of healing, is mainly a result of recent medical research. But it was also made possible by advocates of cognitive liberty, the idea of sovereignty over one's own consciousness. Scholars and activists who championed this idea over the last decades helped to create the space for today's breakthroughs.
The intriguing question on the horizon now is whether cognitive liberty extends beyond medical contexts. Biotech start-ups are exploring ways to induce specific mental states without the severe side effects of many of today's illicit drugs. This raises all sorts of questions about whether specific mental states should be encouraged or prohibited, and who gets to make these decisions.
What are your top policy recommendations to balance innovation with human dignity?
First, establish clear and non-negotiable red lines: no non-consensual interventions that alter or reveal thoughts, even if that complicates the work of law enforcement or other authorities. Second, strictly regulate consumer neurotech, perhaps even imposing a moratorium on non-medical brain implants. We might also need new criminal offences for severe interference.
Third, however, avoid fear-driven overregulation. Sensationalist headlines often distort the actual risks, and excessive restrictions could hinder promising developments. Let's not forget that nearly half of us will develop some sort of brain disorder in our lives. Neurotechnology could alleviate this suffering. We must encourage medical innovation while tightly regulating non-medical applications.
Press contact
Núria Bigas Formatjé