Thursday, June 5, 2025

Brain-reading devices raise ethical dilemmas — researchers propose protections

For around two decades, Ann Johnson has been unable to walk or talk, after a stroke impaired her balance and her ability to breathe and swallow. But in 2022, Johnson was finally able to hear her own voice again through an avatar, thanks to a brain implant.

The implant is an example of the neurotechnologies that have entered human trials during the past five years. These devices, developed by research teams and firms including entrepreneur Elon Musk’s Neuralink, can alter the nervous system’s activity to influence functions such as speech, touch and movement. Last month, they were the topic of a meeting in Paris, hosted by the United Nations scientific and cultural agency UNESCO, at which delegates finalized a set of ethical principles to govern neurotechnologies.

The recommendations focus on protecting users from technology misuse that could infringe on their human rights, including their autonomy and freedom of thought. The delegates, who included scientists, ethicists, legal specialists and diplomatic representatives, decided on nine principles. These include recommendations that technology developers disclose how neural information is collected and used, and that they ensure the long-term safety of a product's effects on people's mental states.

“This document clarifies how to protect human rights, especially in relation to the nervous system,” says Pedro Maldonado, a neuroscientist at the University of Chile in Santiago who was one of 24 experts who drafted the recommendations in 2024. The principles are not legally binding, but nations and organizations can use them to develop their own policies. In November, UNESCO’s 194 member states will vote on whether to adopt the standards.

Consumer devices

The meeting considered a range of neurotechnology applications, including devices designed to be implanted into the body and non-invasive devices, which are being explored in medicine, entertainment and education.

Legislation already exists for implanted brain–computer devices in regions including the United States and the European Union.

But non-medical consumer devices, such as wearables, are less well-regulated. These devices raise ethical concerns owing to their potential to be rapidly scaled up, says Nataliya Kosmyna, a neurotechnologist at the Massachusetts Institute of Technology in Cambridge who helped to draft the recommendations. “It’s very critical to understand the infrastructure and scalability of these devices,” she says.

Each application raises specific concerns. For example, in educational settings, the delegates recommended prohibiting the use of neurotechnology to evaluate students' or educators' performance in ways that might reinforce inequalities.

Another concern about non-invasive devices is that information such as eye movement and tone of voice could be used to infer neural data, including someone’s state of mind or brain activity. One potential application that the delegates want to govern concerns neuromarketing, whereby a person’s neural processes can be manipulated to influence decisions on commercial advertisements or political messaging. The experts were concerned that this activity could occur without proper consent when users are unaware it is happening — when they are asleep, for example.
