Brain-Computer Interface: increasing use, risks, incomplete legal situation

Marie-Claire Koch
Jochen Lennerz

(Image: Marie-Claire Koch / heise medien)

BCIs can help restore motor skills and communication. The potential is enormous, but the legal situation and ethics are still largely unresolved.

Reading thoughts, storing them, playing them back – what sounds like science fiction is already a reality in initial applications. Brain-computer interfaces, or BCIs for short, can read brain signals, translate them into digital data, and feed signals back into the brain. Paralyzed patients use them to operate virtual keyboards solely through the power of their thoughts. People who were completely locked in their own bodies can suddenly communicate again.

Pathologist Dr. Jochen Lennerz described these developments impressively at the Digital Health Innovation Forum of the Hasso Plattner Institute, while simultaneously issuing an unambiguous warning about the consequences. This is because the technology raises questions for which society, politics, and regulation have so far found few answers. “If you define the term functionally: for example, as controlling a cursor or a selection movement, then such applications are already possible today,” Lennerz told heise online.

According to Lennerz, BCIs could offer therapeutic opportunities and fundamentally change the relationship between humans and technology. In the future, it is conceivable that experiences could be shared directly or perceptions read out. This brings the technology closer to what was previously considered science fiction. “The potential of the technology is significantly greater than its current state, but we are still in an early stage of development,” said Lennerz.

According to Lennerz, scenarios could be imagined where soldiers control drones or robotic systems directly by thought. It could also become possible to extract visual information from the brain – that is, to analyze perceptions that the person themselves might not even be aware of.

This could lead to new forms of military reconnaissance and control that go far beyond today's technologies. BCIs would not only restore lost abilities but could also specifically enhance human performance. This could lead to global competition, with states trying to secure early access to these technologies.

The handling of brain data is a particularly sensitive area. Unlike traditional medical data, so-called neurodata potentially contains far more information than is currently used. Systems often access only a small part – for example, to control a cursor. “However, the raw data often contains significantly more information than is used for the respective application [...] This latent information raises questions about secondary use, because it is not always clear what content could potentially be derived from the data.”
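The asymmetry Lennerz describes – an application consuming only a fraction of the recorded signal while the full raw stream is retained – can be made concrete with a small sketch. This is purely illustrative: the channel count, channel indices, and the linear mapping are invented for the example and do not correspond to any real BCI system or vendor API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical recording: band-power features from 64 electrode channels
# at one time step. In a real system this stream arrives continuously.
N_CHANNELS = 64
features = rng.normal(size=N_CHANNELS)

# A cursor decoder might only tap a handful of motor-related channels
# (indices here are arbitrary) and map them linearly to a 2D velocity.
CURSOR_CHANNELS = [3, 7, 12, 19]
weights = rng.normal(size=(2, len(CURSOR_CHANNELS)))  # features -> (vx, vy)

cursor_velocity = weights @ features[CURSOR_CHANNELS]

# Everything else is recorded but unused by this application -- the
# "latent information" that raises the secondary-use questions above.
unused = [c for c in range(N_CHANNELS) if c not in CURSOR_CHANNELS]
print(f"channels used for control: {len(CURSOR_CHANNELS)} of {N_CHANNELS}")
print(f"latent channels retained in raw data: {len(unused)}")
```

The point of the sketch is only the ratio: the decoder's output is fully determined by four channels, yet the stored raw data keeps all 64, and what could later be derived from the other 60 is exactly the open question.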

“Another open point,” according to Lennerz, “concerns data protection beyond death.” Brain data is considered highly personal. Under current data protection law, the direct protection of personal data expires upon death, while specific regulations for neuro-related data are still lacking. This creates a legal gray area where the use, transfer, and secondary analysis of such data, in particular, remain unclear.

Risks have also emerged at the individual level. In one case, a patient developed a strong emotional attachment to the device after implantation. After its removal, she reported a massive sense of loss.

At the German Ethics Council hearing “New Neurotechnologies – Ethics, Law, and Society” [1], Chemnitz psychology professor Bertolt Meyer, who himself wears a bionic prosthesis, classified BCIs as “human augmentation technologies” – invasive, device-based technologies for enhancing human capabilities. As an example, he showed, among others, Neuralink's Pong-playing monkey [2]. Societal acceptance depends crucially on whether a technology serves to restore abilities or to enhance performance. Added to this is the risk of stigmatization and an increasing tendency to want to solve societal problems primarily with technology.

Furthermore, Dr. med. Philipp Kellmeyer, Junior Professor for Responsible AI and Digital Health at the University of Mannheim, warned in the hearing of new forms of pressure to adopt such technologies, of self-modification, and of growing dependence driven by consumer-oriented neurotechnologies. He advocates taking mental integrity seriously as an independent protected interest and systematically integrating participatory processes into development and regulation. Kellmeyer and other researchers have also called for a moratorium [3]: no implantable non-medical BCIs as long as their effect on the human mind is not sufficiently understood. Legal scholar Dr. Christoph Bublitz from the University of Hamburg, who joined the call for the moratorium, also pointed to unanswered questions about psychological effects and freedom of thought.

The potential military use is particularly critical. According to Prof. Marcello Ienca [4], Professor of Ethics of Artificial Intelligence and Neuroscience at the Technical University of Munich, the first BCIs were originally developed for military purposes in the 1970s and 1980s. According to Ienca, China has been building specialized structures for cognitive warfare since 2023, which also include neurotechnologies.

When asked, Lennerz also puts the role of major industrial players like Neuralink, whose BCI has already been implanted in over a dozen people [5] (21 as of the end of January 2026), into context. “Neuralink, like other industrial players, is part of our Collaborative Community.” There, competitors consciously work together to create common foundations, “because they recognize that certain challenges are easier to solve together.”

According to Lennerz, Brain-Computer Interfaces could become one of the most defining technologies of the 21st century – with implications far beyond medicine. The crucial question is no longer whether they will come, but how their use will be shaped, regulated, and controlled.

Sign up for the KI-Update newsletter [6]

(mack [7])

Don't miss any news – follow us on Facebook [8], LinkedIn [9] or Mastodon [10].

This article was originally published in German [11]. It was translated with technical assistance and editorially reviewed before publication.


URL of this article:
https://www.heise.de/-11228431

Links in this article:
[1] https://vimeo.com/1175961893?fl=pl&fe=cm#t=25m30s
[2] https://www.heise.de/news/Makake-spielt-Pong-mit-Gedankenkraft-ueber-Gehirnchip-von-Elon-Musk-6010016.html?from-en=1
[3] https://pubmed.ncbi.nlm.nih.gov/41104262/
[4] https://vimeo.com/1175961893?fl=pl&fe=cm#t=2h36m49s
[5] https://www.heise.de/news/Zwoelf-Menschen-haben-inzwischen-Musks-Neuralink-implantiert-10640763.html?from-en=1
[6] https://www.heise.de/newsletter/anmeldung.html?id=ki-update&wt_mc=intern.red.ho.ho_nl_ki.ho.markenbanner.markenbanner
[7] mailto:mack@heise.de
[8] https://www.facebook.com/heiseonlineEnglish
[9] https://www.linkedin.com/company/104691972
[10] https://social.heise.de/@heiseonlineenglish
[11] https://www.heise.de/news/Brain-Computer-Interface-zunehmender-Einsatz-Risiken-lueckenhafte-Rechtslage-11226529.html