
Q&A: Proposing a right to cognitive liberty

Originally published in Duke Law Magazine

Since her new book came out in March, Nita Farahany JD/MA ’04 PhD ’06 has been a frequent presence on the national and international stage, making numerous media appearances and presenting at the World Economic Forum in Davos, TED2023, South by Southwest, and many other venues. 

The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology (St. Martin’s Press) details how rapid advances in neural interface technology are making it possible to decode brain data, and how that development necessitates a new understanding — and encoding — of a human right to cognitive liberty, or mental privacy. 

The book’s release coincided with the broad release of ChatGPT, the artificial intelligence chatbot that can produce logically consistent answers to simple questions and has intensified fears of an AI-driven future. But it’s the culmination of more than a decade of thinking on the ethical, legal, and social implications of emerging technologies for Farahany, the Robinson O. Everett Distinguished Professor of Law and professor of philosophy, who is also the founding director of Duke Science & Society and faculty chair of the Duke MA in Bioethics & Science Policy.

At an April 10 author celebration sponsored by the J. Michael Goodson Law Library and Office of the Dean, Farahany discussed the ideas in The Battle for Your Brain with David Hoffman ’93, a senior lecturing fellow at the Law School and the Steed Family Professor of the Practice of Cybersecurity Policy at Duke’s Sanford School of Public Policy. The following is an edited transcript of their conversation.

DAVID HOFFMAN: Can you give us a quick overview of the book?

NITA FARAHANY: The quick and dirty is that neural interface is going to become one of the primary ways we interact with all the rest of our technology. It’s going to be in our everyday devices, and it’s already happening; it’s just a question of scale. The book goes through each of the different contexts in which it’s already happening, and it builds across bioethical dilemmas in each chapter. 

The right to cognitive liberty as a human right is what I propose to enable us to have self-determination over our brains and mental experiences. The right to self-determination includes the right to access our own brain activity, to enhance it and to change it, even to diminish it. 

“The ability to reclaim our brain health and wellness and to be able to solve neurological disease and suffering, or at least have major new ways to address it, is incredibly promising.”

DH: One of the things that you and I both share is an inherent optimism about the power technology can have to improve people’s lives. Where is this technology headed and what kinds of things can it do?

NF: Your brain health and activity are really fundamental to what it means to be human. And so the next big market and big thing is to track brain activity. As the sensors have gotten much better and can be embedded in earbuds and headphones and even little wearable tattoos behind your ear, people can track things like their focus, their attention, their emotions, or if they’re going to have an epileptic seizure, minutes or hours beforehand.

From migraines to depression to Alzheimer’s dementia, all of these things have early neural signatures that can be detected. And the basic brain states that can be decoded, or meditation that can be enabled, are pretty extraordinary, even with the devices where they are today. Once people start wearing them in much greater scale, there’ll be even bigger data sets from which AI can use pattern recognition to learn even more. The ability to reclaim our brain health and wellness and to be able to solve neurological disease and suffering, or at least have major new ways to address it, is incredibly promising.

And that doesn’t even begin to touch on the ways in which we can so much more seamlessly interact with the rest of our technology. People who’ve used neural interface to play video games or move a cursor, people who have neurodegenerative diseases and are able to connect with those devices, say it’s revolutionary. It’s a fundamentally different way of thinking about how we interact with the world around us.


DH: Could you talk a little bit more about the advances in analytics and AI and how they intersect? What does that mean for the benefits that could be created from some of this data?

NF: On one hand, I would like for all brain data to be kept on the device and overwritten and nobody else to have access to our brain data, because once they do the risks really become profound: the risk of employer misuse, the risk of corporate misuse and commodification to microtarget advertisements to us, the risk of government use and misuse of the data. 

On the other hand, 55 million people around the world suffer from dementia and 60% to 70% of them suffer from Alzheimer’s disease. More than a billion people suffer from a mental health or drug use disorder, and 300 million people suffer from depression. These are profound tolls on society. Real-world, everyday engagement with brain sensors while we go about our everyday lives would truly be a treasure trove of data in the hands of scientists and researchers who are applying it for the common good. 

So how do we collect brain data in ways that put it in the hands of researchers and scientists, but not in the hands of corporations who can commodify and misuse it, or governments who can peer into it? I believe cognitive liberty is a starting place because it flips the terms of service in favor of individuals. I propose that if you have a right to mental privacy, there has to be a bona fide legal exception to gain access to that brain data, and it has to be narrowly tailored for a specific use case and purpose. And it has to truly be justified based on the nature of the intrusion relative to the interest of the individual. 

[Image: brain on a smartwatch]

DH: What will the consequences be if we don’t do something to protect people’s interests?

NF: Already in workplaces worldwide, brain activity is being monitored for things like attention and fatigue levels. Corporations are already using it to do neuromarketing. Governments are already interrogating the brains of criminal suspects to see their recognition of different salient aspects of crime scenes. 

Governments are investing in brain biometrics so that when you go to the airport, instead of having your irises scanned, you would sing a little song in your head, and that could be decoded to authenticate you, which would be a very secure form of authentication. But that means that you have government access to the brain, and in the same way that governments already subpoena Fitbit and Apple Watch data to convict people of crimes, you can be sure they will collect brainwave data if it’s accessible to them. 

The last and creepiest is cognitive warfare. The risk of hacking into these devices and manipulating them, especially as you start to think about generative AI and the closed loop that’s being created with them, is chilling. 

We could go on all day about the terrifying and dystopian aspects of it, because I think this could be both incredibly empowering and potentially the most oppressive technology that we’ve ever unleashed on society.

“This technology has arrived, but it hasn’t gone to scale yet. So we have a moment before that happens.”

DH: How could the right to cognitive liberty have us bend more towards utopia than to dystopia?

NF: This technology has arrived, but it hasn’t gone to scale yet. So we have a moment before that happens. And I believe we should recognize a right to cognitive liberty, which requires updating the international human right to privacy to include mental privacy; that expansion of our understanding of freedom of thought is important and powerful. 

There’s this idea that undergirds most of human rights law — the right to self-determination. I think we should explicitly recognize an individual right to self-determination in an age in which our brains and mental experiences and so much of our bodies can be accessed and manipulated and changed by others: a right for us to have autonomy and to exercise that dignity over ourselves. 

I think recognizing it both explicitly and also starting to understand it as a liberty interest would go a long way. And my hope is that by giving a framing to it, by giving a conversation to it, by giving a thing that we can rally around both in law and principle, that it will help shape norms and what we expect from corporations and what we expect from governments going forward.