Transcript for Eben Moglen’s Keynote at re:publica 2019

June 17, 2019

On Tuesday, May 7th, 2019, SFLC President and Columbia Professor of Law Eben Moglen delivered a keynote address titled Why Freedom of Thought Requires Attention at the re:publica 2019 conference in Berlin, Germany. We are pleased to announce that the transcript of this keynote address is now available.

Video: You can find a recording of the keynote here.


This work is licensed under Creative Commons Attribution-ShareAlike 4.0 (CC By-SA 4.0).

This transcript was largely written by Pedro de las Heras Quirós. The Software Freedom Law Center thanks him for his work.

Title: Why Freedom Of Thought Requires Attention

Speaker: Eben Moglen

Date: Tuesday, May 7th, 2019

Location: Berlin, Germany

Conference: re:publica 2019, which asked speakers to discuss the theme “tl;dr,” internet slang for too long; didn’t read.


Good afternoon. It’s a pleasure and an honor to be back at re:publica.

Seven years ago I came here to talk about how freedom of thought requires media that don’t listen and watch and surveil us as we use them. And why media that consumes us in the end consumes freedom of thought.

Three years ago Mishi Choudhary and I came here to talk about the last kilometer of the net, the part that is closest to you, which had become the behavior collection system embracing humanity and had begun the process of making surveillance and autocracy perfect.

Now we are approaching the end of the most fateful decade for the human mind since the French Revolution, and everybody is beginning to notice something. And our theme this year for all of us is what is happening to the nature of the human mind.

We built a new neuroanatomy for the human race, one which provides an exoskeleton nervous system for all the billions of us. It begins the process of turning us, all of humanity, into a superorganism.

A process that many of us old enough to have been around at the beginning of all of this welcomed for the possibility of the liberation of the human mind that it presented. Every human brain learning everything that it wants to learn without being constrained by cost or power. A revolution which we sought to bring about with free/libre software, with inexpensive hardware, with the bandwidth that could allow every brain on earth to learn. And we have been for the last decade coping with the horrid unexpected or at least unintended consequences of that revolution.

So we built this network that embraces every human mind. Within another generation every brain of every human being on earth will be formed directly in connection with the net from birth onward.

Every human mind will develop in the context of all the other human minds connected to the net and all those non-human agents beginning to develop inside that nervous system we are building.

But we turned our net dark on ourselves when it came time to articulate the physiology that that anatomy makes possible. So we wired up everything so that every human brain could access everything. But we built media that consumed us as we consumed it.

My friends in the Soviet Union, when there was a Soviet Union, knew, as they used to say, that the only free cheese is in a mousetrap, which proves of course that the Union of Soviet Socialist Republics was not a bad place to learn the fundamentals of surveillance capitalism. That’s what we did, right? We turned the network backwards on itself so that the services, first telecommunications, storage, time management, search, etc. etc., all the services became the cheese on the stage of the mousetrap, offered for free in return for information about each human being in real time and the opportunity to push.

The neuron at the end of the network, the one closest to you most of the time, is a spy satellite: the densest collection of sensors, gram for gram, ever engineered by human beings. We have stuff in orbit that’s pretty dense with sensors.

The Americans used to brag during the Cold War that we had things in orbit that could take a picture of a golf ball on the surface of the earth. But gram for gram, that thing you’re taught to call a smartphone—it’s smart and it’s just a phone, so don’t be afraid of it—gram for gram that smartphone is denser with sensing than the spy satellite in orbit, and it’s all aimed at you. That’s the neuron closest to you in the net most of the time. It’s very cheap to take pictures of golf balls now. We don’t need to put a thing in orbit to take a picture of a golf ball on the ground: everybody has a little camera and they take a picture of their own golf ball and they put it in the net and we have it instantly.

That network, equipped to sense the human being at its end more completely than anything else, carrying a battery that gram for gram stores more power than a nuclear power plant, because of course you need a lot of energy for all those sensors, that neuron creates a network whose purpose now is to acquire your behavior, as much of it as possible. And to offer you services that justify the degree of information you are giving up in return for the services you are receiving—pretty amazing services, if you consider email and calendaring and stuff that any little box can do to be amazing services—what is really amazing is the price at which they are delivered to you, which isn’t zero. It’s the cost of who you are. The surrender not just of what you know about yourself but of all that can be inferred from your behavior, sensed all the time.

And then the opportunity to push back at you. The computers that we use now push us very hard. Tens of thousands of times a day they show us advertisements and measure our response. In a study of young people in the US, the phone gets physically interacted with more than 2,000 times a day. In the past history of the human race, if there were a thing in your pocket that you touched thousands of times a day, it would be assumed to be the result of obsessive-compulsive disorder. Now it’s normal.

This is the network within which in one generation every human mind will be formed from birth. Physiologically this is not good for human beings. Intellectually this is not good for the human race.

Economically it makes a ton of money. Shoshana Zuboff, in her masterpiece last year, The Age of Surveillance Capitalism [1], offered an account of the political economy that now drives this network. The structure of selling human attention by rapid, I mean really rapid, auctions conducted trillions of times a day to reach billions of people to make them behave.

It’s very important to recognize that once behavior collection is what the network really does, then what it attempts to maximize is behavior. This lies beneath the increasingly important intellectual discipline referred to as persuasive design. That is, the design of hardware and software to cause you, to persuade you, actually to addict you, to registering your behavior through the sensing system: clicking, swiping, moving.

Behavior is what the network now metabolizes. It eats yours. And it lives off that behavior that it acquires. It emits stimuli designed to cause more behavior.

Persuasive design, we may say, is a bunch of geeks doing with math and science what Steven P. Jobs did by art, right? Persuade you to hold it, to touch it, to interact with it, to sense your wishes and your needs in it, to deliver your anxieties to it. This is what we mean when we say it’s so convenient.

That’s the physiology of the network now: it metabolizes behavior and it emits stimuli for new behavior. It is in that sense parasitic: it lives off us; it makes a good living. But it also carries the knowledge of all that human behavior acquired. It has a view of the nature of human existence which is very special. Embracing [both] the most specific possible awareness of your life: how you feel, how you act, what you do, how fast you move, what you’re looking at, what you’re thinking about—at least so far as the search box and your swiping behavior can tell them.

[It does it] at the same time that the network contains a synoptic view of human behavior: billions of people, all behaving, all adding up their behavior to everybody else’s behavior to offer a picture to the Machine of the large-scale structure of human activity. From the very smallest—your momentary reflexes, your momentary movements of attention—to the very largest questions—how do humans behave, what are the patterns we can find there, how can we recognize them.

That is after all what we call machine learning now: recognition of pattern. How can we recognize the patterns of human behavior on the largest scale, from the smallest to the largest, from that which is precisely descriptive of you to that which is immanent to being human, all collected all the time, nothing ever lost or forgotten.

In the whole long history of the human race we’ve had really only one way of describing that view, from every sparrow falling to the destiny of the world. We called it, when we thought of it at all, the mind of God. And so what have we now? What have we wrought? What is this network in which we are now wrapped everywhere all the time? The bad news is it’s the parasite with the mind of God.

So where are we? What is our relationship to this network now? We behave for it. What the network needs, what it takes from us, what it drinks, its substance is our behavior. And it makes more of it. The human attention system, the quality of our distinctively human minds, that ability to focus and to think about something, to hold it in a mental space of our own, is where the network chemically alters who we are.

The ability to pay attention is a thing we learn, partially in utero, partially as a descent from those who came before us, but in every human organism a learned, acquired skill.

Those who work on understanding the psychology of human infants can point to the ways in which the attachment between the caretaking adult human and the infant generates the nature of attention. But the machine’s view of our attention, well, it’s a raw material, a resource for extraction.

All those artifacts sensing you, stimulating you, producing behaviors so they can acquire the behavior thus produced, conscript attention. We build the things so that they are good at conscripting your attention. They make attractive noises, they vibrate in fulfilling fashion. They interrupt the stream of internal life in order to deliver now this very important message about the job your friend just got, or the vacation your pal just took, or here is an advertisement, do you want to click on it? But after all, of course, the stream—stream is the right word, it’s the continuing, indestructible, uninterrupted, natural-as-water-falling-down-the-side-of-the-hill stream of what comes to you—drowns the inner dialog, removes interiority, changes your understanding of time: too long, didn’t read. How long is too long? As long, for example, as Tolstoy’s Voyna i Mir? (That’s the Russian title of War and Peace.)

The end of the 20th century, it turns out, was the Golden Age of reading. Access to almost everything, and the mental facility to hold all that reading. Too long; didn’t read War and Peace? Yes, I can believe that. But when I was a college senior, in a half-year modern novel course we read Moby-Dick and War and Peace and Anna Karenina and Ulysses and One Hundred Years of Solitude in fourteen weeks. We did because we could, we did because we were trained, because we were lucky, because we were privileged people in a world which allowed us to develop our minds that way. But also, though we didn’t know it, we couldn’t have conceived, even though I was then developing programming languages at IBM and I thought I knew a bunch about technology, we couldn’t have foreseen that the people who came after us would be crippled in acquiring that ability by the incessant disturbance of the machines seeking attention.

What is happening to our attention span is now an object of scientific study. Last month in Nature Communications, four authors led by Philipp Lorenz-Spreen documented, by multiple longitudinal studies of everything from movie ticket sales and book publications to Twitter hashtags and Google searches, the extent of the shortening of attention span over decades. [2]

The dynamic acceleration of the process of competition for attention, accompanied by a simple mathematical model of the nature of that dynamic shortening of attention. We can measure the extent to which we are fracturing the coherence of the human mind. And we can feel it when we watch our own interactions with the people and the network around us.

I admit to being excessively old-fashioned. I run a lot of servers—fourteen of them—and I have a lot of endpoint stuff around me—thinkpads, chromebooks without Chrome on them—and all sorts of similar things. And never, never, never, never at all ever do I see an advertisement on a computer. That’s the way I run my stuff, right? Because I can’t help it. Because I started using computers when I was twelve, in 1971, and the gravitational consequence of having grown up computing without disturbance leaves me hypersensitive to efforts to control my thinking from yet another message, from yet another friend, or “friend”.

So I’m speaking out of a condition of ignorance and helplessness with respect to how you all manage this. I don’t know how you can. It’s a triumph, an extraordinary victory, that you retain the ability to think despite all that disturbance. If your sleep were as disturbed as your waking life, it would be sleep deprivation torture. Do you wake somebody up several hundred times a night to tell them something? You’re hurting them. Everybody knows that, because the physiological relationship to sleep is so demanding. But something bad is happening to you when the lights are on too, and you triumph over it because you were born on the cusp of this, and your habits, though they may not be as hyper-retrograde as mine, are still the habits of people whose minds were formed not in relation to the net. You still maintain a sense of the purpose of human thought which assumes that your thinking is for some purpose other than the creation of measurable behavior. But pretty soon you’re going to be outnumbered. Because that’s not what it feels like to be an infant connected to this net from the beginning of life.

I have a street photography project in New York that I call the 21st century Madonna. On the street I photograph people carrying an infant in a front carrier and the adult is looking at the phone and the infant is looking at the adult looking at the phone. That’s the birth of a new cultural system in the human race spreading faster than Christianity or Islam or any other fast-moving cultural innovation in the history of humankind, changing who we are by changing the purpose of our thinking.

You may notice there are also some consequences for politics. That is to say, we are experiencing a decline in public rationality. We are experiencing it not merely in one society or another. We have not merely Trump but also Brexit. We are experiencing an attack upon our own ability to deliberate. It’s hard, it is, when we increase the flow of information, when we maximize the accessibility of everything. The problem of ascertaining truths becomes harder: there’s more to sort through, that is correct. But the most important change is not the volume of the material requiring analysis. It’s the time in which analysis needs to occur. The space in which it happens.

In the European world we came to our interior through the Reformation. We came to the nature of how we understood our inner lives through a change in the religious ideology of the societies of the north of the European Peninsula, the ones where the word, where literacy driving religious contemplation privileged the individual over all other coercive power in society. Literacy drove internality and internality drove democracy: the sense of the individual as the unit of decision-making.

We are now terrified of algorithms, and we are justifiably spending a fair amount of time on the social consequences of this or that way of automating decision-making in organizations. We are ironically less concerned about the consequences of non-algorithmic thinking, impulsive thinking, merely responsive thinking in the individual human brain, increasingly given over to the activity of swiping and clicking and watching random pieces of the passing show go by through the status updates and Instagram commentary and all the other pieces of what we are pleased to call social media.

And what is decaying is that private internal space in which truth is sorted out and the human being decides for herself what is the purpose of the world, what is the meaning of my existence here, how should I live.

Not surprisingly the machine is not interested in how you should live. It’s interested only in how much you behave. Political dialogue becomes about striking poses, acting. Much less about considering, let alone reconsidering, taking something into an internal space of your own and thinking about it over time.

You do win a daily victory, you still achieve that, against the enormous pressure of the devices you keep around you and the way they work. But you cannot keep it up forever. Or rather, like me, you can insist upon holding a particular technological relationship to the machine, you can preserve some mental space of your own. But how do you allow the generation coming up behind yours to inherit that?

So here is our problem. The attention span of the human race collectively is shortening. The precious autonomy of the internal mind that we gained barely half a thousand years ago, and which we used to make democracy and the rights of the individual and the sense of self-development, intellectual and social, which we call freedom, is threatened by our own habits as they take hold in this net we do not want, with these media that consume us as we consume them.

Like the carbon dioxide in the atmosphere, this is a colorless, odorless, tasteless change. And understanding the second-order consequences of this pollution we are polluting our inner selves with is tricky, complicated business. It requires systematic thinking, not overreaction, not panic, not a sense of revulsion or disgust. Not an anti-modernist reaction. Not break all the machines, destroy the network. We built this to liberate humankind. We can’t just take it apart and destroy it because we have allowed it to twist in our hands and bite us back.

We have to fix it. We have a fairly short period of time. We wasted—I don’t mean the people in this room, we are the people who have been worried about this all along—but we collectively have wasted crucial time.

And the clock now runs against us, as it runs against us on the planet. Two great environmental disasters, each requiring our fullest concentration and our broadest social mobilizations. We’re busy. We’re following one another’s feeds. We’re having trouble concentrating on the main events. We’re not so easily able to keep our eyes on the prize. Too long. Didn’t read. I mean, John Locke’s Essay Concerning Human Understanding, for example: much too long, didn’t read. Das Kapital? Way too long. Even in the 20th century it was too long. But we need them. We need all that accumulated rumination we called civilization once upon a time, when we weren’t displacing it with the chemistry of the machine-man relationship.

So we’d better do something about this. Time is growing very short. In fact I wouldn’t really come here seven years on and bother you again about this if I didn’t think it could be fixed. But remember that those of us whose minds were formed outside the context of the net have got to do it. It’s much, much harder to get out than it is never to have gotten in in the first place. My poor students in law school want privacy, but everything they’ve got around them is built against it. It’s easy for me simply to refuse: I don’t have a Google account, I’m sorry. But what do you do when your life was given away at the beginning: your parents put you on the services, you’ve lived your entire life with those services, and you do not know how to use a calendar on the door of your refrigerator anymore.

So we, those of us in that liminal period as the nature of human thought was changing, as the purpose of ideas went from being constructed inward to signalling outward, those of us who live here now, we have to deal with this. We can’t leave this for the future. Not yet are the young blocking the bridges on us to force us to deal with this. Not yet. Which means we better do it now before it gets that bad.

What must we do? We need to change the political economy of services. The people in this room are the producers of the political economy of services, not the beneficiaries. I don’t mean that you’re all the shareholders of it, but you know how it’s made. You ran the pipes, as I did too, once upon a time. We need to make it possible to federate all services in the net. Why is such a technical goal so terribly important? Because when services are federated they are no longer paid for by exclusive access to the internal lives of human beings, and they are no longer justified by the profitability of advertising sales. When we are taking in one another’s washing, when our messaging and our calendaring and our storage, and all the other piddling commodity services that we use, when we are delivering them to one another, cooperatively, we are turning the net back in the direction that we originally wanted, the one in which it acts to liberate us through individual effort to teach and learn. The federation of all services is not an inconceivable idea; most of the services we have were meant to be federated. The net was designed for it. We are undoing problems rather than making terribly complicated inventions.

This is the intended goal of the little gesture I call FreedomBox. The manufacture of simple, inexpensive, self-administering servers that we can hold in the palm of our hand and distribute throughout the world like apple seeds. A great intercontinental activity: the software for FreedomBox is made everywhere from Heidelberg to Seattle, and current versions of the boxes are manufactured in Bulgaria.

Because of course we do have the value of the net to make our free software in, to spread our inexpensive hardware over the world, to do all the things we need to make this neuroanatomy pay off not by centralization but by diffusion. Which brings with it naturally the goal of intellectual self-development at those human endpoints that we were seeking in the first place.

We need to understand that the businesses we have called telecommunications and social media are behavior collection businesses. We need to regulate them not as telecoms or platforms but as behavior collectors. We need to understand the business of behavior collection like the business of coal mining or steel production: as a business, subject to regulation in the public interest on the basis of what it is, not what it claims to be, not what it hopes to be, not what it says on the shareholders’ pamphlets that it is, but what it does to society, which is what regulation is about: what happens to society.

GDPR is the beginning of an effort to think about the regulation of behavior collection, but it still depends upon the idea of consent by individuals to the collection of behavior as the relevant category for the legal administration of this business. Consent is not the heart of behavior collection; it’s an environmental subject: everybody’s actions affect everybody else’s actions, and therefore individual consent should not be the test. What are the rights, what are the standards, what are the social objectives within which behavior collection can be not just allowed but accepted?

We need to appreciate that the goal of the network is not the constant subsidiary mental activity of human beings. The goal of the network is not push, it’s pull. That is to say, all of the effort to make the network operate on internet time is a form of pollution. What we really wanted was for human beings to initiate requests for what they want, what they need, what they wish for, what they think and what they learn.

This was the subject of our effort in making the net and the free software that runs it in the first place. To give users rights. But also to encourage “pro-sumption” [[that is, the combination of production and consumption in one hand, which characterizes the possibility of the digital economy of free software and free culture]], to encourage those who use to scratch an itch, to learn how to make, to become better makers, ultimately to serve others’ needs as well as their own by cooperating in learning and making knowledge.

We need to understand that as a concrete, explicit social goal. The jangling in the net, like the noise in the street, may be a collateral consequence of what we do. We had a transportation revolution too, in fact, in the last two hundred years. They’re very noisy. The world was quieter when it had only horses in it, but we live with the noise for a good reason. It is not clear what the objective of social regulation of the net ought to be unless it is the improvement of human intellectual life. The broadening, to be sure, the maximizing of access, but quality is important too. We are not just seeking to increase the quantity of human thought, although that is where the great big social benefits are, in the extirpation of ignorance. But we are also still trying to improve how we think as well as how much.

And it should be the clear goal of social policy to help us make the net quieter. Our politics depends upon it, I need not say. The health of our democracies depends upon it. The nature of our educational process depends upon it. The freedom of our grandchildren depends on it.

TL;DR is the crisis of the human mind in five characters.

It’s good. We’ve got a clear understanding of the problem. But we need to summon just enough longevity of thought to go about the business of fixing it, which means technology, politics and law driven for the purpose of reanimating human concentration, reducing our attention deficit disorder induced by the nature of our market for services and the character of what we’ve built. We have to do that.

The rising sea level is becoming evident to people, to the point at which denial has become, well, an exercise in deception. What is happening to the human mind is not quite so unambiguously clear just now. Most of the human race is most aware of an increase in access and convenience.

Most of the human race has always lived in a condition of ignorance and cultural deprivation because that’s what we do to the poor. And as Aristotle noted, the poor are always many and the rich are always few. But though we have a clear understanding of how things work—we, us in this room—we are not yet capable of explaining to most of humanity why the benefits they are receiving are coming at a very high and ultimately unsustainable cost.

For those who are hungry, who yearn for knowledge, who yearn for the ability of self-expression, who yearn to reach out and connect with other human beings, and teach and learn from them, which is the human way, for those people who are deprived, free cheese is a good thing, and it’s not their fault that we built the mousetrap and we didn’t tell them.

So we have an ethical obligation, a moral requirement. We have to deal with this. We can. We should. Human nature depends upon us to get it right. Time is running now.

Thank you very much.



