There are at least two ways to believe in the idea of quality. You can believe there's something ineffable going on within the human mind, or you can believe we just don't understand what quality in a mind is yet, even though we might someday. Either of those opinions allows one to distinguish quantity and quality. In order to confuse quantity and quality, you have to reject both possibilities. The mere possibility of there being something ineffable about personhood is what drives many technologists to reject the notion of quality. They want to live in an airtight reality that resembles an idealized computer program, in which everything is understood and there are no fundamental mysteries. They recoil from even the hint of a potential zone of mystery or an unresolved seam in one's worldview. This desire for absolute order usually leads to tears in human affairs, so there is a historical reason to distrust it. Materialist extremists have long seemed determined to win a race with religious fanatics: Who can do the most damage to the most people?
The reason [James Clerk] Maxwell's Demon cannot exist is that it does take resources to perform an act of discrimination. We imagine computation is free, but it never is. The very act of choosing which particle is cold or hot itself becomes an energy drain and a source of waste heat. The principle is also known as no free lunch.

We do our best to implement Maxwell's Demon whenever we manipulate reality with our technologies, but we can never do so perfectly; we certainly can't get ahead of the game, which is known as entropy. All the air conditioners in a city emit heat that makes the city hotter overall. While you can implement what seems to be a Maxwell's Demon if you don't look too far or too closely, in the big picture you always lose more than you gain.

Every bit in a computer is a wannabe Maxwell's Demon, separating the state of one from the state of zero for a while, at a cost. A computer on a network can also act like a wannabe demon if it tries to sort data from networked people into one or the other side of some imaginary door, while pretending there is no cost or risk involved.
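The cost of the demon's act of discrimination has a precise physical floor: Landauer's principle puts the minimum energy dissipated when a bit is erased at kT ln 2, where k is the Boltzmann constant and T the temperature. A minimal sketch of the arithmetic (the function name is illustrative, not from the text):

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy in joules dissipated to erase one bit at the given temperature."""
    return BOLTZMANN * temp_kelvin * math.log(2)

# At room temperature (~300 K), erasing a single bit costs at least ~2.9e-21 J.
print(f"{landauer_limit(300.0):.3e} J per bit")
```

Real hardware dissipates many orders of magnitude more than this limit, which is Lanier's point in miniature: even the ideal demon pays, and actual ones pay far more.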
The approach to digital culture I abhor would indeed turn all the world's books into one book, just as Kevin (Kelly) suggested. It might start to happen in the next decade or so. Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what's important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don't know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video. A continuation of the present trend will make us like various medieval religious empires, or like North Korea, a society with a single book.

The Bible can serve as a prototypical example. Like Wikipedia, the Bible's authorship was shared, largely anonymous, and cumulative, and the obscurity of the individual authors served to create an oracle-like ambience for the document as the literal word of God. If we take a non-metaphysical view of the Bible, it serves as a link to our ancestors, a window. The ethereal, digital replacement technology for the printing press happens to have come of age in a time when the unfortunate ideology I'm criticizing dominates technological culture. Authorship - the very idea of the individual point of view - is not a priority of the new ideology. The digital flattening of expression into a global mush is not presently enforced from the top down, as it is in the case of a North Korean printing press. Instead, the design of software builds the ideology into those actions that are the easiest to perform on the software designs that are becoming ubiquitous.
It is true that by using these tools, individuals can author books or blogs or whatever, but people are encouraged by the economics of free content, crowd dynamics, and lord aggregators to serve up fragments instead of considered whole expressions or arguments. The efforts of authors are appreciated in a manner that erases the boundaries between them.

The one collective book will absolutely not be the same thing as the library of books by individuals it is bankrupting. Some believe it will be better; others, including me, believe it will be disastrously worse. As the famous line goes from Inherit the Wind: "The Bible is a book... but it is not the only book." Any singular, exclusive book, even the collective one accumulating in the cloud, will become a cruel book if it is the only one available.
Zombies are familiar characters in philosophical thought experiments. They are like people in every way except they have no internal experience.... If there are enough zombies recruited into our world, I worry about the potential for a self-fulfilling prophecy. Maybe if people pretend they are not conscious or do not have free will - or that the cloud of online people is a person; if they pretend there is nothing special about the perspective of the individual - then perhaps we have the power to make it so. We might be able to collectively achieve antimagic. Humans are free. We can commit suicide for the benefit of a Singularity. We can engineer our genes to better support an imaginary hive mind. We can make culture and journalism into second-rate activities and spend centuries remixing the detritus of the 1960s and other eras from before individual creativity went out of fashion. Or we can believe in ourselves. By chance, it might turn out we are real.
What these critics forget is that printing presses in themselves provide no guarantee of an enlightened outcome. People, not machines, made the Renaissance. The printing that takes place in North Korea today, for instance, is nothing more than propaganda for a personality cult. What is important about printing presses is not the mechanism, but the authors.
Communication is now often experienced as a superhuman phenomenon that towers above individuals. A new generation has come of age with a reduced expectation of what a person can be, and of who each person might become.
Emphasizing the crowd means de-emphasizing individual humans in the design of society, and when you ask people not to be people, they revert to bad, mob-like behaviors.
The intentions of the cybernetic totalist tribe are good. They are simply following a path that was blazed in earlier times by well-meaning Freudians and Marxists - and I don't mean that in a pejorative way. I'm thinking of the earliest incarnations of Marxism, for instance, before Stalinism and Maoism killed millions.

Movements associated with Freud and Marx both claimed foundations in rationality and the scientific understanding of the world. Both perceived themselves to be at war with the weird, manipulative fantasies of religions. And yet both invented their own fantasies that were just as weird.

The same thing is happening again. A self-proclaimed materialist movement that attempts to base itself on science starts to look like a religion rather quickly. It soon presents its own eschatology and its own revelations about what is really going on - portentous events that no one but the initiated can appreciate. The Singularity and the noosphere, the idea that a collective consciousness emerges from all the users on the web, echo Marxist social determinism and Freud's calculus of perversions. We rush ahead of skeptical, scientific inquiry at our peril, just like the Marxists and Freudians.
This digital revolutionary still believes in most of the lovely deep ideals that energized our work so many years ago. At the core was a sweet faith in human nature. If we empowered individuals, we believed, more good than harm would result.

The way the internet has gone sour since then is truly perverse. The central faith of the web's early design has been superseded by a different faith in the centrality of imaginary entities epitomized by the idea that the internet as a whole is coming alive and turning into a superhuman creature. The designs guided by this new, perverse kind of faith put people back in the shadows. The fad for anonymity has undone the great opening-of-everyone's-windows of the 1990s. While that reversal has empowered sadists to a degree, the worst effect is a degradation of ordinary people.
Something like missionary reductionism has happened to the internet with the rise of web 2.0. The strangeness is being leached away by the mush-making process. Individual web pages as they first appeared in the early 1990s had the flavor of personhood. MySpace preserved some of that flavor, though a process of regularized formatting had begun. Facebook went further, organizing people into multiple-choice identities, while Wikipedia seeks to erase point of view entirely.

If a church or government were doing these things, it would feel authoritarian, but when technologists are the culprits, we seem hip, fresh, and inventive. People will accept ideas presented in technological form that would be abhorrent in any other form. It is utterly strange to hear my many old friends in the world of digital culture claim to be the true sons of the Renaissance without realizing that using computers to reduce individual expression is a primitive, retrograde activity, no matter how sophisticated your tools are.
But the Turing test cuts both ways. You can't tell if a machine has gotten smarter or if you've just lowered your own standards of intelligence to such a degree that the machine seems smart. If you can have a conversation with a simulated person presented by an AI program, can you tell how far you've let your sense of personhood degrade in order to make the illusion work for you?

People degrade themselves in order to make machines seem smart all the time. Before the crash, bankers believed in supposedly intelligent algorithms that could calculate credit risks before making bad loans. We ask teachers to teach to standardized tests so a student will look good to an algorithm. We have repeatedly demonstrated our species' bottomless ability to lower our standards to make information technology look good. Every instance of intelligence in a machine is ambiguous. The same ambiguity that motivated dubious academic AI projects in the past has been repackaged as mass culture today. Did that search engine really know what you want, or are you playing along, lowering your standards to make it seem clever? While it's to be expected that the human perspective will be changed by encounters with profound new technologies, the exercise of treating machine intelligence as real requires people to reduce their mooring to reality.
Turing presented his new offering in the form of a thought experiment, based on a popular Victorian parlor game. A man and a woman hide, and a judge is asked to determine which is which by relying only on the texts of notes passed back and forth.

Turing replaced the woman with a computer. Can the judge tell which is the man? If not, is the computer conscious? Intelligent? Does it deserve equal rights?

It's impossible for us to know what role the torture Turing was enduring at the time played in his formulation of the test. But it is undeniable that one of the key figures in the defeat of fascism was destroyed, by our side, after the war, because he was gay. No wonder his imagination pondered the rights of strange creatures.
The attribution of intelligence to machines, crowds of fragments, or other nerd deities obscures more than it illuminates. When people are told that a computer is intelligent, they become prone to changing themselves in order to make the computer appear to work better, instead of demanding that the computer be changed to become more useful.
'Information wants to be free.' So goes the saying. Stewart Brand, the founder of the Whole Earth Catalog, seems to have said it first.

I say that information doesn't deserve to be free. Cybernetic totalists love to think of the stuff as if it were alive and had its own ideas and ambitions. But what if information is inanimate? What if it's even less than inanimate, a mere artifact of human thought? What if only humans are real, and information is not?... Information is alienated experience.
A file on a hard disk does indeed contain information of the kind that objectively exists. The fact that the bits are discernible instead of being scrambled into mush - the way heat scrambles things - is what makes them bits.

But if the bits can potentially mean something to someone, they can only do so if they are experienced. When that happens, a commonality of culture is enacted between the storer and the retriever of the bits. Experience is the only process that can de-alienate information.

Information of the kind that purportedly wants to be free is nothing but a shadow of our own minds, and wants nothing on its own. It will not suffer if it doesn't get what it wants.

But if you want to make the transition from the old religion, where you hope God will give you an afterlife, to the new religion, where you hope to become immortal by getting uploaded into a computer, then you have to believe information is real and alive. So for you, it will be important to redesign human institutions like art, the economy, and the law to reinforce the perception that information is alive. You demand that the rest of us live in your new conception of a state religion. You need us to deify information to reinforce your faith.
A fashionable idea in technical circles is that quantity not only turns into quality at some extreme of scale, but also does so according to principles we already understand. Some of my colleagues think a million, or perhaps a billion, fragmentary insults will eventually yield wisdom that surpasses that of any well-thought-out essay, so long as sophisticated secret statistical algorithms recombine the fragments. I disagree. A trope from the early days of computer science comes to mind: garbage in, garbage out.
When I work with experimental gadgets, like new variations on virtual reality, in a lab environment, I am always reminded of how small changes in the details of a digital design can have profound unforeseen effects on the experiences of the humans who are playing with it. The slightest change in something as seemingly trivial as the use of a button can sometimes completely alter behavior patterns.For instance, Stanford University researcher Jeremy Bailenson has demonstrated that changing the height of one's avatar in immersive virtual reality transforms self-esteem and social self-perception. Technologies are extensions of ourselves, and, like the avatars in Jeremy's lab, our identities can be shifted by the quirks of gadgets. It is impossible to work with information technology without also engaging in social engineering.
An endless series of gambits backed by gigantic investments encouraged young people entering the online world for the first time to create standardized presences on sites like Facebook. Commercial interests promoted the widespread adoption of standardized designs like the blog, and these designs encouraged pseudonymity in at least some aspects, such as comments, instead of the proud extroversion that characterized the first wave of web culture.

Instead of people being treated as the sources of their own creativity, commercial aggregation and abstraction sites presented anonymized fragments of creativity as products that might have fallen from the sky or been dug up from the ground, obscuring the true sources.
When we ask people to live their lives through our models, we are potentially reducing life itself. How can we ever know what we might be losing?
An imaginary circle of empathy is drawn by each person. It circumscribes the person at some distance, and corresponds to those things in the world that deserve empathy. I like the term empathy because it has spiritual overtones. A term like sympathy or allegiance might be more precise, but I want the chosen term to be slightly mystical, to suggest that we might not be able to fully understand what goes on between us and others, that we should leave open the possibility that the relationship can't be represented in a digital database.

If someone falls within your circle of empathy, you wouldn't want to see him or her killed. Something that is clearly outside the circle is fair game. For instance, most people would place all other people within the circle, but most of us are willing to see bacteria killed when we brush our teeth, and certainly don't worry when we see an inanimate rock tossed aside to keep a trail clear.

The tricky part is that some entities reside close to the edge of the circle. The deepest controversies often involve whether something or someone should lie just inside or just outside the circle. For instance, the idea of slavery depends on the placement of the slave outside the circle, to make some people nonhuman. Widening the circle to include all people and end slavery has been one of the epic strands of the human story - and it isn't quite over yet.

A great many other controversies fit well in the model. The fight over abortion asks whether a fetus or embryo should be in the circle or not, and the animal rights debate asks the same about animals.

When you change the contents of your circle, you change your conception of yourself. The center of the circle shifts as its perimeter is changed. The liberal impulse is to expand the circle, while conservatives tend to want to restrain or even contract the circle.

Empathy Inflation and Metaphysical Ambiguity

Are there any legitimate reasons not to expand the circle as much as possible? There are.
To expand the circle indefinitely can lead to oppression, because the rights of potential entities (as perceived by only some people) can conflict with the rights of indisputably real people. An obvious example of this is found in the abortion debate. If outlawing abortions did not involve commandeering control of the bodies of other people (pregnant women, in this case), then there wouldn't be much controversy. We would find an easy accommodation.

Empathy inflation can also lead to the lesser, but still substantial, evils of incompetence, trivialization, dishonesty, and narcissism. You cannot live, for example, without killing bacteria. Wouldn't you be projecting your own fantasies on single-cell organisms that would be indifferent to them at best? Doesn't it really become about you instead of the cause at that point?
The only hope for social networking sites from a business point of view is for a magic formula to appear in which some method of violating privacy and dignity becomes acceptable.
Linux is a superbly polished copy of an antique - shinier than the original, perhaps, but still defined by it.
One good test of whether an economy is humanistic or not is the plausibility of earning the ability to drop out of it for a while without incident or insult.
We're losing track of the vastness of the potential for computer science. We really have to revive the beautiful intellectual joy of it, as opposed to the business potential.
Anonymous blog comments, vapid video pranks and lightweight mash-ups may seem trivial and harmless, but as a whole, this widespread practice of fragmentary, impersonal communication has demeaned personal interaction.
Musicians and journalists are the canaries in the coal mine, but eventually, as computers get more and more powerful, this process will kill off all middle-class professions.
I've always felt that the human-centered approach to computer science leads to more interesting, more exotic, more wild, and more heroic adventures than the machine-supremacy approach, where information is the highest goal.
If there's any object in human experience that's a precedent for what a computer should be like, it's a musical instrument: a device where you can explore a huge range of possibilities through an interface that connects your mind and your body, allowing you to be emotionally authentic and expressive.