Idle Words > Talks > Web Design: The First 100 Years

This is the expanded version of a talk I gave on September 9, 2014, at the HOW Interactive Design conference in Washington, DC.

Designers! I am a San Francisco computer programmer, but I come in peace!

I would like to start with a parable about airplanes.

In 1981, my mother and I flew from Warsaw to New York City in this airplane, an Ilyushin-62.

The Il-62 exemplifies a Soviet design approach I like to think of as "add engines until airborne".

Soviet engineers lacked the computers to calculate all the bending and wiggling the wings would do if you hung the engines under them, so they just strapped engines on the back.

This plane actually used a little kickstand when it was parked empty, to keep it from tipping over and pointing up like a rocket.

In those days, Warsaw had maybe ten flights a day. It was a big deal to spot a plane in the sky. The airport itself was a tiny concrete building not much bigger than this conference room, so my mom and I were completely overwhelmed on arrival at JFK airport, with its maze of gates. To top it off, neither of us spoke English.

We ran around the airport, out of breath, until we finally found our way to a waiting room where other passengers were sitting and waiting for the Pan Am flight to Houston. We sat down to wait, too. And then, without warning, the waiting room took off.

That was my introduction to the Boeing 747.

The 747 is a masterpiece of industrial design. Everything we think of as normal about air travel, for better or worse, was invented for this airplane and its immediate predecessor, the 707. That includes seats on rails, overhead bins, drink trolleys, sliding window shades, the little fan above your seat—you name it.

There are so many wonderful stories about the 747. It was two and a half times bigger than the largest passenger jet ever built. They had to make a special factory to assemble it, and they were still building the factory as the first planes came off the line. They were also still tinkering with the design. Engineers would run out onto the shop floor waving amended drawings, and annoyed foremen on the production line would cuss them out.

The 747 required over 75,000 technical drawings. All of them were done by hand. There was no computer aided design to help engineers figure out how to put everything together, just a massive filing system.

Boeing had to build a full-scale plywood model of the plane from these drawings to make sure everything fit together, and that multiple systems weren't trying to occupy the same space.

My favorite fact about the 747 is that it was built by the company's B team, Boeing's version of the Bad News Bears. All the top engineers, and the ambitious up-and-comers, had gotten themselves assigned to Boeing's prestige project, a plane called the 2707, or supersonic transport.

The SST was going to fly at almost three times the speed of sound, about 2900 kilometers an hour. It had swing wings. And it was the first-ever widebody design. Everyone believed that the SST was the future of jet travel.

The 747 was meant to be a stopgap. It was supposed to serve the airlines until the SST entered service in the 1970's, at which point it would be demoted to a freighter.

This is a Pan Am advertisement from the period, showcasing the 'planes of tomorrow' that were 'just over the horizon'.

In fact, that famous hump on the 747 is there specifically to make it easier to load freight. This was not a plane with a glamorous future.

It wasn't just Boeing working to build an SST. The Europeans were developing the Concorde, and the Soviet Union was hard at work on their own version.

This was a giddy time for aviation. Pan Am even started taking reservations for commercial flights to the Moon! They had a waiting list with over 90,000 names.

The point of my parable is this: imagine if you could travel back in time and offer to show one of those Boeing engineers what air travel would look like in 2014, fifty years on.

What might he have expected to see?

Keep in mind that in 1965, the Gemini project had just started. Astronauts were in earth orbit, testing the technology and procedures needed for getting to the Moon. The Space Race was in full swing.

The Soviets and Europeans were both developing a giant supersonic airliner.

As this period chart shows, transportation speed was increasing at an exponential rate, and we were just about to head up that steep slope towards interstellar travel. Though the underlying technologies kept changing, the overall trend was as clear as it was unstoppable.

One thing the engineer might have expected to see in 2014 was a radioactive wasteland. The Cold War was a grim reality, and many people expected it to end in disaster.

But if we had not killed ourselves, he would have expected moon bases, maybe a Mars base. He wouldn't be surprised to see flying cars everywhere, or atomic airplanes with unlimited range.

Without question there would be routine supersonic travel at unimaginable speed and comfort between any two cities in the world.

Consider what that engineer had seen happen in his own lifetime. The first attempts at powered flight took place right around the time he was born.

In the twenties, when he was a boy, airliners like the Junkers G-24 could fly 14 passengers at 170 kph.

The 1930's brought the all-metal DC-3. It could fly 30 people at 333 kph.

That was also the era of the famous Boeing Clippers. They had luxurious sleeper berths and took six days to get from San Francisco to Hong Kong.

In the 1940's, Boeing introduced the Stratocruiser, a pressurized plane that could fly at 480 kph.

Finally, in the 50's, Boeing ushered in the Jet Age with the Boeing 707, which could cross the Atlantic ocean at nearly 1,000 kph.

I submit to you that the last thing that Boeing engineer would expect to see in 2014 is what actually happened. Here is today's most advanced passenger aircraft, the Boeing 787.

Unless you are an airplane nerd, you would be hard pressed to distinguish the 787 from its grandfather.

And in fact, this revolutionary new plane flies slower than the 707.

The basic configuration of an airliner has not changed in sixty years. You have a long tube, swept wings with multiple engines mounted underneath, and a top speed of around 900 kph.

So what happened to the future?

It's not that the technology failed. We built, tested and flew giant planes that could cruise at over three times the speed of sound.

This is the Valkyrie, a massive strategic bomber painted white so that it won't catch fire from the flash of its own nuclear bombs. This plane was test flown in 1965 and nearly made it into production.

The SR-71 Blackbird, another Mach 3 plane, did make it into production, and flew for decades. It still holds all the speed records.

We even got supersonic airliners! The Concorde entered commercial service and safely ferried douchebags across the Atlantic for 25 years. If you're my age, you may remember seeing one taxi past you at the airport.

The Russians got in on it too, with a plane derisively called the Concordeski. This proved too loud and unreliable for passenger service, so it ended up being a transport jet. It carried fruits and vegetables from Central Asia at twice the speed of sound.

My favorite line from the Wikipedia article is that the plane was so loud, "you couldn't hear the passenger two seats away from you screaming". He had to pass you a note saying "aaaaaaugh".

The first time they took journalists up on this thing, there were so many alarms going off that the pilot had to borrow a pillow from the passengers to stuff into the alarm klaxon. But it flew!

And it's not like the space program was a failure. We landed men on the Moon not once, but six times.

Because we're Americans, we didn't just put men on the moon—we put cars on the Moon, three of them.

We even had a dude play golf up there.

Our poor engineer had every right to assume that the breakthroughs he'd seen over his entire working life were on track to continue. He lived at a time of accelerating technological change, where in ten years we had gone from propeller planes to lunar exploration.

The next generation of technology was not just a dream; it was already in the prototype stage.

But it all just kind of stopped.

We have a space station in 2014, but it's too embarrassing to talk about. Sometimes we send Canadians up there.

Never mind the Moon—we can't even launch astronauts into orbit anymore. If we want to go to our sad-sack space station, we have to ask the Russians, and they're mean to us.

Can you imagine the look in that engineer's eyes?

The technology was pointing in one direction, the future was clear and inevitable. And then it never happened. Why?

First, we ran into diminishing returns. As these planes got faster, they got more expensive to design and operate. Pushing all that air out of the way required exotic materials and vast amounts of fuel.

The space program was even worse. Those rockets used a lot of public money that could be better spent on bombing Vietnam.

Second, there were unexpected drawbacks. Economists have that great word, "externalities", for anything that doesn't fit their model of the world. One externality of supersonic planes was the sonic boom. The Air Force spent six months flying supersonic over Oklahoma City to convince itself that the constant noise bothered people.

Another externality was that exhaust from SSTs damaged the ozone layer.

Boeing was genuinely surprised that people cared about this stuff. What does it matter if the sun is coming through your shattered window and burning your skin, if you can have a supersonic airliner? But it wasn't worth it!

Because the technologies we had were good enough. It turned out that very few people needed to cross an ocean in three hours instead of six hours. On my way to this conference, I flew from Switzerland to San Francisco. It took eleven hours and cost me around a thousand dollars. It was a long flight and kind of uncomfortable and boring. But I crossed the planet in half a day!

Being able to get anywhere in the world in a day is really good enough. We complain about air travel but consider that for a couple of thousand dollars, you can go anywhere, overnight.

The people designing the planes of tomorrow got so caught up in the technology that they forgot to ask the very important question, “what are we building this for?”

Today I hope to persuade you that the same thing that happened to aviation is happening with the Internet. Here we are, fifty years into the computer revolution, at what feels like our moment of greatest progress. The outlines of the future are clear, and oh boy is it futuristic.

But we're running into physical and economic barriers that aren't worth crossing.

We're starting to see that putting everything online has real and troubling social costs.

And the devices we use are becoming 'good enough', to the point where we can focus on making them cheaper, more efficient, and accessible to everyone.

So despite appearances, despite the feeling that things are accelerating and changing faster than ever, I want to make the shocking prediction that the Internet of 2060 is going to look recognizably the same as the Internet today.

Unless we screw it up.

And I want to convince you that this is the best possible news for you as designers, and for us as people.

The defining feature of our industry since the invention of the transistor has been exponential growth.

Exponential growth is one of those buzzwords that has an exact technical meaning. It just means that something keeps doubling, over and over again. Pop science authors never get tired of telling us that we have poor intuitions for exponential growth.

For example, here is Britney Gallivan posing with a sheet of paper folded 11 times.

If she could fold that sheet 50 times, the paper stack would reach nearly to the Sun. And it would be half a proton in diameter.

(It's folding that last proton that's really hard.)

This example illustrates the two things you need to know about exponential growth: it lets you get to large numbers very quickly. And it always runs into physical barriers.
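The paper-folding arithmetic is easy to check. Here is a minimal sketch, assuming a standard sheet thickness of 0.1 mm (the exact figure doesn't matter much; doubling dominates everything):

```python
# Each fold doubles the thickness of the stack.
# Assumes 0.1 mm paper; any plausible starting thickness
# gives the same qualitative result.

def thickness_after_folds(folds, sheet_thickness_m=0.0001):
    """Stack height in meters after the given number of folds."""
    return sheet_thickness_m * 2 ** folds

# 11 folds (Britney Gallivan's record): about 20 cm.
print(thickness_after_folds(11))   # 0.2048

# 50 folds: roughly 1.1e11 meters, the better part of the
# Earth-Sun distance (about 1.5e11 meters).
print(thickness_after_folds(50))
```

Eleven doublings take you from a tenth of a millimeter to twenty centimeters; thirty-nine more take you most of the way to the Sun. That is the whole lesson of exponential growth in five lines.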

I'm sure you have heard of Moore's Law. In its original form, it says "the number of transistors we can mass-produce on a silicon wafer doubles" every year or two. Moore made this observation in 1965, and it's held up ever since.

There's a popular understanding of Moore's Law, too, which says that "computers always get faster and more capable".

For fifty years we've ridden that wave. If you were active in the 1990's or 2000's, you may remember the feeling. You would buy a new computer, and a few months later there would be a better model, twice as fast, for the same price.
In those days there was an arms race between Intel and AMD, the main consumer chip manufacturers. Intel would release a 1 GHz processor, and AMD would follow with a 1.1 GHz rival.

CPUs were defined by clock speed. The speeds kept going up. Until suddenly, around 2005, there was a hitch.

Intel had been working on a monster 7 GHz chip. The problem was how much heat this chip generated, 150 watts, or as much as an E-Z Bake oven.

150 watts is the kind of light bulb that you get in trouble for having in college, because it threatens to set your Bob Marley poster on fire.

Deterred by all this heat, Intel changed strategy. Instead of making the CPUs smaller, hotter, and faster, they would just start putting more of them on each wafer.

Suddenly we had 'cores'. Your software didn't just automatically get faster with each generation anymore. Now it had to be written in a way that could use these 'cores', which programmers are still grappling with.

So while Moore's Law still technically holds—the number of transistors on a chip keeps increasing—its spirit is broken. Computers don't necessarily get faster with time. In fact, they're getting slower!

This is because we're moving from desktops to laptops, and from laptops to smartphones. Some people are threatening to move us to wristwatches.

In terms of capability, these devices are a step into the past. Compared to their desktop brethren, they have limited memory, weak processors, and barely adequate storage.

And nobody cares, because the advantages of having a portable, lightweight connected device are so great. And for the purposes of taking pictures, making calls, and surfing the internet, they've crossed the threshold of 'good enough'.

What people want from computers now is better displays, better battery life and above all, a better Internet connection.

Something similar happened with storage, where the growth rate was even faster than Moore's Law. I remember the state-of-the-art 1MB hard drive in our computer room in high school. It cost a thousand dollars.

Here's a photo of a multi-megabyte hard drive from the seventies. I like to think that the guy in the picture didn't have to put on the bunny suit, it was just what he liked to wear.

Modern hard drives are a hundred times smaller, with a hundred times the capacity, and they cost a pittance. Seagate recently released an 8TB consumer hard drive.

But again, we've chosen to go backwards by moving to solid state storage, like you find in smartphones and newer laptops. Flash storage sacrifices capacity for speed, efficiency and durability.

Or else we put our data in 'the cloud', which has vast capacity but is orders of magnitude slower.

These are the victories of good enough. This stuff is fast enough.

Intel could probably build a 20 GHz processor, just like Boeing can make a Mach 3 airliner. But they won't. There's a corollary to Moore's Law: every time you double the number of transistors, your production costs go up. Every two years, Intel has to build a completely new factory and production line for this stuff. And the industry is turning away from super high performance, because most people don't need it.

The hardware is still improving, but it's improving along other dimensions, ones where we are already up against hard physical limits and can't use the trick of miniaturization that won us all that exponential growth.

Battery life, for example. The limits on energy density are much more severe than on processor speed. And it's really hard to make progress. So far our advances have come from making processors more efficient, not from any breakthrough in battery chemistry.

Another limit that doesn't grow exponentially is our ability to move information. There's no point in having an 8 TB hard drive if you're trying to fill it over an AT&T network. Data constraints hit us on multiple levels. There are limits on how fast cores can talk to memory, how fast the computer can talk to its peripherals, and above all how quickly computers can talk to the Internet. We can store incredible amounts of information, but we can't really move it around.

So the world of the near future is one of power constrained devices in a bandwidth-constrained environment. It's very different from the recent past, where hardware performance went up like clockwork, with more storage and faster CPUs every year.

And as designers, you should be jumping up and down with relief, because hard constraints are the midwife to good design. The past couple of decades have left us with what I call an exponential hangover.

Our industry is in complete denial that the exponential sleigh ride is over. Please, we'll do anything! Optical computing, quantum computers, whatever it takes. We'll switch from silicon to whatever you want. Just don't take our toys away.

But all this exponential growth has given us terrible habits. One of them is to discount the present.

When things are doubling, the only sane place to be is at the cutting edge. By definition, exponential growth means the thing that comes next will be equal in importance to everything that came before. So if you're not working on the next big thing, you're nothing.

This leads to a contempt for the past. Too much of what was created in the last fifty years is gone because no one took care to preserve it.

Since I run a bookmarking site for a living, I've done a little research on link rot myself. Bookmarks are different from regular URLs, because presumably anything you've bookmarked was once worth keeping. What I've learned is that about 5% of these links disappear every year, at a pretty steady rate. A customer of mine just posted that 90% of what he saved in 1997 is gone. This is unfortunately typical.
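A steady 5% annual loss compounds the way all exponential processes do. A sketch of that decay (the 5% rate is the figure from my bookmark data; the rest is illustrative arithmetic):

```python
# Exponential decay of working links, assuming a steady
# 5% annual loss rate.

def surviving_fraction(years, annual_loss=0.05):
    """Fraction of saved links still alive after the given number of years."""
    return (1 - annual_loss) ** years

# After a decade, roughly 60% of bookmarked links still resolve.
print(round(surviving_fraction(10), 2))   # 0.6

# Half of everything you saved is gone after about 13.5 years.
```

A modest-sounding rate still halves your collection every thirteen or fourteen years, which is why so little of the early web survives.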

We have heroic efforts like the Internet Archive to preserve stuff, but that's like burning down houses and then cheering on the fire department when it comes to save what's left inside. It's no way to run a culture. We take better care of scrap paper than we do of the early Internet, because at least we look at scrap paper before we throw it away.

This contempt for the past also ignores the reality of our industry, which is that we work almost exclusively with legacy technologies.

The operating system that runs the Internet is 45 years old.

The protocols for how devices talk to each other are 40 years old.

Even what we think of as the web is nearing its 25th birthday.

Some of what we use is downright ancient—flat panel displays were invented in 1964, the keyboard is 150 years old.

The processor that's the model for modern CPUs dates from 1976.

Even email, which everyone keeps trying to reinvent, is nearing retirement age.

I cheated by calling this talk 'Web Design: The First 100 Years' because we're already nearly halfway there. However dismissive we are of this stuff, however much we insist that it will get swept away by a new generation of better technology, it stubbornly refuses to go. Our industry has deep roots in the past that we should celebrate and acknowledge.

The flip side of our disregard for the past is a love of gratuitous change. Any office worker who uses Microsoft products knows this pain. At some point fairly early on, Microsoft Office became good enough. Windows became good enough.

But that hasn't stopped Microsoft from constantly releasing new versions, and forcing people to upgrade. I pick on Microsoft because so many of us have experience with their software, but this holds true for any software vendor.

Consider the war Microsoft is waging against XP users. After years of patching, XP became a stable, beloved, and useful operating system. A quarter of desktops still run it.

This is considered a national crisis.

Rather than offer users persuasive reasons to upgrade software, vendors insist we look on upgrading as our moral duty. The idea that something might work fine the way it is has no place in tech culture.

A further symptom of our exponential hangover is bloat. As soon as a system shows signs of performance, developers will add enough abstraction to make it borderline unusable. Software forever remains at the limits of what people will put up with. Developers and designers together create overweight systems in hopes that the hardware will catch up in time and cover their mistakes.

We complained for years that browsers couldn't do layout and javascript consistently. As soon as that got fixed, we got busy writing libraries that reimplemented the browser within itself, only slower.

It's 2014, and consider one hot blogging site, Medium. On a late-model computer it takes me ten seconds for a Medium page (which is literally a formatted text file) to load and render. This experience was faster in the sixties.

The web is full of these abuses, extravagant animations and so on, forever a step ahead of the hardware, waiting for it to catch up.

This exponential hangover leads to a feeling of exponential despair.

What's the point of pouring real effort into something that is going to disappear or transform in just a few months? The restless sense of excitement we feel that something new may be around the corner also brings with it a hopelessness about whatever we are working on now, and a dread that we are missing out on the next big thing.

The other part of our exponential hangover is how we build our businesses. The cult of growth denies the idea that you can build anything useful or helpful unless you're prepared to bring it to so-called "Internet scale". There's no point in opening a lemonade stand unless you're prepared to take on PepsiCo.

I always thought that things should go the other way. Once you remove the barriers of distance, there's room for all sorts of crazy niche products to find a little market online. People can eke out a living that would not be possible in the physical world. Venture capital has its place, as a useful way to fund long-shot projects, but not everything fits in that mold.

The cult of growth has led us to a sterile, centralized web. And having burned through all the easy ideas within our industry, we're convinced that it's our manifest destiny to start disrupting everyone else.

I think it's time to ask ourselves a very designy question: "What is the web actually for?"

I will argue that there are three competing visions of the web right now. The one we settle on will determine whether the idiosyncratic, fun Internet of today can survive.


This is the correct vision.

The Web erases the barrier of distance between people, and it puts all of human knowledge at our fingertips. It also allows us to look at still images and videos of millions of cats, basically all of it for free, from our homes or a small device we carry in our pocket.

No one person owns it, no one person controls it, you don't need permission to use it. And the best part is, you are encouraged to contribute right back. You can post your own cat pictures.

Why is this not enough?

The feline vision of the Internet is fundamentally a humble one, because it does not presume that developers and designers know what they are doing. There are no limits on what people (and cats) can get up to once you link them together. On a planet of seven billion people and millions of cats, the chance that you are going to be able to think of all the best ideas is zero. Someone is always going to come up with something you never expected. A web that connects people in a way where they can contribute gives its authors a chance to be surprised.

We've seen this play out time and again, in that the most productive and revolutionary aspects of web culture came out of left field. The idea of a free, universally editable encyclopedia sounded insane. The idea that a free operating system could run half the Internet was insane. That volunteers in blog comments could write collaborative math papers with some of the most brilliant mathematicians in the world sounded insane.

A currency based entirely on cryptographic hashing still sounds insane, but it sure is interesting.

Even the world wide web itself is the product of a physics nerd winging it, and convincing his colleagues to try out something new.

The Internet is full of projects big and small whose defining trait is that they came out of nowhere and captured people's imaginations. It's also full of awesome cat videos. The key part of this vision is that the Internet succeeds by remaining open and participatory. No one acts as gatekeeper, and it is not just a channel for mindless consumption.


This is the prevailing vision in Silicon Valley.

The world is just one big hot mess, an accident of history. Nothing is done as efficiently or cleverly as it could be if it were designed from scratch by California programmers. The world is a crufty legacy system crying out to be optimized.

If you have spent any time using software, you might recognize this as an appalling idea. Fixing the world with software is like giving yourself a haircut with a lawn mower. It works in theory, but there's no room for error in the implementation.

This vision holds that the Web is only a necessary first step to a brighter future. In order to fix the world with software, we have to put software hooks into people's lives. Everything must be instrumented, quantified, and networked. All devices, buildings, objects, and even our bodies must become "smart" and net-accessible.

Then we can get working on optimizing the hell out of life.

Marc Andreessen has this arresting quote, that ‘software is eating the world.’ He is happy about it. The idea is that industry after industry is going to fall at the hands of programmers who automate and rationalize it.

We started with music and publishing. Then retailing. Now we're apparently doing taxis. We're going to move a succession of industries into the cloud, and figure out how to do them better. Whether we have the right to do this, or whether it's a good idea, are academic questions that will be rendered moot by the unstoppable forces of Progress. It's a kind of software Manifest Destiny.

To achieve this vision, we must have software intermediaries in every human interaction, and in our physical environment.

But what if after software eats the world, it turns the world to shit?

Consider how fundamentally undemocratic this vision of the Web is. Because the Web started as a technical achievement, technical people are the ones who get to call the shots. We decide how to change the world, and the rest of you have to adapt.

There is something quite colonial, too, about collecting data from users and repackaging it to sell back to them. I think of it as the White Nerd's Burden.

Technological Utopianism has been tried before and led to some pretty bad results. There's no excuse for not studying the history of positivism, scientific Marxism and other attempts to rationalize the world, before making similar promises about what you will do with software.

Like everything in tech, there is prior art!

And then there's the third vision of the Internet:


This is the insane vision. I'm a little embarrassed to talk about it, because it's so stupid. But circumstances compel me.

In this vision, the Internet and web are just the first rung of a ladder that leads to neural implants, sentient computers, nanotechnology and eventually the Singularity, that mystical moment when progress happens so quickly that all of humanity's problems disappear and are replaced, presumably, with problems beyond our current understanding.

This is the vision of 'accelerating returns', reminiscent of that hockey stick graph I showed earlier, where we were supposed to have interstellar travel by 2010.

This Apocalyptic vision of the Internet and technical progress has captured the imaginations of some of the most influential people in our industry.

Grown adults, people who can tie their own shoes and are allowed to walk in traffic, seriously believe that we're walking a tightrope between existential risk and immortality.

Some of them are the most powerful figures in our industry, people who can call up Barack Obama about the dangers of nanotechnology, and Obama has to say “Michelle, I need to take this.”

“Barack, it is three o'clock in the morning."

“I know, but this guy is scared of sentient artificial intelligence and he's a huge contributor.”

And then Obama just has to sit there and listen to this shit.

So because powerful people in our industry read bad scifi as children, we now confront a stupid vision of the web as gateway to robot paradise.

Here's Ray Kurzweil, a man who honestly and sincerely believes he is never going to die. He works at Google. Presumably he stays at Google because he feels it advances his agenda.

Google works on some loopy stuff in between plastering the Internet with ads.

And here is Elon Musk, a co-founder of PayPal, builder of rockets and electric cars. Musk has his suitcase packed for the robot rebellion:

“The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most.”

“With artificial intelligence we are summoning the demon. In all those stories where there’s the guy with the pentagram and the holy water, it’s like – yeah, he’s sure he can control the demon. Doesn’t work out.”

“We need to be super careful with AI. Potentially more dangerous than nukes.”

“Hope we're not just the biological boot loader for digital superintelligence. Unfortunately, that is increasingly probable.”

Let me give you a little context here. This little fellow is Caenorhabditis elegans, a nematode worm that has 302 neurons. The absolute state of the art in simulating intelligence is this worm. We can simulate its brain on supercomputers and get it to wiggle and react, although not with full fidelity.

And here I'm talking just about our ability to simulate. We don't even know where to start when it comes to teaching this virtual C. elegans to bootstrap itself into being a smarter, better nematode worm.

In fact, forget about worms—we barely have computers powerful enough to emulate the hardware of a Super Nintendo.

If you talk to anyone who does serious work in artificial intelligence (and it's significant that the people most afraid of AI and nanotech have the least experience with it) they will tell you that progress is slow and linear, just like in other scientific fields.

But since unreasonably fearful people helm our industry and have the ear of government, we have to seriously engage their stupid vision.

I've taken the liberty of illustrating Musk's greatest fear.

At best, having the top tiers of our industry include figures who believe in fairy tales is a distraction. At worst, it promotes a kind of messianic thinking and apocalyptic Utopianism that can make people do dangerous things with all their money.

These three visions lead to radically different worlds.

If you think the Web is a way to CONNECT KNOWLEDGE, PEOPLE, AND CATS, then your job is to get the people and cats online, put a decent font on the knowledge, and then stand back and watch the magic happen.

If you think your job is to FIX THE WORLD WITH SOFTWARE, then the web is just the very beginning. There's a lot of work left to do. Really you're going to need sensors in every house, and it will help if everyone looks through special goggles, and if every refrigerator can talk to the Internet and confess its contents.

You promise to hook all this stuff up for us, and in return, we give you the full details of our private lives. And we don't need to worry about people doing bad things with it, because your policy is for that not to happen.

And if you think that the purpose of the Internet is to BECOME AS GODS, IMMORTAL CREATURES OF PURE ENERGY LIVING IN A CRYSTALLINE PARADISE OF OUR OWN INVENTION, then your goal is total and complete revolution. Everything must go.

The future needs to get here as fast as possible, because your biological clock is ticking!

The first group wants to CONNECT THE WORLD.

The second group wants to EAT THE WORLD.

And the third group wants to END THE WORLD.

These visions are not compatible.

I realize this all sounds a little grandiose. You came here to hear about media queries, not aviation and eschatology.

But you all need to pick a side.

Right now there's a profound sense of unreality in the tech industry. All problems are to be solved with technology, especially the ones caused by previous technology. The new technologies will fix it.

We see businesses that don't produce anything and run at an astonishing loss valued in the billions of dollars.

We see a whole ecosystem of startups and businesses that seem to exist only to serve one another, or the needs of very busy and very rich tech workers in a tiny sliver of our world.

At the same time, we hear grandiose promises about how technology will fundamentally improve the lives of every person on Earth, even though that contradicts our own experience of the last thirty years.

There is something fishy about all this promised progress. The engine is revving faster and faster, we can see that the accelerator is pegged, but somehow the view out the window never changes.

When we point out that Silicon Valley doesn't seem to be engaging the real world, that wages have been flat for thirty years, that Utopia seems further away than it's been in a generation, we get impatient excuses.

Tech culture is like a deadbeat who lives on your basement sofa. You ask him:

“When are you going to do all those things you promised?”

“Oh, wait until everyone has a computer.”

“They do.”

“Okay, I mean wait until they're all online.”

“They are. Why isn't the world better?”

“Well, wait until they all have smartphones... and wearable devices,” and the excuses continue.

The real answer is, technology hasn't changed the world because we haven't cared enough to change it.

There's a William Gibson quote that Tim O'Reilly likes to repeat: "The future is here, it's just not evenly distributed yet."

O'Reilly takes this to mean that if we surround ourselves with the right people, it can give us a sneak peek at coming attractions.

I like to interpret this quote differently, as a call to action. Rather than waiting passively for technology to change the world, let's see how much we can do with what we already have.

Let's reclaim the web from technologists who tell us that the future they've imagined is inevitable, and that our role in it is as consumers.

The Web belongs to us all, and those of us in this room are going to spend the rest of our lives working there. So we need to make it our home.

We live in a world now where not millions but billions of people work in rice fields, textile factories, where children grow up in appalling poverty. Of those billions, how many are the greatest minds of our time? How many deserve better than they get? What if instead of dreaming about changing the world with tomorrow's technology, we used today's technology and let the world change us? Why do we need to obsess on artificial intelligence, when we're wasting so much natural intelligence?

When I talk about a hundred years of web design, I mean it as a challenge. There's no law that says that things are guaranteed to keep getting better.

The web we have right now is beautiful. It shatters the tyranny of distance. It opens the libraries of the world to you. It gives you a way to bear witness to people half a world away, in your own words. It is full of cats. We built it by accident, yet already we're taking it for granted. We should fight to keep it!

