

Hit Forward: The Internet's Next 40 Years

"We'll have holograms. And we'll have microchip brain implants. Right now, I have to look something up on my iPhone. In the future, I'll just think it." On the Internet's 40th birthday, UCLA's great thinkers predict the next 40 years.


By Jack Feuer

Published Oct 27, 2009 3:16 PM


On Oct. 29, 1969, the future began. That was the day that a UCLA team led by distinguished professor of computer science Leonard Kleinrock sent the very first message over the ARPANET, the computer network we now know as the Internet.

It was the day everything changed.

To mark the 40th anniversary of this world-altering moment, the UCLA Henry Samueli School of Engineering and Applied Science will hold a daylong celebration and forum starring some of cyberspace's most influential leaders, activists and analysts on Oct. 29. They'll dissect the birth of the Digital Age and talk about the challenges that lie ahead.

But what if we went even further and peered ahead 40 years? Four decades from now, will human life on this planet be as thoroughly transformed by new technology as it has been since the birth of the Internet in 1969?

40 years from now: Viewpoints from across campus

Participants in alphabetical order

Jeffrey Aaron Burke
∙ Executive Director
∙ UCLA Center for Research in Engineering, Media and Performance

Leonard Kleinrock
∙ Distinguished Professor of Computer Science
∙ UCLA Henry Samueli School of Engineering and Applied Science
∙ Inventor, Internet technology packet switching

Courtney Lyder
∙ Dean, UCLA School of Nursing
∙ Senior Consultant, U.S. Department of Health and Human Services

Dario Nardi
∙ Instructor, Artificial Intelligence Honors Collegium course, UCLA Anthropology Department

Todd Presner
∙ UCLA Associate Professor of Germanic Languages, Comparative Literature and Jewish Studies
∙ Founder and Director, HyperCities

Casey Reas
∙ Professor, Department of Design | Media Arts, UCLA School of the Arts and Architecture

Gary Small
∙ Director, UCLA Memory & Aging Research Center
∙ Professor of Clinical Psychiatry, Semel Institute for Neuroscience & Human Behavior
∙ Author, "iBrain: Surviving the Technological Alteration of the Modern Mind."

Christopher Tilly
∙ Professor of Urban Planning
∙ Director of the UCLA Institute for Research on Labor and Employment

Arthur Toga
∙ Director, UCLA Laboratory of Neuro Imaging
∙ Professor, David Geffen School of Medicine at UCLA

We asked an august group of Bruin thinkers from across the campus to tell us where they think we stand in the new digital age and, from that benchmark, to hit forward and explore what personal and professional life will be like in 2049. They represent many fields of inquiry and excellence, including neuroscience, the humanities, sensor technology, artificial intelligence, nursing and media arts — and then we went to the source and asked Professor Kleinrock where he thinks it's all going.

Here, then, is a sampling of what our experts see when they peer into tomorrow.


Nardi: Artificial intelligence (AI) is all around us, but we still don't "see" it. There's no R2-D2 or C-3PO, and I didn't think there would be. Intelligent routing, packet switching, etc. — nuts-and-bolts, behind-the-scenes stuff we might not think of as AI, but that is, in fact, intelligence placed into machines.

Toga: We can combine and share brain images collected all over the world. The sheer numbers now make it possible to make maps that illustrate trends and differences between groups that would have been impossible without a globally distributed effort.

Kleinrock: In a press release that came out of UCLA months before the birth of the Internet, I am quoted as predicting the following: "As of now, computer networks are still in their infancy, but as they grow up and become more sophisticated, we will probably see the spread of 'computer utilities' which, like present electric and telephone utilities, will service individual homes and offices across the country." I am pleasantly surprised at how that comment anticipated the emergence of today's web-based IP services … the ability to plug in anywhere to an always on and "invisible" network … [with] ubiquitous access.

Lyder: The Internet has significantly influenced what patients and their families know about health and their healthcare. It has forced nurses to use the Internet in ways we didn't realize we were going to need to … instead of getting our nursing and health information from journals, we now get it from the Internet.

Reas: The freedom of open source software and hardware has enabled new hybrid communities to emerge. The increased access to technical information and specialized communities has enabled a new generation of hybrid designer/programmers to flourish.

Tilly: The Internet has made it a lot easier to relocate and disperse labor by location, making telecommuting and other long-distance working arrangements possible. But it goes beyond that: companies can now outsource operations all over the globe. They can also run their businesses on a practically 24/7 basis, for example, with call centers in other countries to answer questions. The Internet as a whole has not made life worse or better for workers, but work is following a new geographic pattern. India, for example, has seen tremendous growth in opportunities to provide English-language clerical and professional work, but these jobs have shifted from other countries, like the United States. Huge numbers of jobs are going to continue to be outsourced from this country.

The Internet has also made it possible for workers to organize and spread support through a global network. For example, in Pakistan, workers at the Lipton Tea company were able to garner support for their positions by launching an email campaign that went around the world, and ended up being quite effective. With these tools, workers are winning some victories.


Reas: I didn't think I'd get sucked up into the social networking vortex and I think I'm absolutely not alone. When did all of the middle-aged folks and grandparents show up on Facebook, and why?

Lyder: Because of the wealth of information available, I thought there might be some oversight to ensure that information would be accurate and current. We do spend quite a bit of time convincing patients that the information they have is not always the most accurate or appropriate.

The First Internet Connection

Hear about the Internet's day of birth from one of the men who was there in this video of Leonard Kleinrock.

See what else UCLA is doing for the Internet's big four-oh.

Learn about the birth of the Internet in this article from UCLA Today.

Get the program for the Internet's birthday conference taking place Oct. 29.

Live stream the conference on Oct. 29.

Burke: A lot of the computing and media capability that I was expecting to end up embedded in the environment has ended up in mobile phones. What will be interesting to watch now is what types of environmental computing emerge to support and complement the mobile experience.

Small: I was struck by how little has been done to measure how exposure to these technologies affects the brain. That led to our study showing that technology does have a huge impact on brain activity. We found that when people searched online for the first time, it was very similar to reading a book page — very little activity. But if they had prior experience [searching the Web], there was a twofold increase in brain activity, particularly in the front part of the brain, the thinking part. Makes sense. What I didn't see coming was the way the mass media would interpret our results. The headline was, "Google is making us smart." All we found was that it's kind of a built-in brain exercise. When you search online, you can slow it down or ratchet it up. That's the brilliance of technology.

Kleinrock: My original view was that the network would be for the purpose of computer-to-computer interaction as well as people-to-computer interaction, but as it has turned out, its dominant use is for people-to-people interaction. This was first manifested in the rapid and extensive growth of e-mail traffic as soon as it was introduced in 1972, and has since been followed by a long sequence of surprising and explosive applications such as YouTube, Facebook and Twitter.


Toga: It's a much smaller place. More places, things and people will be made similar. Communication will encourage commonality.

Presner: At the very least, it will suffuse every realm of human experience — this will probably happen in 2019. There will be a web of information merging with physical reality. You click on trees and get information. Second Life becomes real life. You're not quite able to fly but information lives in things. You don't just go to a computer and log on.

Reas: There will probably be no distinction between "digital life" and "life." Digital computers will be over 100 years old at that point and will most likely be fully integrated into the environment. I think we all know the next frontier is "bio."

Lyder: While the world will continue to get smaller, there will be less in-person social interaction. We will see a lot more isolation. Because digital technologies will do so much for us, we will be leading more unhealthy lives.

Small: We're going to continue this pattern, for better or worse, where there's not much demarcation between our work and play life. Also, we'll have virtual holograms — when you want to communicate with someone, the computer will simulate the image of that individual, and it will be so expressive that your brain will not be able to detect the difference. And we'll have brain implants, little microchips that will allow us to use external hard drives at will. Right now, I have to look something up on my iPhone. In the future, I'll just think it. That's all within the realm of possibility.

Nardi: Whatever comes, we're going to see three things: more smartness in machines, more robots and more cybernetics. How much is hard to say. In 2049, we'll have electronic duplicates of human brains … then maybe add an android body and hook up various sensors and actuators to move and so on. The result is you, a lot of you, perhaps better in some ways and diminished in others. We're going to run up against some interesting cultural barriers. People don't mind technology if they can pull the plug, so to speak. People are open to robots generally. But a lot of people fear technology that crosses biological thresholds. I love my iPhone, but do I want it linked to my auditory nerve in my brain? We're not faced with any of this quite yet, but we will be. And don't be surprised if society decides to turn away from certain paths.

Kleinrock: Video addiction is pervasive, mass customization is prevalent, extreme mobility coupled with location-based services is the norm. Cyberspace has finally escaped from being trapped in the computer screens that we all peer at and has moved out into our physical environment. I see small, pervasive devices ubiquitously embedded in the physical world, providing the capabilities of actuators, sensors, logic, memory, processing, communicators, cameras, microphones, speakers, displays, RFID [radio-frequency identification] tags, etc. Our environment is alive with technology — in the walls, in our desks, clothes, eyeglasses, wristwatches, lamps, refrigerators, vehicles, hotel rooms, fingernails and in our bodies.

Tilly: The Internet has also made it possible for clumps of industries, or "agglomerations," to disperse over distances, but I think there are serious limits to how much this will happen. No matter what the industry, there are benefits to proximity, whether it's sharing ideas or being able to touch and feel things in person. Many sectors are likely to follow the model of a Hollywood, a Wall Street, or an Italian shoe manufacturing center, where similar companies are located within close range of one another.

What do you think life will be like in 2049? Post your predictions in the Facebook Comments section below.


