

Digital Privacy


By Scott Fields

Published Jan 1, 2012 12:00 AM

Technology captures every aspect of our lives and stores it forever. Who uses that information? What do they use it for? Can we protect ourselves? The debate rages on campuses, in boardrooms, in government and research institutions. UCLA leads the way in exploring the boundaries of privacy in a digital world — and how policymakers should respond.


Illustration by Heads of State

If you're online, you are being tracked. Every time. Everywhere.

If you use a cell phone, the Internet, credit cards, GPS navigation or a host of other conveniences, the data that exist about you, me and billions of people around the planet are a digital open book — for technology companies, law enforcement, marketing departments, politicians, all your Friends and myriad other online interests. And on all sides of the issue, Bruins are deeply engaged in drawing privacy boundaries that will bring the most benefit to society with the lowest possible security risk.

"If the last decade in information management was about technical security, this next decade is about incorporating privacy into that security," says Tracy Mitrano, IT policy director at Cornell University and a leading expert in privacy issues. "UCLA is a pioneer in this area, not only for the UC system, but for American education and the country at large."

"We've never had the ability to share on such a huge scale," says Google vice president and chief Internet evangelist Vinton Cerf M.S. '70, Ph.D. '72. "But we haven't figured out what the social norms are … it's too early. In terms of what the regulatory postures should be — how to protect people from harm — we're literally just having to live through it."

Privacy is Relative

Using Gmail, Google search, Google Maps and YouTube may make our lives easier, but Google — which owns all four services and over the years has relentlessly pursued global market share in every area of digital life — has become the poster child for potential misuse of online data. With the Internet giant's 2008 acquisition of DoubleClick, the largest supplier of data on ad performance on the Web, Google has a crystal-clear picture of where we've been, what we've bought and what we've searched for — all in one place.

And lately, of course, there's Facebook, also constantly provoking privacy concerns, for very good reason. Facebook now captures information about users even after they've left the site.

"We may all be feeling a little invaded, but then Facebook says, ‘No, we're protecting you with our security measures, we're doing the right thing,' but then the public pushes back a bit," explains UCLA Information Studies Professor Leah Lievrouw, a member of the campus' Advisory Board on Privacy. "Public distrust has to happen again and again. These companies just keep coming back, trying to gather more invasive information because it's so economically valuable to them."

Facebook may feel like a public space, but when you're on the site you're on the private property of a private enterprise. And remember: Private property laws apply.

"Facebook's 800 million users around the planet should understand that they are Facebook's product, not its customers," explains Kent Wada, UCLA's director of strategic information technology and privacy policy, who sits on the UCLA Advisory Board on Privacy as well as on the statewide UC Privacy and Information Security Initiative charged by UC President Mark Yudof. He is very involved in the latter's 18-month study to recommend a framework for balancing privacy, information security and other values and obligations in the unique context of UC. "Facebook's customers are the companies that pay them to put ads on their site."

Academic Freedom at Stake?

"Part of what's making privacy issues so urgent is that you want to maintain a freedom of inquiry despite the massive amount of information flowing around campus," says UCLA Information Studies Professor and Presidential Chair Christine Borgman, a board member of the Electronic Privacy Information Center in Washington, D.C. "We value the ability to read what we want, to write grant proposals, to do research, to have conversations … [but] "We also have to know our files are secure … Good security protects privacy."

Information Technology Vice Provost Jim Davis notes that "as a public institution, we are subject to transparency and accountability. But how do we achieve that and, at the same time, provide an intellectual environment where people don't feel invaded or constricted?"

For example, "We've had requests for research about our behavior on Google," Davis says. "The UCLA Advisory Board on Privacy and Data Protection ruled that while we will make ourselves available to be studied, it can't happen without prior notification."

University Librarian Gary Strong has been involved with many privacy questions and, according to his Advisory Board on Privacy associates, has been formidable in standing up for privacy rights.

"We must ensure due process when it comes to law enforcement inquiries," Strong says. "It's not that information can't be requested, but the police can't be allowed to simply come in here and demand to know what you're reading, what you've checked out, what you did on one of our computer terminals. They have to produce a summons."

Defining Digital Privacy

Shaping the digital privacy debate, the Supreme Court has narrowed privacy rights by ruling that individuals have no Fourth Amendment protection in public — in other words, an individual in public has no protection against a variety of surveillance measures or against the government looking at a person's digital records held by third-party companies like ISPs, online merchants or credit card vendors.

Privacy expert and George Washington University Professor of Law Daniel Solove, who has provided insight to the statewide UC initiative, told the ACLU's Blog of Rights that "Courts and legislators often are too deferential to whatever measures security officials propose … when privacy is balanced against security, security often wins because the well-being of the many outweighs the interests of one person. This is a faulty way to see privacy."

The development of GPS and Wi-Fi-based technology built into our cell phones, concurrent with 9/11 and the subsequent passage of the USA Patriot Act, has prompted law enforcement to engage in what is called "pattern mining," an interdisciplinary field in which patterns are extracted from large amounts of information using techniques from statistics, database management and even artificial intelligence. Many interests use data mining, including scientists, researchers, marketers and the government, which uses the technique to monitor suspected terrorists.
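At its simplest, the pattern mining described above comes down to counting which items co-occur in many records and keeping the pairs that clear a frequency threshold. The sketch below illustrates that core idea only; the data, item names and threshold are invented for illustration and bear no relation to any actual surveillance or commercial system.

```python
from collections import Counter
from itertools import combinations

# Hypothetical records: each set lists items observed together
# (e.g., places visited in one day). Purely invented data.
records = [
    {"airport", "hotel", "bank"},
    {"airport", "bank"},
    {"hotel", "bank", "cafe"},
    {"airport", "hotel", "bank"},
]

def frequent_pairs(records, min_support=2):
    """Count how often each pair of items appears together,
    keeping only pairs seen in at least `min_support` records."""
    counts = Counter()
    for rec in records:
        # Sort so each pair has one canonical ordering.
        for pair in combinations(sorted(rec), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

print(frequent_pairs(records))
# The pairs (airport, bank), (airport, hotel) and (bank, hotel)
# clear the threshold; rare pairs like (bank, cafe) are dropped.
```

Real systems add statistical models and machine learning on top, but the privacy concern is visible even here: innocuous individual records, aggregated, reveal patterns no single record contains.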

Some believe data mining casts too broad a net to ensure privacy, for the same reasons that make activists, lawmakers and others uneasy about the propagation of surveillance cameras everywhere.

"You're photographed [constantly] when you're out in the world," says Dana Cuff, professor of architecture and urban design and member of Westwood's Advisory Board on Privacy. "One side of the debate is that there's just no way to sort through all that — it's all just noise. But in fact, there really is a way. Look at how traffic monitoring by video is working. We get very accurate traffic data."

Who Are You?

Identity theft victimized more than 8.1 million U.S. adults in 2010, at a total cost of $37 billion. The thieves used their victims' identities to open new credit card accounts, take out auto loans and commit other crimes.

Google's Cerf is considered one of the "Fathers of the Internet," due to his work at UCLA in the early design of the ARPANET. He's world-renowned for his predictions on how technology will affect future society. As an example, he cites the development of new tools that enable automatic digital recognition of individuals.

"What happens when you take a picture and quite incidentally pick up an image of a person you don't know?" Cerf asks. "Let's say it's a guy named Joe. Suddenly Joe is identified, and he has no idea it's getting posted … and maybe he never wanted anyone to know he was there."

As for the giant company he works for, Cerf, who has been on the executive team at Google since 2005, contends that Google's analytics capability offers advertisers "a rich scope of functionality to help them find out what's working and what to do next," and "advertising is where Google's revenue comes from. The revenue allows us to build very large-scale infrastructure, multiple data centers … and large-scale networking capability and capacity."

Cerf also points out that Google has gone to tremendous lengths to provide transparency, such as with its extensive privacy dashboard that lists what data has been collected. "What I find remarkable as an engineer, not as a businessperson, is the amount of analytic information that we're able to offer that does not invade privacy."

Lauren Weinstein '76, also an early ARPANET pioneer who is the co-founder of global watchdog group People for Internet Responsibility as well as a well-known columnist, commentator and blogger on Internet-related issues, warns against too-quick solutions coming from legislators and government regulators.

"Government ignored privacy issues for a long time, and then [after 9/11] it became politically popular," he says. "When politicians get involved in something like that, you have to watch your back."

Weinstein calls the Federal Trade Commission's push for "Do Not Track" rules, which would curtail the ability of online entities to track users, "a bad idea as it's been formulated." "You need to strike a balance in terms of the privacy issues and economic issues and practical issues and political issues that are involved.

"Large-scale analysis of data is spawning all kinds of health-related innovation that help people and solve problems," he adds. Also, "If you prevent all collection of data, you obviously need an alternative funding model," for sites such as Google.

The Answer is Out There … Somewhere

Weinstein believes that the solution to the privacy issue lies somewhere between "Do Not Track" laws and an information flow with no barriers. "There are technologies and systems and ways to modify the data where you can collect it and use it to solve today's problems, but where it won't be privacy-invasive for individuals."

Cerf maintains that "better Internet safety" will only come with a change in behavior from all parties involved, from "software and app creators who have the responsibility to make safe software," to users "drawing on improved methods for password authentication." He also mentions the need for improved training for programmers and for those who work in online security.

Perhaps there can't be an ultimate resolution, because technology changes so quickly. One thing seems clear: A solution will likely be found not on one side or the other, but in the middle. And, of course, there's the bottom line to consider.

"How are you going to pay for this stuff?" Weinstein concludes. "Do you want to pay a nickel every time you do a search? You don't want to shoot the goose that lays the golden egg."