By Ariana Eunjung Cha Washington Post Foreign Service Saturday, May 29, 2010; A01
When Disa Powell's husband and brother were badly burned in an electrical explosion while conducting maintenance at a Wal-Mart store and the family sued, the defense went after something she never expected: her online life.
Through a subpoena seeking information about the men's injuries, Wal-Mart was able to gain full access to her Facebook and MySpace social-networking accounts -- every public and private message, contact and photo for the previous 2 1/2 years.
There were the pictures of Powell's newborn baby lying in a hospital bed after heart surgery (Label: "The hardest day of Mommy and Daddy's life"). The messages detailing problems with her pregnancy ("I got a bladder infection, which has moved to my kidneys"). And the messages dissing friends ("Brad is a big fat BABY, and can't do anything by himself. The whole issue is that he's lazy").
"I was livid," said Powell, 35, a former hospital administrator who a few years ago moved from Maryland's Eastern Shore back to her home town in Oklahoma. "I felt like I had been seriously violated."
The case, which was settled out of court in January, offers a window into an issue that in recent weeks has riled members of Congress, consumer advocacy groups and tens of thousands of account holders: what your social-networking sites know about you and whom they share it with.
Many online service providers over the past few years have been building huge dossiers with minute details of each user's online activities -- a practice that isn't usually mentioned in privacy policies. Some companies anonymize the data, while others do not. Some store detailed data for a month, while others keep it for years.
At the same time, the ease with which outsiders can access the data is increasing, as corporations, insurance companies and parties in divorces or employment disputes make widespread use of subpoenas.
David Hersh, the attorney who represented the Powells and Disa's brother Joel Ledbetter, said such subpoenas have become standard practice in litigation and are "meant to discover information that would be embarrassing or might be used adversely even if it has nothing to do with the claim."
Companies own the data
Because your account information is stored on a company's servers, on the "cloud" that is the Internet rather than on your personal laptop, the company owns it, not you. While accessing your laptop may require a difficult-to-obtain search warrant, getting certain data on Facebook, MySpace, Meetup, LinkedIn and other social-networking sites' servers may require only a simple subpoena.
"The law in this area is really outdated. It's pre-'www,' " Christopher Calabrese, legislative counsel for the American Civil Liberties Union, said of the 1986 act that was designed to introduce privacy controls to electronic communications. "Back then nobody could even figure out whether an e-mail was more like a letter or a phone call."
Efforts to give consumers more control over their private information have accelerated in Washington over the past month, in the wake of a furor over privacy policy changes at Facebook in particular. (Washington Post Co. Chairman Donald E. Graham is on the board of directors at Facebook.) Facebook chief executive Mark Zuckerberg tried to quell the outcry this week by making it easier for users to control how they share data.
On Friday, Rep. John Conyers Jr. (D-Mich.), chairman of the House Judiciary Committee, wrote to Facebook and Google to demand that they cooperate with congressional investigators looking into privacy practices. Google has drawn scrutiny for collecting information, including e-mails and Web-surfing data, from open WiFi networks while photographing streets for its mapping service.
Sen. Charles E. Schumer (D-N.Y.) has called on the Federal Trade Commission to provide guidelines for use of private information and prohibit access without user permission. The ACLU is part of a coalition of advocacy groups and tech companies that is pushing for a major overhaul of the 1986 act.
Meanwhile, software developers are working on technical ways to prevent such access. Four New York University students recently made headlines for a project they call Diaspora that they say will allow users to keep control over their social-networking information. The group was seeking $10,000 for its startup but has raised $190,000 since the Facebook controversy broke out in late April.
In the 15 years since the World Wide Web brought the Internet to the masses, the most successful companies have been those that collect information about users and use it to sell things. Google, for instance, has confirmed that it keeps track of search queries sent from a particular IP address. (A spokesman said the company anonymizes IP addresses associated with search queries after nine months and cookies after 18 months.)
Extensive data collection
Companies are loath to talk about what information they track, but internal compliance manuals for law enforcement for Facebook, Yahoo and Microsoft reviewed by The Washington Post show that their data collection is much more extensive than users might believe based on what they themselves can access.
For example: Microsoft tracks the Xbox LIVE start and end dates and times for game-playing and notes the game played, such as "SW: Jedi Academy." Yahoo keeps chat and instant messenger logs for 45 to 60 days and notes the time/date and IP address when content is added to or deleted from someone's profile or its Flickr photo service.
Facebook's data collection is among the most detailed.
For every user ID, Facebook keeps a log of the IP address that accessed the account, the date and time, and what exactly the user did -- clicking on an advertisement, looking at someone else's profile, posting a photo or sending a message to a friend, etc.
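To make that concrete, here is a minimal sketch of what a single record in such an activity log might look like. The field names and structure below are assumptions chosen for illustration; they are not Facebook's actual schema.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    # Illustrative sketch only: the kind of per-account activity record the
    # article describes (account, IP address, timestamp, action). These field
    # names are assumptions, not Facebook's real schema.
    @dataclass
    class ActivityLogEntry:
        user_id: int                  # the account the action was performed from
        ip_address: str               # the IP address that accessed the account
        timestamp: datetime           # when the action happened
        action: str                   # e.g. "clicked_ad", "viewed_profile", "sent_message"
        target: Optional[str] = None  # who or what the action was aimed at, if anything

    entry = ActivityLogEntry(
        user_id=12345,
        ip_address="203.0.113.7",
        timestamp=datetime(2010, 5, 29, 14, 3, 22),
        action="sent_message",
        target="friend:67890",
    )
    print(entry)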
Facebook spokesman Andrew Noyes declined to comment on specific data-gathering and retention policies but said the privacy policy makes clear that the company may disclose information pursuant to subpoenas, court orders or other requests.
However, Noyes said, "We scrutinize every single information request; require a detailed description of why the request is being made; and, if it is deemed appropriate, share only the minimum amount of information."
Facebook says in its compliance manual that it generally retains information about activity by IP address for 90 days, but in the Ledbetter-Powell case it's clear that other information, such as her private messages to and from friends, had been kept since her account was opened in 2007.
Eben Moglen, a Columbia University law professor and director of the Software Freedom Law Center, calls Facebook "one big database of hundreds of millions of people containing the kind of information far beyond what the secret police in 20th-century totalitarian regimes had."
The company knows which social contacts are closest to you and can guess your moods, he said. And if you're obsessively checking another person's profile at the same time he or she is doing the same with yours, Moglen claims, "Facebook can even tell you're going to have an affair before you do."
Research editor Alice Crites contributed to this report.
Sometime in the next few weeks, Facebook will officially log its 500 millionth active citizen. If the website were granted terra firma, it would be the world's third largest country by population, two-thirds bigger than the U.S. More than 1 in 4 people who browse the Internet not only have a Facebook account but have returned to the site within the past 30 days.
Just six years after Harvard undergraduate Mark Zuckerberg helped found Facebook in his dorm room as a way for Ivy League students to keep tabs on one another, the company has joined the ranks of the Web's great superpowers. Microsoft made computers easy for everyone to use. Google helps us search out data. YouTube keeps us entertained. But Facebook has a huge advantage over those other sites: the emotional investment of its users. Facebook makes us smile, shudder, squeeze into photographs so we can see ourselves online later, fret when no one responds to our witty remarks, snicker over who got fat after high school, pause during weddings to update our relationship status to Married or codify a breakup by setting our status back to Single. (I'm glad we can still be friends, Elise.)
Getting to the point where so many of us are comfortable living so much of our life on Facebook represents a tremendous cultural shift, particularly since 28% of the site's users are older than 34, Facebook's fastest-growing demographic. Facebook has changed our social DNA, making us more accustomed to openness. But the site is premised on a contradiction: Facebook is rich in intimate opportunities — you can celebrate your niece's first steps there and mourn the death of a close friend — but the company is making money because you are, on some level, broadcasting those moments online. The feelings you experience on Facebook are heartfelt; the data you're providing feeds a bottom line.
The willingness of Facebook's users to share and overshare — from descriptions of our bouts of food poisoning (gross) to our uncensored feelings about our bosses (not advisable) — is critical to its success. Thus far, the company's m.o. has been to press users to share more, then let up if too many of them complain. Because of this, Facebook keeps finding itself in the crosshairs of intense debates about privacy. It happened in 2007, when the default settings in an initiative called Facebook Beacon sent all your Facebook friends updates about purchases you made on certain third-party sites. Beacon caused an uproar among users — who were automatically enrolled — and occasioned a public apology from Zuckerberg.
And it is happening again. To quell the latest concerns of users — and of elected officials in the U.S. and abroad — Facebook is getting ready to unveil enhanced privacy controls. The changes are coming on the heels of a complaint filed with the Federal Trade Commission (FTC) on May 5 by the Electronic Privacy Information Center, which takes issue with Facebook's frequent policy changes and tendency to design privacy controls that are, if not deceptive, less than intuitive. (Even a company spokesman got tripped up trying to explain to me why my co-worker has a shorter privacy-controls menu than I do.) The 38-page complaint asks the FTC to compel Facebook to clarify the privacy settings attached to each piece of information we post as well as what happens to that data after we share it.
Facebook is readjusting its privacy policy at a time when its stake in mining our personal preferences has never been greater. In April, it launched a major initiative called Open Graph, which lets Facebook users weigh in on what they like on the Web, from a story on TIME.com to a pair of jeans from Levi's. The logic is that if my friends recommend something, I'll be more inclined to like it too. And because Facebook has so many users — and because so many companies want to attract those users' eyeballs — Facebook is well positioned to display its members' preferences on any website, anywhere. Less than a month after Open Graph's rollout, more than 100,000 sites had integrated the technology.
"The mission of the company is to make the world more open and connected," Zuckerberg told me in early May. To him, expanding Facebook's function from enabling us to interact with people we like on the site to interacting with stuff our friends like on other sites is "a natural extension" of what the company has been doing.
In his keynote announcing Open Graph, Zuckerberg said, "We're building a Web where the default is social." But default settings are part of the reason Facebook is in the hot seat now. In the past, when Facebook changed its privacy controls, it tended to automatically set users' preferences to maximum exposure and then put the onus on us to go in and dial them back. In December, the company set the defaults for a lot of user information so that everyone — even non-Facebook members — could see such details as status updates and lists of friends and interests. Many of us scrambled for cover, restricting who gets to see what on our profile pages. But it's still nearly impossible to tease out how our data might be used in other places, such as Facebook applications or elsewhere on the Web.
There's something unsettling about granting the world a front-row seat to all of our interests. But Zuckerberg is betting that it's not unsettling enough to enough people that we'll stop sharing all the big and small moments of our lives with the site. On the contrary, he's betting that there's almost no limit to what people will share and to how his company can benefit from it.
Since the site expanded membership to high schoolers in 2005 and to anyone over the age of 13 in 2006, Facebook has become a kind of virtual pacemaker, setting the rhythms of our online lives, letting us ramp up both the silly socializing and the serious career networking. Zuckerberg's next goal is even more ambitious: to make Facebook a kind of second nervous system that's rapid-firing more of our thoughts and feelings over the Web. Or, to change the metaphor, Facebook wants to be not just a destination but the vehicle too.
"I'm CEO ... Bitch" Facebook's world headquarters in Palo Alto, Calif., looks like an afterthought, a drab office building at the end of a sleepy stretch of California Avenue. Lacking the scale of Microsoft's sprawling campus or the gleaming grandeur of Google HQ, Facebook's home base is unpretentious and underwhelming. The sign in front (colored red, not the company's trademark cobalt blue) features a large, boldface address with a tiny Facebook logo nestled above. (See 10 tech trends for 2010.)
Inside the building, Facebook crams in hundreds of employees, who work in big, open-air bullpens. Without cubicles or walls, there isn't much privacy, so each desk seems like, well, a Facebook profile — small, visible-to-all spaces decorated with photos and personal sundries. Zuckerberg spent the past year in a dimly lit bullpen on the ground floor. But perhaps in a concession to the fact that the CEO needs some privacy, the 26-year-old billionaire recently moved upstairs to a small office, albeit one with a glass wall so everyone can see what he's doing in there.
Steve Jobs has his signature black turtlenecks; Zuckerberg usually sports a hoodie. In Facebook's early years, he was the cocky coder kid with business cards that read, "I'm CEO ... Bitch." (Zuckerberg has said publicly they were a joke from a friend.) And elements of the Palo Alto headquarters — snack tables, Ping-Pong — still impart some semblance of that hacker-in-a-dorm-room feel.
The office's design reflects Facebook's business model too. Openness is fundamental to everything the company does, from generating revenue to its latest plans to weave itself into the fabric of the Web. "Our core belief is that one of the most transformational things in this generation is that there will be more information available," Zuckerberg says. That idea has always been key to Facebook's growth. The company wants to expand the range of information you're sharing and get you to share a lot more of it.
For this to happen, the 1,400 Facebook employees in Palo Alto and around the world (Dublin, Sydney, Tokyo, etc.) work toward two goals. The first is expansion, something the company has gotten prodigiously good at. The site had 117 million unique visitors in the U.S. in March, and the company says some 70% of its users are in other countries. In cellular-connected Japan, the company is focusing on the mobile app. In cricket-crazed India, Facebook snared fans by helping the Indian Premier League build a fan page on Facebook's site.
There's a technical aspect too. The slightest fraction of a second in how long it takes to load a Facebook page can make the difference between someone's logging in again or not, so the company keeps shaving down milliseconds to make sure you stay. It also mobilized Facebook users to volunteer to help translate the site into 70 languages, from Afrikaans to Zulu, to make each moment on Facebook feel local.
The Aha! Moment
Facebook did not invent social networking, but the company has fine-tuned it into a science. When a newcomer logs in, the experience is designed to generate something Facebook calls the aha! moment. This is an observable emotional connection, gleaned by videotaping the expressions of test users navigating the site for the first time. My mom, a Facebook holdout whose friends finally persuaded her to join last summer, probably had her aha! moment within a few minutes of signing up. Facebook sprang into action. First it asked to look through her e-mail address book to quickly find fellow Facebook users she knew. Then it let her choose which of these people she wanted to start getting short status updates from: Details about what a long-lost friend from high school just cooked for dinner. Photos of a co-worker's new baby. Or of me carousing on a Friday night. (No need to lecture, Mom.)
Facebook has developed a formula for the precise number of aha! moments a user must have before he or she is hooked. Company officials won't say exactly what that magic number is, but everything about the site is geared to reach it as quickly as possible. And if you ever try to leave Facebook, you get what I like to call the aha! moment's nasty sibling, the oh-no! moment, when Facebook tries to guilt-trip you with pictures of your friends who, the site warns, will "miss you" if you deactivate your account.
So far, at least, the site has avoided the digital exoduses that beset its predecessors, MySpace and Friendster. This is partly because Facebook is so good at making itself indispensable. Losing Facebook hurts. In 2008 my original Facebook account was shut down because I had created multiple Dan Fletchers using variants of the same e-mail address, a Facebook no-no but an ingenious way to expand my power in the Mob Wars game on Facebook's site. When Facebook cracked down and gave me and my fictional mafia the kiss of death, I lost all my photos, all my messages and all my status updates from my senior year of high school through the first two years of college. I still miss those digital mementos, and it's both comforting and maddening to know they likely still exist somewhere, sealed off in Facebook's archives.
Being excommunicated from Facebook today would be even more painful. For many people, it's a second home. Users share more than 25 billion pieces of information with Facebook each month. They're adding photos — perhaps the most intimate information Facebook collects — at a rate of nearly 1 billion unique images a week. These pics range from cherished Christmas mornings to nights of partying we, uh, struggle to remember. And we're posting pictures not just of ourselves but also of our friends, and naming, or tagging, them in captions embedded in the images. Not happy someone posted an unflattering shot of you from junior high? Unless the photo is obscene or otherwise violates the site's terms of use, the most you can do is untag your name so people will have a harder time finding the picture (and making fun of you).
With 48 billion unique images, Facebook houses the world's largest photo collection. All that sharing happens on the site. But in two giant leaps, the company has made it so that users can register their opinions on other sites too. That first happened in 2008, when the company released a platform called Facebook Connect. This allows your profile to follow you around the Internet from site to site, acting as a kind of passport for the Web. Want to post a comment about this article on TIME.com? Instead of having to register specifically with that site, Facebook users just have to click one button. This idea of a single sign-on — a profile that obviates the need for multiple user names and passwords — is something a lot of other companies have attempted. But Facebook had the critical mass to make it work.
Targeting Your Likes
Zuckerberg unveiled the second big initiative, Open Graph, this spring. It's a nerdy name for something that's surprisingly simple: letting other websites place a Facebook Like button next to pieces of content. The idea is to let Facebook users flag the content from as many Web pages as possible. For example, if I'm psyched about Iron Man 2, I can click the Like button for that movie on IMDB, and the film will automatically be filed under Movies on my Facebook profile. I can set my privacy controls so that my friends can find out in one of three ways that this is a movie I like. They can go to IMDB, where my charming profile picture will display on the page. They can get a status update about my liking this movie. Or they can see it on my Facebook profile.
Facebook wants you to get into the habit of clicking the Like button anytime you see it next to a piece of content you enjoy. Less than a month after launching Open Graph — which made its debut with some 30 content partners, including TIME.com — Facebook is quickly approaching the point where it will process 100 million unique clicks of a Like button each day.
The company's goal with Open Graph is to give you ways to discover both new content and more common ground with the people you're friends with. That's the social benefit Zuckerberg sees, and it's shared by those in his employ. Sheryl Sandberg, Facebook's chief operating officer, is at her most enthusiastic when she's describing Peace.Facebook.com, part of the website that tracks the number of friendships made each day between members of groups that have historically disagreed, such as Israelis and Palestinians and Sunnis and Shi'ites. "We don't pretend Facebook's this profound all the time," Sandberg says. "But is it harder to shoot at someone who you've connected to personally? Yeah. Is it harder to hate when you've seen pictures of that person's kids? We think the answer is yes."
Helping bring about world peace would be nice, but Facebook is not a philanthropic organization. It's a business, and there's a tremendous business opportunity around Facebook's member data. And Sandberg knows it. She joined the company in 2008 after helping Google build its ad platform into a multibillion-dollar business. Much like Google, Facebook is free to users but makes a lot of money (some analysts estimate the privately held company will generate $1 billion in revenues in 2010) from its robust ad system. According to the Web-research firm comScore, Facebook flashed more than 176 billion banner ads at users in the first three months of this year — more than any other site.
The more updates Facebook gets you to share and the more preferences it entreats you to make public, the more data it's able to pool for advertisers. Google spearheaded targeted advertisements, but it knows what you're interested in only on the basis of what you query in its search engine and, if you have a Gmail account, what topics you're e-mailing about. Facebook is amassing a much more well-rounded picture. And having those Like buttons clicked 100 million times a day gives the company 100 million more data points to package and sell.
The result is that advertisers are able to target you on an even more granular level. For example, right now the ads popping up on my Facebook page are for Iron Man 2 games and no-fee apartments in New York City (I'm in a demographic that moves frequently); my mom is getting ads for in-store furniture sales (she's in a demographic that buys sofas).
This advertising platform is even more powerful now that the site can factor in your friends' preferences. If three of your friends click a Like button for, say, Domino's Pizza, you might soon find an ad on your Facebook page that has their names and a suggestion that maybe you should try Domino's too. Peer-pressure advertising! Sandberg and other Facebook execs understand the value of context in selling a product, and few contexts are more powerful than friendship. "Marketers have known this for a really long time. I'm much more likely to do something that's recommended by a friend," Sandberg says.
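As a rough illustration of the mechanism Sandberg describes, here is a toy sketch of friend-endorsement targeting: surface an ad for a brand once enough of a user's friends have clicked Like on it. The data, names and threshold below are invented for illustration; this is not Facebook's actual ad logic.

    # Toy sketch of friend-endorsement ad targeting as described above.
    # All names, data and the threshold are hypothetical.
    friends = {
        "alice": ["bob", "carol", "dave"],
    }
    likes = {
        "bob": {"Domino's Pizza", "Iron Man 2"},
        "carol": {"Domino's Pizza"},
        "dave": {"Domino's Pizza", "Levi's"},
    }

    def friend_endorsed_ads(user, friends, likes, min_friends=3):
        """Return brands liked by at least `min_friends` of the user's friends,
        along with the friends whose names could appear in the ad."""
        endorsements = {}
        for friend in friends.get(user, []):
            for brand in likes.get(friend, set()):
                endorsements.setdefault(brand, []).append(friend)
        return {brand: names for brand, names in endorsements.items()
                if len(names) >= min_friends}

    print(friend_endorsed_ads("alice", friends, likes))
    # {"Domino's Pizza": ['bob', 'carol', 'dave']}

The sketch shows why the Like button matters to advertisers: each click adds one more edge between a person, a friend and a brand that the targeting pass can count.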
As powerful as each piece of Facebook's strategy is, the company isn't forcing its users to drink the Kool-Aid. It's just serving up nice cold glasses, and we're gulping it down. The friends, the connections, the likes — those are all produced by us. Facebook is the ultimate enabler. It's enabling us to give it a cornucopia of information about ourselves. It's a brilliant model, and Facebook, through its skill at weaving the site into the fabric of modern life, has made it work better than anyone else.
What Voldemort Is to Harry Potter
Zuckerberg believes that most people want to share more about themselves online. He's almost paternalistic in describing the trend. "The way that people think about privacy is changing a bit," he says. "What people want isn't complete privacy. It isn't that they want secrecy. It's that they want control over what they share and what they don't."
Unfortunately, Facebook has a shaky history of granting people that control. In November 2007, when the company tried to make its first foray into the broader Web, it rolled out Facebook Beacon, in which users were automatically signed up for a program that sent a notice to all their friends on Facebook if, say, they made a purchase on a third-party site, like movie tickets on Fandango. Initially, users couldn't opt out of the service altogether — they had to click No Thanks with each individual purchase. And, worse, investigations by security analysts found that even after users hit No Thanks, websites sent purchase details back to Facebook, which the company then deleted. Amid a torrent of complaints, Facebook quickly changed Beacon to be an opt-in system, and by December 2007, the company gave users the option of turning off Beacon completely. Ask Zuckerberg and other executives about the program now, and you'll notice that Beacon has become to Facebook what Voldemort is to Harry Potter's world — the thing that shall not be named.
Facebook isn't the only company to have made a serious social-networking infraction. In February, Google apologized after the rollout of its Twitteresque Buzz application briefly revealed whom its users e-mailed and chatted with most, a move that alarmed, among others, political dissidents and cheating spouses. But at Facebook, the Beacon debacle didn't stop the company from pushing to make more information public. This winter, the company changed its privacy controls and made certain profile details public, including a user's name, profile photo, status updates and any college or professional networks. During the transition, Zuckerberg's private photos were briefly visible to all, including several pictures in which he looks, shall we say, overserved. He quickly altered his settings.
In April, the site started giving third-party applications more access to user data. Apps like my beloved Mob Wars used to be allowed to keep your data for only 24 hours; now they can store your info indefinitely — unless you uninstall them. This spring, Facebook also launched something called Instant Personalization, which lets a few sites piggyback onto Facebook user data to create recommendation engines. Once again, as with Beacon, users were automatically enrolled.
With each set of changes to Facebook's evolving privacy policy, protest groups form and users spread warnings via status messages. In some cases, these outcries have been quite sizable. Zuckerberg points to 2006, when users protested the launch of Facebook's News Feed, a streaming compilation of your friends' status updates. Without much warning, tidbits that you used to have to seek out by going to an individual's profile page were suddenly being broadcast to everyone on that person's list of friends. "We only had 10 million users at the time, and 1 million were complaining," Zuckerberg says. "Now, to think that there wouldn't be a news feed is insane." He's right — protesting the existence of a news feed seems silly in hindsight; Twitter built its entire site around the news-feed concept. So give Zuckerberg some credit for prescience — and perseverance. "That's a big part of what we do, figuring out what the next things are that everyone wants to do and then bringing them along to get them there," he says.
But corralling 500 million people is a lot harder than corralling 10 million. And some users are ready to pull the plug entirely. Searches for "how to delete Facebook" on Google have nearly doubled in volume since the start of this year.
The Web's Sketchy Big Brother
If Facebook wants to keep up the information revolution, then Zuckerberg needs to start talking more and make his case for an era of openness more transparently. Otherwise, Facebook will continue to be cast in the role of the Web's sketchy Big Brother, sucking up our identities into a massive Borg brain to slice, dice and categorize for advertisers.
But amid all the angst, don't forget that we actually like to share. Yes, Facebook is a moneymaking venture. But after you talk to the company's key people, it's tough to doubt that they truly believe that sharing information is better than keeping secrets, that the world will be a better place if you persuade (or perhaps push) people to be more open. "Even with all the progress that we've made, I think we're much closer to the beginning than the end of the trend," Zuckerberg says.
Want to stop that trend? The onus, as always, is on you to pull your information. Starve the beast dead. None of Facebook's vision, be it for fostering peace and harmony or for generating ad revenue, is possible without our feeding in our thoughts and preferences. "The way that people decide whether they want to use something or not is whether they like the product or not," Zuckerberg says. Facebook is hoping that we're hooked. As for me? Time to see if the ex-girlfriend has added new photos.
I joined Facebook (FB) several years ago with simple aims. I wanted new, real-name private sources about Indonesia and Timor-Leste, subjects of two list projects I ran on Yahoo Groups. I was also frustrated by the limited options for private presentation of self (for myself and others) and direct communication on Friendster, then a key social networking player. For a while, FB met these needs to a degree that made it seem worth the investment of my time.
This only occurred after a dismaying start, in which my account was quickly suspended, twice, for 'spam-like' activity (simply adding a few friends and posting a few links). FB's spam-detection robot at the time was very amateurish. Finding relevant friends was hampered severely by a then-requirement to join a single geographical network -- searching for friends outside it was not permitted. (This lame attempt to create 'community' was later eliminated.) I did create several FB 'groups' akin to my Yahoo lists on Indonesia and Timor-Leste. Groups were open to all and adding members was all too easy -- friends who had been on FB longer could simply invite all their friends to join a group. Through friending group members, I was slowly able to build my preferred network on FB, even though, as on Yahoo, few persons besides myself contributed to these 'groups.' On the other hand, I was besieged with requests to join 'fun' game applications.
In retrospect, I should have learned more from these early strange patterns which ultimately turned FB into the deeply flawed, dangerous and specious social networking site it has now unfortunately become.
1) The site owners and admins change the user interface capriciously and too frequently, often without announcement and without first ascertaining, through trials, user feedback. New users cannot possibly master the site even in a month and as a result wind up with settings they would not otherwise approve, even if they have the patience to locate and examine them. The link to the well-written help pages is so poorly placed that few know it exists. In any case, the help pages have grown to almost book-length size, a deterrent to their use.
2) The quality of the main programs which define and maintain the site is too often very poor, too slow, or too inclined to fail. Applications especially lose their original attractive simplicity as they monetize, and they are sometimes suddenly and drastically modified, abandoned or made to disappear. This includes even a major native FB app like the one controlling 'groups.' I had to re-write and adapt text for all six groups I created and maintained on FB, an onerous task. Redundant and fad apps are now overwhelmingly numerous. It has become very hard to get enough friends to coordinate their apps.
3) On-site search of FB itself now ranges far and wide, revealing much private information of almost everyone on the site. Worse, recent major site revamps reproduce much of this information on the public internet through simple Google searches. All current users should try such external searches to gauge whether what they intended to remain private within a closed community is now public even to persons who are not FB members.
4) With its recent mandatory use of Microsoft's Bing to change words on the personal Info tab of user profiles into clickable links, everything on that tab is now publicly visible on the net. The only way to keep such material private is to write nothing there, or to delete what one has already written there. This is a truly egregious violation of the presumption of privacy most people bring to social networking sites. Otherwise put, much of the 'social' part of one's profile must be self-destroyed if privacy is to be preserved. If this is not done and those public links are allowed to stand, one is no longer networking but broadcasting to all the major search engines. Worse, the links automatically generated are almost always repetitive, more often than not inappropriate, and grossly distort the presentation of self on which most social networkers initially focus. The main purpose of this change, which expands by at least a factor of three what is visible to the public from the Info tab alone, is to give FB more room for the paid ads on which it depends. An irony advertisers have likely not yet realized is that personal profiles are generally just briefly scanned by new friends, then forgotten; such social interactions as do occur originate mainly from the overwhelming News Feed. That in turn makes the site the breeding ground for social trivia it is today.
5) I left FB with over 2,600 'friends.' This network could in theory be very valuable. But only a small fraction of these 2,600 friends ever read what I post on my Wall, which, for the longest time, was mainly information in the form of links. These postings do not appear in the news feeds of most of my friends. Few FBers post or read mainly substantive content, especially at the rate of 5-15 items per day from a friend like me. Instead, since they are there mostly to socialize, not to get substantive information, they make use of a Hide Friend option in the news feeds on the default Home page, so that nothing I do ever appears in their feeds. A large majority of FBers have too many friends, often in the hundreds and upward, making hiding friends almost a necessity to preserve one's sanity amid the overload of material the site throws at them. FB does not help matters by suggesting new friends to add during every new visit to Home. The more friends FBers have, the more opportunities FB has to sell ads and make more money.
FB resembles a good social networking site less and less with each passing day. It has become a money machine. Its socializing has become trivialized, and it gives substantive content too little exposure. In numerous ways, only a few of them remarked on here, it has deliberately and gradually breached the privacy of its members' data to the point that by now most of that data is public. The best way to protect yourself, and your friends, from further inevitable FB admin mischief is to delete your account.
Min Liu, a 21-year-old liberal arts student at the New School in New York City, got a Facebook account at 17 and chronicled her college life in detail, from rooftop drinks with friends to dancing at a downtown club. Recently, though, she has had second thoughts.
Concerned about her career prospects, she asked a friend to take down a photograph of her drinking and wearing a tight dress. When the woman overseeing her internship asked to join her Facebook circle, Ms. Liu agreed, but limited access to her Facebook page. “I want people to take me seriously,” she said.
The conventional wisdom suggests that everyone under 30 is comfortable revealing every facet of their lives online, from their favorite pizza to most frequent sexual partners. But many members of the tell-all generation are rethinking what it means to live out loud.
While participation in social networks is still strong, a survey released last month by the University of California, Berkeley, found that more than half the young adults questioned had become more concerned about privacy than they were five years ago — mirroring the number of people their parents' age or older with that worry.
They are more diligent than older adults, however, in trying to protect themselves. In a new study to be released this month, the Pew Internet Project has found that people in their 20s exert more control over their digital reputations than older adults, more vigorously deleting unwanted posts and limiting information about themselves. “Social networking requires vigilance, not only in what you post, but what your friends post about you,” said Mary Madden, a senior research specialist who oversaw the study by Pew, which examines online behavior. “Now you are responsible for everything.”
The erosion of privacy has become a pressing issue among active users of social networks. Last week, Facebook scrambled to fix a security breach that allowed users to see their friends’ supposedly private information, including personal chats.
Sam Jackson, a junior at Yale who started a blog when he was 15 and who has been an intern at Google, said he had learned not to trust any social network to keep his information private. “If I go back and look, there are things four years ago I would not say today,” he said. “I am much more self-censoring. I’ll try to be honest and forthright, but I am conscious now who I am talking to.”
He has learned to live out loud mostly by trial and error and has come up with his own theory: concentric layers of sharing.
His Facebook account, which he has had since 2005, is strictly personal. “I don’t want people to know what my movie rentals are,” he said. “If I am sharing something, I want to know what’s being shared with others.”
Mistrust of the intentions of social sites appears to be pervasive. In its telephone survey of 1,000 people, the Berkeley Center for Law and Technology at the University of California found that 88 percent of the 18- to 24-year-olds it surveyed last July said there should be a law that requires Web sites to delete stored information. And 62 percent said they wanted a law that gave people the right to know everything a Web site knows about them.
That mistrust is translating into action. In the Pew study, to be released shortly, researchers interviewed 2,253 adults late last summer and found that people ages 18 to 29 were more apt to monitor privacy settings than older adults are, and they more often delete comments or remove their names from photos so they cannot be identified. Younger teenagers were not included in these studies, and they may not have the same privacy concerns. But anecdotal evidence suggests that many of them have not had enough experience to understand the downside to oversharing.
Elliot Schrage, who oversees Facebook’s global communications and public policy strategy, said it was a good thing that young people are thinking about what they put online. “We are not forcing anyone to use it,” he said of Facebook. But at the same time, companies like Facebook have a financial incentive to get friends to share as much as possible. That’s because the more personal the information that Facebook collects, the more valuable the site is to advertisers, who can mine it to serve up more targeted ads.
Two weeks ago, Senator Charles E. Schumer, Democrat of New York, petitioned the Federal Trade Commission to review the privacy policies of social networks to make sure consumers are not being deliberately confused or misled. The action was sparked by a recent change to Facebook’s settings that forced its more than 400 million users to choose to “opt out” of sharing private information with third-party Web sites instead of “opt in,” a move which confounded many of them.
Mr. Schrage of Facebook said, “We try diligently to get people to understand the changes.”
But in many cases, young adults are teaching one another about privacy.
Ms. Liu is not just policing her own behavior, but her sister’s, too. Ms. Liu sent a text message to her 17-year-old sibling warning her to take down a photo of a guy sitting on her sister’s lap. Why? Her sister wants to audition for “Glee” and Ms. Liu didn’t want the show’s producers to see it. Besides, what if her sister became a celebrity? “It conjures up an image where if you became famous anyone could pull up a picture and send it to TMZ,” Ms. Liu said.
Andrew Klemperer, a 20-year-old at Georgetown University, said it was a classmate who warned him about the implications of the recent Facebook change — through a status update on (where else?) Facebook. Now he is more diligent in monitoring privacy settings and apt to warn others, too.
Helen Nissenbaum, a professor of culture, media and communication at New York University and author of “Privacy in Context,” a book about information sharing in the digital age, said teenagers were naturally protective of their privacy as they navigate the path to adulthood, and the frequency with which companies change privacy rules has taught them to be wary.
That was the experience of Kanupriya Tewari, a 19-year-old pre-med student at Tufts University. Recently she sought to limit the information a friend could see on Facebook but found the process cumbersome. “I spent like an hour trying to figure out how to limit my profile, and I couldn’t,” she said. She gave up because she had chemistry homework to do, but vowed to figure it out after finals.
“I don’t think they would look out for me,” she said. “I have to look out for me.”
It’s kind of crazy. I’ve been playing with Facebook’s “Posts By Everyone” search feature recently, and many people who hide their profile information have no problem sharing sometimes really personal updates with the world. Are there people on Facebook who don’t understand how to keep their updates out of the public eye? And why doesn’t Facebook allow for mobile updates to be private?
Searching Everyone’s Updates
You might have missed Facebook’s “Posts By Everyone” feature. It’s easy to overlook because search results aren’t shown by default. Consider this search for hungover:
When you start typing, Facebook suggests some options right within the search box. Pick any of those, and you go directly to a person, page or application, rather than overall search results. It’s easy to do this by hitting enter, so that you never get the search results at all.
If you go to the very bottom, there’s a “More Results” option as highlighted above. Click that, and a broader set of results appears:
Notice on the left-hand side of the results, there are options to get results back from all these categories:
All Results
People
Pages
Groups
Applications
Events
Web Results
Posts By Friends
Posts By Everyone
In the search results above, you can see that “All Results” is highlighted, so I should be getting back results from all these categories. However, that’s not what happens. Instead, Facebook only brings back results from matching Pages, Posts By Friends and Web Results. That’s it.
(This, by the way, is just one example of why I often joke to people who warn that Facebook will beat Google in search that Facebook has enough problems searching Facebook itself, much less the entire web.)
Now look what happens if I drill in to the “Posts By Everyone” category:
Suddenly I see what Facebook failed to show me before, all the people on Facebook telling the world about their hangovers.
Do these people all mean to share this way? Well, it’s not like people on Twitter don’t share about having hangovers:
The key difference between Facebook and Twitter is that at Twitter, by default you're sharing with the world. At Facebook, the default for updates is to share only with your friends.
In other words, post to Twitter, and most people probably realize they’re telling something to the world. Post at Facebook, and many people might think they’re only sharing with their friends.
Facebook’s Warnings About Sharing To The World
Indeed, Facebook deserves credit for really making you jump through hoops before you can share an update with the world. For example, here's what you get in a brand new account, before you've ever even posted something:
That links over to a privacy FAQ page, and the only way the message disappears is if you manually click to close it. If you don’t close it, the message reappears each time you come back to the status area.
Beyond that, if you make an update and change from the default “Only Friends” option:
To the “Everyone” option, you get another warning:
After doing a post to everyone, your default remains stuck on "Only Friends." Facebook doesn't shift it to "Everyone," something it could do if it wanted to try to get people to be more public about what they're sharing, and something many people, including me, suspect it wants to do.
You Hide Your Profile, But Not Your Updates?
So why would I think some people don’t understand the Facebook privacy settings, when it comes to updates, especially when they have so many hoops to jump through?
Consider a search for hate my boss. I’m not going to put up a screenshot, because I don’t want to immortalize anyone and get them in trouble. But do that search, and you get posts like:
hate my job. hate my boss.
i hate my job. talked to my managers and boss didn’t help and made it worse
Do people saying these things realize that their bosses might also see the updates? To test, I went to the profiles of 10 people who each appeared in that “hate my boss” search. Here’s what I saw for all 10 of them (I’ve blanked out the name for the example shown):
The message tells me that this person is sharing only some of their info with everyone, right? And yet, I can see their updates. In fact, if I select the "Wall" tab, I see all their updates nicely displayed. If someone's boss found them by name on Facebook — which isn't hard to do — they could do the same.
Why would all these people who keep their profiles locked down still share updates? One issue might be that by default, Facebook displays the “this person shares only some things” message to anyone who isn’t someone’s friend, because chances are everyone has some tiny bit of information that by default isn’t shared on Facebook.
Facebook’s Mobile Free-For-All
Another reason is mobile. I fired up the Facebook application for the iPhone. There’s a big “What’s on your mind” box that appears at the top. Enter something, like “I hate my boss,” and that message goes to your Wall — and to the world.
Unlike Facebook itself, there are no privacy settings that I can find in the application, no share with “Only Friends” choice. If you share via the iPhone — and perhaps other mobile devices — you share with the world. That’s also true if you use Facebook’s mobile site on the web. There’s no option there other than to share with the world.
Going back to those 10 people I reviewed? I can also see that 6 of them in the search results are tagged as sharing “via the Mobile Web.” In contrast, for 10 people I looked at who said hate my boss on Twitter, only one seemed to do it via mobile.
Maybe some of those people on Facebook didn’t mean for their updates to go public. Or, maybe they’re just stupid or don’t care. I can’t fault Facebook for how it handles things on its full web site, in terms of highlighting privacy issues with updates. On the mobile front, they look to be screwing up big time.
Advice For The Concerned
By the way, as Facebook’s privacy issues ramp up, I read about more and more people wondering if they should cancel their Facebook accounts. I went through a similar struggle last December (see Now Is It Facebook’s Microsoft Moment?). As a marketer, I ultimately decided I still needed to be on the Facebook platform. But I also shifted to primarily sharing information through my fan page, where everything is public, by default.
I highly recommend fan pages to anyone. It may be a way for you to feel you have more control on Facebook at a time when it's difficult to predict what Facebook will change next. Don't be put off by the weirdness of having a "fan" page. Just think of it as a place on Facebook where you know everything is public, a constant reminder that what you say is being said to the world overtly, rather than a constant fear that what you say or do might get shared with the world without your realizing it.
Alternatively, just assume that all you do on Facebook is public, that there is no privacy. Make that assumption, and you’ll be relatively safe — assuming that apps don’t start tracking all your web surfing habits and reporting back to the Facebook mothership or the world. To be really safe, always log out of Facebook.
Advice For Facebook
To Facebook, my advice is more blunt. Get your shit together. Enough with the explanations that the web is more comfortable being public, that everyone has "granular" privacy controls, and other platitudes. Each day, there seems to be some new worry — just do a search for Facebook on Techmeme for a summary.
Someone over there, anyone — stand up and scream that your company is screwing up big time on the privacy front. You keep getting away with it so far, but that might not continue.
• Cameras that can follow you from the minute you enter a store to the moment you hit the checkout counter, recording every T-shirt you touch, every mannequin you ogle, every time you blow your nose or stop to tie your shoelaces.
• Web coupons embedded with bar codes that can identify, and alert retailers to, the search terms you used to find them and, in some cases, even your Facebook information and your name.
• Mobile marketers that can find you near a store clothing rack, and send ads to your cellphone based on your past preferences and behavior.
To be sure, such retail innovations help companies identify their most profitable client segments, better predict the deals shoppers will pursue, fine-tune customer service down to a person and foster brand loyalty. (My colleagues Stephanie Rosenbloom and Stephanie Clifford have written in detail about the tracking prowess of store cameras and Web coupons.)
But these and other surveillance techniques are also reminders that advances in data collection are far outpacing personal data protection.
Enter the post-privacy society, where we have lost track of how many entities are tracking us. Not to mention what they are doing with our personal information, how they are storing it, whom they might be selling our dossiers to and, yes, how much money they are making from them.
On the way out, consumer advocates say, is that quaint old notion of informed consent, in which a company clearly notifies you of its policies and gives you the choice of whether to opt in (rather than having you opt out once you discover your behavior is being tracked).
“How does notice and choice work when you don’t even interface with the company that has your data?” says Jessica Rich, a deputy director of the bureau of consumer protection at the Federal Trade Commission.
The commission has brought several dozen complaints against companies about possibly deceptive or unfair data collection and nearly 30 complaints over data security issues. In 2009, the commission proposed new guidelines for Web advertising that is tailored to user behavior.
The problem is, the F.T.C.’s guidelines are merely recommendations. Corporations can choose to follow them — or not. And the online advertising standards don’t apply to off-line techniques like observation in stores.
Mike Zaneis, vice president for public policy at the Interactive Advertising Bureau, a trade association based in Manhattan, says the advertising industry is not generally collecting personally identifiable data.
His group has worked closely with the F.T.C. on industry self-regulation, he says, and is developing new industry standards to alert consumers as they encounter ads based on their online behavior.
In the meantime, Mr. Zaneis says, consumers can use an industry program if they want to opt out of some behavior-based ads. As for mobile marketing, he says, consumers are always asked if they want to opt in to ads related to their cellphone location.
The larger issue here is not the invasion of any one person’s privacy as much as the explosive growth of a collective industry in behavioral information, says Jeff Chester, the executive director of the Center for Digital Democracy, a nonprofit group that works to safeguard user privacy.
“The whole business model is unfettered data collection of all your activities online and off,” Mr. Chester says. For example, he says that when consumers opt into cellphone ads, they may not understand that marketers may link their locations with information from third-party databases. The result, he says, is mobile dossiers about individual consumers.
As contradictory as it might sound, we need new strategies for transparent consumer surveillance.
In a country where we have a comprehensive federal law — the Fair Credit Reporting Act — giving us the right to obtain and correct financial data collected about us, no general federal statute requires behavioral data marketers to show us our files, says Ms. Rich of the F.T.C.
So, is the European model, involving independent government agencies called Data Protection Commissions that are charged with safeguarding people’s personal information, better than ours?
Europe’s privacy commissioners have generally been more forward-looking, examining potential privacy intrusions like biometric tracking, while the F.T.C. is still trying to understand the magnitude and the implications of the Web, says Marc Rotenberg, the executive director of the Electronic Privacy Information Center, a research group in Washington.
“The U.S. system with regard to privacy is not working,” Mr. Rotenberg says.
By early fall, the F.T.C. plans to propose comprehensive new privacy guidelines intended to provide greater tools for transparency and better consumer control of personal information, Ms. Rich says.
In the meantime, what if consumers take a more active interest in who is collecting information about them?
In a recent documentary called “Erasing David,” the London-based filmmaker David Bond attempts to disappear from Britain’s surveillance grid, hiring experts from the security firm Cerberus to track him using all the information they can glean about him while he tries to outrun them. In the course of the film, the detectives even obtain a copy of the birth certificate of his daughter, then 18 months old.
But the real shocker is the information Mr. Bond is able to obtain about himself — by taking advantage of a data protection law in Britain that requires public agencies and private businesses to release a person’s data file upon his or her written request.
In one scene, Mr. Bond receives a phonebook-thick printout from Amazon.com listing everything he ever bought on the site; the addresses of every person to whom he ever sent a gift; and even the products he perused but did not ultimately buy.
He also receives a file from his bank, including a transcript of an irate phone call he once made after the bank lost one of his checks. The transcript noted that he seemed angry and raised his voice.
“It read like a mini-Stasi file,” Mr. Bond said when I called him last week. When recorded messages inform us that we may be taped “for training or quality assurance purposes,” he reminded me, we should remember that our conversation may end up in our dossiers.
INSPIRED by Mr. Bond’s odyssey, I called some companies with whom I do business.
A customer service representative at a bookstore chain where I have a discount card told me that the company maintains a list of the amount each member spends on each transaction so that the store can tell people how much money they saved at the end of the year. But a loyalty cardholder is not permitted to obtain his or her own purchase history.
Then I called an online travel agency and asked if I could get copies of my flight history and phone transcripts. I was regretting a disgruntled call I made to the agency a few months ago after being stranded at an airport in a blizzard. The customer care representative said clients couldn’t obtain their own transcripts unless they were needed for legal purposes.
Was I being taped this time, too? They always tape, he said.
Yet people often dole out all kinds of personal information on the Internet that allows their identities to be deduced. Services like Facebook, Twitter and Flickr are oceans of personal minutiae — birthday greetings sent and received, school and work gossip, photos of family vacations, and movies watched.
Computer scientists and policy experts say that such seemingly innocuous bits of self-revelation can increasingly be collected and reassembled by computers to help create a picture of a person’s identity, sometimes down to the Social Security number.
“Technology has rendered the conventional definition of personally identifiable information obsolete,” said Maneesha Mithal, associate director of the Federal Trade Commission’s privacy division. “You can find out who an individual is without it.”
In a class project at the Massachusetts Institute of Technology that received some attention last year, Carter Jernigan and Behram Mistree analyzed more than 4,000 Facebook profiles of students, including links to friends who said they were gay. The pair was able to predict, with 78 percent accuracy, whether a profile belonged to a gay male.
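The mechanics behind such a prediction can be surprisingly simple. What follows is a minimal, hypothetical Python sketch of friendship-based inference of this kind; the names, data and threshold are invented for illustration, and the M.I.T. students’ actual model was a statistical classifier fit to thousands of real profiles, not this rule of thumb.

from typing import Dict, List

def predict_attribute(user: str,
                      friends: Dict[str, List[str]],
                      disclosed: Dict[str, bool],
                      threshold: float = 0.3) -> bool:
    # Guess a private attribute for `user` from what friends disclose:
    # if enough of the listed friends state the attribute publicly,
    # predict that the user shares it. The threshold is an assumption.
    circle = friends.get(user, [])
    if not circle:
        return False
    share = sum(disclosed.get(f, False) for f in circle) / len(circle)
    return share >= threshold

# Toy example: "alex" never states the attribute, but most listed friends do.
friends = {"alex": ["sam", "jo", "pat", "lee"]}
disclosed = {"sam": True, "jo": True, "pat": False, "lee": True}
print(predict_attribute("alex", friends, disclosed))  # True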
So far, this type of powerful data mining, which relies on sophisticated statistical correlations, is mostly in the realm of university researchers, not identity thieves and marketers.
But the F.T.C. is worried that rules to protect privacy have not kept up with technology. On Wednesday, the agency is convening the third of three workshops on the issue.
Its concerns are hardly far-fetched. Last fall, Netflix awarded $1 million to a team of statisticians and computer scientists who won a three-year contest to analyze the movie rental history of 500,000 subscribers and improve the predictive accuracy of Netflix’s recommendation software by at least 10 percent.
On Friday, Netflix said that it was shelving plans for a second contest — bowing to privacy concerns raised by the F.T.C. and a private litigant. In 2008, a pair of researchers at the University of Texas showed that the customer data released for that first contest, despite being stripped of names and other direct identifying information, could often be “de-anonymized” by statistically analyzing an individual’s distinctive pattern of movie ratings and recommendations.
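The matching idea behind that result can be seen in miniature. The sketch below is a simplified, hypothetical take on the approach: it scores an “anonymous” rental record against ratings a person has posted publicly elsewhere and picks the best fit. The Texas researchers’ actual algorithm weighted matches by rating dates and by how unusual each movie was; the data and scoring rule here are invented.

from typing import Dict

def match_score(anon: Dict[str, int], known: Dict[str, int]) -> float:
    # Fraction of commonly rated movies on which the two records agree.
    overlap = set(anon) & set(known)
    if not overlap:
        return 0.0
    return sum(anon[m] == known[m] for m in overlap) / len(overlap)

def best_match(anon: Dict[str, int], candidates: Dict[str, Dict[str, int]]) -> str:
    # Name the candidate whose public ratings best fit the anonymous record.
    return max(candidates, key=lambda name: match_score(anon, candidates[name]))

# Toy data: an "anonymous" rental history and public ratings found elsewhere.
anonymous_record = {"Brazil": 5, "Heat": 3, "Memento": 4, "Clue": 2}
public_ratings = {
    "alice": {"Brazil": 5, "Memento": 4, "Clue": 2},
    "bob":   {"Heat": 5, "Clue": 4},
}
print(best_match(anonymous_record, public_ratings))  # alice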
In social networks, people can increase their defenses against identification by adopting tight privacy controls on information in personal profiles. Yet an individual’s actions, researchers say, are rarely enough to protect privacy in the interconnected world of the Internet.
You may not disclose personal information, but your online friends and colleagues may do it for you, referring to your school or employer, gender, location and interests. Patterns of social communication, researchers say, are revealing.
“Personal privacy is no longer an individual thing,” said Harold Abelson, a computer science professor at M.I.T. “In today’s online world, what your mother told you is true, only more so: people really can judge you by your friends.”
Collected together, the pool of information about each individual can form a distinctive “social signature,” researchers say.
The power of computers to identify people from social patterns alone was demonstrated last year in a study by the same pair of researchers that cracked Netflix’s anonymous database: Vitaly Shmatikov, an associate professor of computer science at the University of Texas, and Arvind Narayanan, now a researcher at Stanford University.
By examining correlations between various online accounts, the scientists showed that they could identify more than 30 percent of the users of both Twitter, the microblogging service, and Flickr, an online photo-sharing service, even though the accounts had been stripped of identifying information like account names and e-mail addresses.
“When you link these large data sets together, a small slice of our behavior and the structure of our social networks can be identifying,” Mr. Shmatikov said.
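A toy version of that cross-linking makes the point. The sketch below, with invented account names, pairs accounts on two services by how much their contact lists overlap. The actual study was considerably harder: the graphs it aligned were anonymized, so the matching worked from a small set of known “seed” accounts and purely structural similarity rather than shared labels.

from typing import Dict, Set

def neighbor_overlap(a: Set[str], b: Set[str]) -> float:
    # Jaccard similarity of two contact lists.
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def link_accounts(twitter: Dict[str, Set[str]],
                  flickr: Dict[str, Set[str]]) -> Dict[str, str]:
    # Pair each account on one service with the account on the other
    # service whose contacts look most alike.
    return {t_user: max(flickr, key=lambda f: neighbor_overlap(t_friends, flickr[f]))
            for t_user, t_friends in twitter.items()}

# Toy graphs: the same person's contact lists overlap heavily across services.
twitter = {"t_anon_1": {"carol", "dave", "erin"}}
flickr = {"photo_fan": {"carol", "dave", "erin", "frank"}, "other": {"gus", "heidi"}}
print(link_accounts(twitter, flickr))  # {'t_anon_1': 'photo_fan'}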
Even more unnerving to privacy advocates is the work of two researchers from Carnegie Mellon University. In a paper published last year, Alessandro Acquisti and Ralph Gross reported that they could accurately predict the full, nine-digit Social Security numbers for 8.5 percent of the people born in the United States between 1989 and 2003 — nearly five million individuals.
Social Security numbers are prized by identity thieves because they are used both as identifiers and to authenticate banking, credit card and other transactions.
The Carnegie Mellon researchers used publicly available information from many sources, including profiles on social networks, to narrow their search for two pieces of data crucial to identifying people — birthdates and city or state of birth.
That helped them figure out the first three digits of each Social Security number, which the government had assigned by location. The remaining six digits had been assigned through methods the government didn’t disclose, although they were related to when the person applied for the number. The researchers used projections about those applications as well as other public data, like the Social Security numbers of dead people, and then ran repeated cycles of statistical correlation and inference to partly re-engineer the government’s number-assignment system.
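Some rough arithmetic, not the researchers’ actual method, shows why that structure matters: every digit the assignment pattern gives away shrinks the space an identity thief would have to search. The specific figures below are illustrative assumptions.

full_space = 10 ** 9          # all possible nine-digit numbers
area_known = 10 ** 6          # first three digits fixed by state of birth
group_narrowed = 5 * 10 ** 4  # assume ~5 plausible values for the next two digits (hypothetical)

print(f"blind guessing:        1 in {full_space:,}")
print(f"area number known:     1 in {area_known:,}")
print(f"group number narrowed: 1 in {group_narrowed:,}")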
To be sure, the work by Mr. Acquisti and Mr. Gross suggests a potential, not actual, risk. But their unpublished research explores how criminals could use similar techniques for large-scale identity-theft schemes.
More generally, privacy advocates worry that the new frontiers of data collection, brokering and mining are largely unregulated. They fear “online redlining,” where products and services are offered to some consumers and not others based on statistical inferences and predictions about individuals and their behavior.
The F.T.C. and Congress are weighing steps like tighter industry requirements and the creation of a “do not track” list, similar to the federal “do not call” list, to stop online monitoring.
But Jon Kleinberg, a professor of computer science at Cornell University who studies social networks, is skeptical that rules will have much impact. His advice: “When you’re doing stuff online, you should behave as if you’re doing it in public — because increasingly, it is.”