Mar 30, 2012 (05:03 AM EDT)
Google Big Tent: Regulation Vs. Personal Responsibility

Read the Original Article at InformationWeek

Google gathered a group of technologists, academics, and artists Thursday to discuss how individuals can improve their Internet literacy and become more active digital citizens.

The event, called Google Big Tent, grew out of the company's European Zeitgeist event last year, and has become a separate periodic gathering focused on digital policy and censorship.

David Drummond, Google's SVP and chief legal counsel, characterized this first Big Tent in the United States as a platform for the discussion of issues facing the Internet community. Noting the economic and social benefits of the Internet, he also acknowledged the Internet's dark side.

"Technology has not magically solved our pressing social problems," he said.

Yet in grappling with these problems and the way they're manifested online, Drummond said that debates about remedies quickly devolve into discussions of regulation or ways to constrain private companies.


Drummond, as Google's top legal representative, often has a ringside seat at such battles. Much of the Federal Trade Commission's interest in consumer privacy stems from the boundary-pushing--some would argue rule-breaking--business practices of Google and competitors like Facebook. Google is also facing antitrust scrutiny in the United States and Europe. The prospect of future regulation that will affect Google and its peers is very real.

In that context, it's hardly surprising that Google might want to focus on how individuals can take greater responsibility for their online experience. If Internet users, for example, learn how to protect their own privacy, governments may not ask as much of Google.

Drawing a contrast between the drumbeat for regulation and the absence of advocacy for end-user responsibility, Drummond explained, "It's this attention deficit to the questions of digital citizenship and digital literacy that we hope to address today."

At a time when the White House is talking about a Consumer Privacy Bill of Rights, Google wants to talk about consumer digital responsibility. Someone's got to do it.

What might that look like? Fourteen-year-old Adora Svitak, author of no fewer than three books and an advocate of techno-youth empowerment, rejected the academic nanny culture that bans before it looks. "I prefer the 'touch the stove' approach," she said during her time on stage.

Touching a hot stove was, she recalled, something her mother allowed her to do in her youth--not long ago--to educate her. She recounted it as a slightly painful experience that taught her more effectively than a more risk-averse approach might have. Perhaps her fourth book will be called Technology for the Tiger Mother, if there's room for two in Amy Chua's school of hard knocks.

But Svitak's endorsement of personal responsibility online has an undeniable logic, given that, despite lawmakers' threats to tame the Internet, there isn't really an alternative at the moment.

"Facebook doesn't give you a popup [when you post] that says, 'Are you going to regret this later?'" she said. "For now, we really have to be our own privacy filters."

If the youth/tech advocacy gig doesn't work out, Svitak could become a spokesperson for an ad industry trade group without missing a beat.

Representatives from YouTube, Twitter, and the Pew Research Center followed Svitak and provided some insight into the difficulty of identifying harmful content, the needles in the haystack of tweets, videos, and other user-supplied content.

As an example, it has become increasingly common for young girls to use YouTube to upload "Am I Pretty?" videos. This trending genre involves a teen or tween video maker who seeks votes of approval for her looks from the Internet community and is invariably exposed to abuse from one or more of the Internet's seemingly infinite supply of jerks.

YouTube, of course, hears from parents and those concerned about the vulnerable young that it should do more to protect users from others and from themselves.

"The question we face is, 'Should a private company take away the right of a girl to ask if I am pretty?'" asked Victoria Grand, YouTube director of global communications and policy.

For some, the answer is obvious; for others, not so much. And for every issue that seems easy to solve, there's another more vexing conundrum awaiting those trying to keep the world safe from the rest of the world.

There are videos of people cutting themselves, though most are warnings not to do so. There are videos of people inhaling cinnamon and recording their gasping, to the chagrin of medical professionals. There are tweets that may be insults, hate speech, or just banter among friends.

"We have a very active 'My Little Pony' role-playing community," explained Twitter trust & safety director Del Harvey, in an apparent attempt to highlight the ambiguity of tweet policing.

It's not easy figuring out whether the brains strewn on a street in a YouTube video represent hard-hitting journalism from Syria or a prank pulled by someone who purchased organ meat from a butcher shop.

"What do you do with man boobs?" Grand asked.

What indeed. It's a question, though, that does make the push for regulation easier to understand.
