In case you missed it, check out the story on Google's gatekeepers in yesterday's Sunday Times magazine. The piece, by Jeffrey Rosen, looks at when and why Google elects to block things, specifically on YouTube.
It touches on examples from foreign countries and from this one—a Michelle Malkin video being yanked from YouTube for no apparent good reason. As I said last week while writing on this subject, I like Google, but it still troubles me, especially the fact that a small group of Google employees has the power to decide what gets yanked from YouTube and what doesn't.
From the Times:
"To love Google, you have to be a little bit of a monarchist, you have to have faith in the way people traditionally felt about the king," Tim Wu, a Columbia law professor and a former scholar in residence at Google, told me recently. "One reason they're good at the moment is they live and die on trust, and as soon as you lose trust in Google, it's over for them." Google's claim on our trust is a fragile thing. After all, it's hard to be a company whose mission is to give people all the information they want and to insist at the same time on deciding what information they get.
I love that analogy. But like I said last week, what gives me pause about Google isn't that it's a nefarious company but rather that it's a powerful business run by very fallible humans. Maybe the current Google leadership has all of our interests at heart, but that doesn't mean the next one will.
"Right now, we're trusting Google because it's good, but of course, we run the risk that the day will come when Google goes bad," Wu told me. In his view, that day might come when Google allowed its automated Web crawlers, or search bots, to be used for law-enforcement and national-security purposes. "Under pressure to fight terrorism or to pacify repressive governments, Google could track everything we've searched for, everything we're writing on gmail, everything we're writing on Google docs, to figure out who we are and what we do," he said. "It would make the Internet a much scarier place for free expression." The question of free speech online isn't just about what a company like Google lets us read or see; it's also about what it does with what we write, search and view.
Wu's fears that violations of privacy could chill free speech are grounded in recent history: in China in 2004, Yahoo turned over to the Chinese government important account information connected to the e-mail address of Shi Tao, a Chinese dissident who was imprisoned as a result. Yahoo has since come to realize that the best way of resisting subpoenas from repressive governments is to ensure that private data can't be turned over, even if a government demands it. In some countries, I was told by Michael Samway, who heads Yahoo's human rights efforts, Yahoo is now able to store communications data and search queries offshore and limits access of local employees, so Yahoo can't be forced to turn over this information even if it is ordered to do so.
Isolating, or better still, purging data is the best way of protecting privacy and free expression in the Internet age: it's the only way of guaranteeing that government officials can't force companies like Google and Yahoo to turn over information that allows individuals to be identified. Google, which refused to discuss its data-purging policies on the record, has raised the suspicion of advocacy groups like Privacy International. Google announced in September that it would anonymize all the I.P. addresses on its server logs after nine months. Until that time, however, it will continue to store a wealth of personal information about our search results and viewing habits — in part to improve its targeted advertising and therefore its profits. As Wu suggests, it would be a catastrophe for privacy and free speech if this information fell into the wrong hands.
"The idea that the user is sovereign has transformed the meaning of free speech," Wu said enthusiastically about the Internet age. But Google is not just a neutral platform for sovereign users; it is also a company in the advertising and media business. In the future, Wu said, it might slant its search results to favor its own media applications or to bury its competitors. If Google allowed its search results to be biased for economic reasons, it would transform the way we think about Google as a neutral free-speech tool. The only editor is supposed to be a neutral algorithm. But that would make it all the more insidious if the search algorithm were to become biased.
Like I said, the whole piece is worth a read.