January 2012


Just a quick follow-up on the discussion of SOPA; people keep asking me what kind of legislation would be more appropriate than SOPA and PIPA, and might have a better chance of gaining the support of the technology industries, users, and Congress. I’m not in the business of writing laws, but as a start, my sense is that there are two kinds of infringement. First, there are underground sites and networks dedicated to trading copyrighted music, software, games, and movies; they are determined to elude regulation, they often move offshore or spread their resources across national jurisdictions to make prosecution harder, and they are technologically sophisticated enough to work with numerical IP addresses alone, set up mirror sites, and move when one site gets shut down. The second kind of infringement is when some fan, who may not know or appreciate the rules of copyright, uploads a clip to YouTube.

The mistake the entertainment industry continues to make is that they want to stop both kinds of piracy, and they seem unwilling to admit that the two are different and to start dealing with them as separate problems, with different tools and a different “threat level” in their rhetoric. SOPA was problematic in so many respects, but in particular because it tried to address both kinds of piracy at once, and failed to handle either appropriately. The kinds of measures it proposed for “rogue, foreign websites” (let’s assume they meant the hardcore piracy networks) wouldn’t be enough: if the DoJ got a court order to remove these sites from Google’s search and the major ISPs, you or I might not be able to access them. But determined file traders don’t find them through Google. And SOPA got so much blowback because it also tried to sweep in the second kind of piracy at the same time, which is in fact handled relatively well by the “notice-and-takedown” rules that already apply to content platforms like YouTube.
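To make that technical point concrete, here is a minimal sketch, using a placeholder hostname and a reserved documentation IP address rather than any real site, of why removing a domain from DNS and search listings does not stop someone who already knows the numeric address:

```python
import socket
import urllib.request

# Hypothetical blocked domain -- a placeholder, not a real site.
hostname = "example-rogue-site.invalid"

try:
    # An ordinary user depends on DNS: if resolvers refuse to answer,
    # the site effectively vanishes for them.
    ip = socket.gethostbyname(hostname)
    print(f"DNS says {hostname} lives at {ip}")
except socket.gaierror:
    print(f"DNS lookup for {hostname} failed; the site is 'invisible'")

# A determined user who already knows the numeric address skips DNS
# entirely and connects to the IP directly, supplying the Host header
# by hand. (203.0.113.7 is a reserved documentation address, RFC 5737.)
known_ip = "203.0.113.7"
request = urllib.request.Request(
    f"http://{known_ip}/",
    headers={"Host": hostname},
)
# urllib.request.urlopen(request)  # would reach the server regardless of
#                                  # any DNS- or search-level block
```

The same logic applies to mirror sites and offshore hosting: the blocking happens at the layer of names and listings, not at the layer of connectivity.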

It’s not only that these two kinds of piracy are so different that they require distinctly different approaches: it’s that the entertainment industry needs to let go of trying to squelch them both in the same breath. If they could start distinguishing the two, and make clear that they don’t want to catch YouTube and Facebook in their net in the process, I think the technology industries would be more willing to develop and uphold gentle norms and procedures for the kinds of infringement that may happen on their networks.

[Cross-posted on Culture Digitally and MSR Social Media Collective]

Since I supported the blacking out of the MSR Social Media Collective blog, to which I sometimes contribute, and of Culture Digitally, which I co-organize, in order to join the SOPA protest led by the “Stop American Censorship” effort, the Electronic Frontier Foundation, Reddit, and Wikipedia, I thought I should weigh in with my own concerns about the proposed legislation.

While it’s reasonable for Congress to look for progressive, legislative ways to enforce copyrights and discourage flagrant piracy, SOPA (the Stop Online Piracy Act) and PIPA (the Protect IP Act) now under consideration are a fundamentally dangerous way to go about it. Their critics have raised many compelling objections [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]. But in my eyes, these bills are most dangerous because of their underlying logic: policing infringement by rendering sites invisible.

Under SOPA and PIPA, if a website is even accused of hosting or enabling infringing materials, the Attorney General can order search engines to delete that site from their listings, require ISPs to block users’ access to it, and demand that payment services (like PayPal) and advertising networks cancel their accounts with it. (This last step can even be taken by copyright holders themselves, with only a good-faith assertion that the site in question is infringing.) What a tempting approach to policing the Internet: rather than pursuing and prosecuting this site and that site, in an endless game of whack-a-mole, just turn to the large-scale intermediaries, and use their power to make websites available in order to make them unavailable. It shows all too plainly that the Internet is not some wide-open, decentralized, unregulatable space, as some have believed. But it undercuts the longstanding American tradition of how to govern information, which has always erred on the side of letting information, even abhorrent or criminal information, be accessible to citizens, so we can judge for ourselves. Making it illegal to post something is one thing, but wiping the entire site clean off the board as if it never existed is another.

Expunging an infringing site so that it cannot be found is problematic in itself, a clear form of “prior restraint.” But it is exacerbated by the fact that whole sites might be rendered invisible on the basis of just bits of infringing content they may host. This is particularly troubling for sites that host user-generated content, where one infringing thread, post, or community might co-exist amidst a trove of other legitimate content. Under SOPA and PIPA, a court order could remove not just the offending thread but the entire site from Google’s search engine, from ISPs, and from ad networks, all in a blink.

These are the same strategies that China, Iran, and Vietnam currently use to restrict political speech (as prominent critics have charged), and that were recently used against Wikileaks right here at home. When Amazon kicked Wikileaks off its cloud computing servers, when Wikileaks was de-listed by one DNS operator, and when MasterCard and PayPal refused to take donations for the organization, they were attempting to render Wikileaks invisible before a court ever determined, or even alleged, that Wikileaks had broken any laws. So it is not hypothetical that this tactic of rendering invisible endangers not only commercial speech and the expressive rights of individual users, but vital, contested political speech. SOPA and PIPA would simply organize these tactics into a concerted, legally enforced effort to erase, one that all search engines and ISPs would be obligated to carry out.

A lighthearted aside: in the film Office Space, the soulless software company chose not to fire the hapless Milton. Instead, they took away his precious stapler, moved him to the basement, and simply stopped sending him paychecks. We laughed at the blank-faced cruelty, because we recognized how tempting this solution would be, a deft way to avoid having to fire someone to their face. Congress is considering the same “Bobs” strategy here. But while it may be fine for comedy, this is hardly the way to address complex legal challenges around the distribution of information, challenges that should be dealt with in the clear light of a courtroom. And it risks rendering invisible elements of the web that might deserve to remain.

We are at a point of temptation. The Internet is both so powerful and so unruly because anyone can add their site to it (be it noble or criminal, informative or infringing) and it will be found. It depends on, and presumes, a principle of visibility. Post the content, and it is available. Request it, from anywhere in the world, and the DNS servers will find it. Search for it in Google, and it will appear. But, as those who find this network most threatening come calling, with legitimate (at least in the abstract) calls to protect children / revenue / secrets / civility, we will be sorely tempted to address these challenges simply by wiping them clean off the network.

This is why the responses to SOPA and PIPA, most prominently the January 18 blackouts by Reddit, Wikipedia, and countless blogs, are so important. Removing their content, even for a day, is meant to show how dangerous this forced invisibility could be. It should come as no surprise that, while many other Internet companies have voiced their concerns about SOPA, it is Wikipedia and Reddit that have gone the farthest in challenging the law. Not only do they host, i.e. make visible, an enormous amount of user-generated content; they are themselves governed in important ways by their users. Their decisions to support a blackout were themselves networked affairs that benefited from all of their users having the ability to participate, and that recognized this commitment to openness as part of their fundamental mission.

Whether you care about the longstanding U.S. legal tradition of information freedoms, or the newly emergent structural logic of the Internet as a robust space of public expression, both require a new and firm commitment in our laws: to ensure that the Internet remains navigable, that sites remain visible, that pointers point and search engines list, regardless of the content. Sites hosting or benefitting from illegal or infringing content should be addressed directly by courts and law enforcement, armed with a legal scalpel that’s delicate enough to avoid carving off huge swaths of legitimate expression. We might be able to build a coalition of content providers and technology companies willing to partner on anti-piracy legislation, if copyright holders could admit that they need to go after the determined, underground piracy networks bent on evading regulation, and not in the same gesture put YouTube at risk for a video of a kid dancing to a Prince tune — there is a whole lot of middle ground there. But a policy premised on rendering parts of the web invisible is not going to accomplish that. And embracing this strategy of forced invisibility is too damaging to what the Internet is and could be as a public resource.

(Cross-posted at Culture Digitally and MSR’s Social Media Collective.)

MoveOn.org began circulating this infographic yesterday; the (much more detailed) original is from OWNI.eu. It tells a now-familiar-but-still-important story about the increasing consolidation of commercial media (and, by implication, raises a concern about its impact on public discourse). Despite the times, the attention here is not on online media or new forms of information distribution, though that attention would shift the image only slightly: where might we add Hulu as a “notable property”… under News Corp, GE, and Disney? We might also have to add Google, Apple, and Facebook. But would that change the basic concern? Would it shift the “staggering” percentage listed at the top? And what does “control” mean when we talk about not just content providers but distributors, platforms, and networks as well?

(Cross-posted at Hacktivision and Culture Digitally)

Benjamin Franklin, “Apology for Printers” (1731)

I’m going back to read some scholarship on journalistic objectivity; this quote was cited in Michael Schudson’s essay “The Objectivity Norm in American Journalism.” This is the best articulation I’ve come across of the idea of the “marketplace of ideas” and, with it, the call for editorial neutrality. Unfortunately, I can only agree with the first half of the statement. Still, well said.

I was interviewed by the NPR program To The Best Of Our Knowledge, for a program on trends. It just went up, if you want to take a listen: “What’s Hot and Why Not?” Mine is the first segment. Also pretty cool that they paired me with Grant McCracken, Butch Vig, and Dr. Seuss! This was an extension of my Culture Digitally blog post that first addressed Twitter Trends, “Can an algorithm be wrong?” (It was also cross-posted here and on Microsoft Research’s “Social Media Collective” blog, and was reprinted by Salon.com.) I also just finished a piece for Limn that pushes on these concerns a bit more.