The New Guidelines for User-Generated Content Services such as MySpace: Why Some Will Predictably Inhibit "Fair Use"
By JULIE HILDEN
Monday, Nov. 12, 2007
Recently, a number of content producers (CBS, Disney, Fox, NBC Universal, and Viacom) and a number of websites hosting user-generated content (Daily Motion, MySpace, and Veoh) voluntarily agreed among themselves to abide by a set of principles governing user-generated content (UGC). Microsoft, too, has signed on.
Some of these principles are plainly correct and in everyone's (including users') interest, as I will explain. However, others will predictably end up curtailing the amount of "fair use" of copyrighted material that occurs on UGC sites, and thus inhibiting freedom of speech and artistic freedom. (For more on "fair use," see my column from May 16). Thus, while these latter principles may still arguably be the best way to police infringement, it is important to note that their effectiveness comes at a potentially high price.
Put another way, if users had had a formal seat at the negotiation table, the Guidelines might have tilted much more strongly toward "fair use." The content producers' interest was to protect copyright. The UGC sites had the mixed interest of avoiding lawsuits for vicarious and contributory copyright infringement (theories I discussed in a previous column), and also pleasing users. Only users, however, had a direct, unqualified interest in ensuring that they could make "fair use" of copyrighted material in uploading their work to UGC services.
The Part of the Guidelines that Should Be Uncontroversial
Let's start with that part of the Guidelines that is, plain and simple, a set of good ideas that actually will benefit everyone, including users.
It makes a great deal of sense, for example, for the Guidelines to ask UGC services to continually update the software they use to find potentially copyright-infringing uploads as the relevant technology advances and improves. Relying solely on personnel to review vast numbers of uploads would obviously be costly and ineffective, and UGC services should keep pace with advancing technology in policing their uploads for genuine copyright infringement.
At the same time, the Guidelines are wise to allow UGC services to use personnel to review uploads in cases where technology alone is not producing the best results. Where "fair use" is at issue, the decision may ultimately be a judgment call that only a person can make.
Unfortunately, however, allowing actual human beings to decide "fair use" issues is virtually the only way in which the Guidelines cut in favor of "fair use." In every other way, they cut against it in practice, while still, in several instances, paying lip service to the idea.
The First Threat to "Fair Use": Filtering of Uploads
Here are several key ways in which the Guidelines put "fair use" in jeopardy:
First, the Guidelines advocate filtering content at the upload stage, not once it has already appeared on the UGC service. The obligation, as the Guidelines put it, is to "block… content [that falls under known copyrights] before that content would otherwise be made available."
The pragmatic reason for this rule is clear: When infringing content is uploaded, and can be copied, any later remedies may be, in effect, closing the barn door once the horses are already gone.
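The difference between the two enforcement models the column contrasts can be sketched in a few lines of code. This is a hypothetical illustration only -- the function names and the matching logic are invented for this sketch, and the actual filtering systems the Guidelines contemplate are far more elaborate:

```python
# Hypothetical sketch contrasting the two enforcement models discussed above.
# All names here are invented for illustration; real filtering systems use
# proprietary fingerprinting, not a simple boolean match.

def upload_stage_filter(upload, matches_reference):
    """The Guidelines' model: block matching content BEFORE it goes live."""
    if matches_reference(upload):
        return "blocked before publication"  # user never reaches an audience
    return "published immediately"

def retrospective_remedy(upload, matches_reference, publish, take_down):
    """The alternative model: publish first, search and remove later."""
    publish(upload)                 # content is live -- and copyable -- at once
    if matches_reference(upload):
        take_down(upload)           # the horses may already be out of the barn
```

The sketch makes the trade-off concrete: under the first model, a false positive silences speech before anyone sees it; under the second, infringing copies may spread before the takedown occurs.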
Yet the barn door/horses argument may prove too much, because arguably the entire Internet is the barn, and the door will always be open somewhere. If popular sites filter uploads, then less popular sites may become more popular by employing retrospective remedies (that is, by searching what is uploaded, rather than filtering and blocking uploads) and thus hosting sexier, more cutting-edge "fair use" content that cannot be found elsewhere. Moreover, scofflaw and offshore sites may become popular by simply promising not to filter or block. If MySpace becomes tame or if there are myriad complaints about blocking, then fickle teens could easily switch their allegiance to another site.
Thus, the upshot of the decision to opt for filtering of uploads may be to simply harm MySpace and its users, while offering copyright owners no meaningful protection. (Why, then, did MySpace agree to filtering? The reason may simply have been the fear of an incredibly costly vicarious and contributory copyright infringement suit. It's not a defense to infringement that the infringing material is easily available all over the rest of the Internet.)
Does filtering actually hurt MySpace users? Heck, yes -- and it will predictably hurt the "marketplace of ideas" too. When it comes to free speech, delay in dissemination can be disastrous: Suppose a MySpace user wants to upload a commentary on the previous night's Presidential debate that makes "fair use" of copyrighted material. The ability to upload that same commentary a week later, when the news cycle has moved on, is far less valuable from a free speech perspective. Thus, the choice of filtering ahead of time prevents infringement, but only at the cost of inhibiting speech.
Granted, the Guidelines also say that "Copyright Owners and UGC Services should cooperate in developing reasonable procedures for promptly addressing conflicting claims with respect to Reference Material and user claims that content that was blocked by the Filtering Process was not infringing or was blocked in error." (Emphasis added). But what does "promptly" mean, exactly?
I wouldn't be very optimistic about the chances of a truly prompt resolution here. After all, a wise UGC service would need to involve attorneys at some point, for certain materials, since a "fair use" determination is ultimately an application of law to fact. And everyone knows that once attorneys are involved, things tend to proceed quite slowly. Granted, attorneys are costly, so a more realistic and likely solution is to have staff make the call in the first instance, and then pass difficult issues on to attorneys. Still, in difficult cases, when attorneys are indeed involved, time may tick away.
After all, from the attorney's perspective, quickly approving copyright-infringing material, based on an erroneous call that it is "fair use," may result in a multimillion-dollar malpractice suit when millions of users view the infringing material and the copyright owner sues the UGC service. (Other legal questions remain, as well; in some cases, the Digital Millennium Copyright Act's safe harbor for sites that simply host content may protect a UGC service, even if the "fair use" exception to copyright law does not.)
By comparison, if MySpace were a government entity controlled by the First Amendment, it simply would not be able to pre-filter speech this way. Rather, because "prior restraints" are disfavored, it would have to allow speech to happen, and then later order the speaker to pay damages if necessary. Obviously, MySpace is not a government entity, so the First Amendment does not apply. But it's still notable when, in the free speech arena, private companies agree to do something that a government agency could not impose. Moreover, it's also worth noting that if we have a modern town square for Generation Y, it's probably MySpace -- suggesting that private entities now operate the kind of forum that the government once hosted, and in which First Amendment rules once applied.
The Second Threat to "Fair Use": The Sweep-In System
Second, the Guidelines adopt a "sweep in" system under which material cannot be used unless the copyright owner expressly says it can be. More precisely, they state that if the copyright owner is silent, then the UGC service "should block content" that matches the copyrighted material. To allow certain users to avoid the blocking, the copyright owner can provide a "white list," but if it does not, then all would-be users are out of luck.
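The sweep-in default can be reduced to a simple decision rule, sketched below. Again, this is purely illustrative -- the fingerprint matching, the reference database, and the white-list structure are assumptions invented for the sketch, not the actual systems the signatories use:

```python
# Hypothetical illustration of the Guidelines' "sweep-in" default.
# An owner's SILENCE means "block"; only an express white list lets
# a matching upload through. All data structures here are invented.

def sweep_in_decision(upload_fingerprint, reference_db, white_lists, uploader):
    """Return 'allow' or 'block' for one upload under a sweep-in default."""
    for work_id, ref_fingerprint in reference_db.items():
        if upload_fingerprint == ref_fingerprint:  # matches a known work
            if uploader in white_lists.get(work_id, set()):
                return "allow"   # owner affirmatively white-listed this user
            return "block"       # owner said nothing, so the default wins
    return "allow"               # no match against any registered work
```

Note where the inertia lies: every branch the owner never bothers to configure resolves to "block," which is exactly the pull toward over-blocking the column describes.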
In litigation I discussed in a prior column, the Stanford Center for Internet and Society (CIS) challenged the current U.S. copyright system insofar as it automatically sweeps in all material -- even a napkin doodle -- unless the author expressly states an intention to have the material come under a "creative commons" license. CIS argued in that case that the "sweep in" system was a mistake, and similar arguments can be made here as well.
Few copyright owners, for example, are likely to take the trouble to affirmatively "white list" college students' class projects, but if there were categories to check off, one would hope that few would actually "black list" educational uses either.
In sum, inertia is a powerful force, and the Guidelines' inertial pull is in favor of filtering content that will then never see the light of day (or emerge only after the event on which it comments is long past). That pull is also in favor of automatically blacklisting content that copyright holders might happily white-list if they were required to focus on the issue. In the Internet's new ocean of content, we deserve different -- and more pro-free-speech -- tides.