By James Rosenfeld
Apple’s iPhone may currently be the most popular platform for which third parties can develop software applications, but it has growing company from Google, RIM, and others. Tablets like the Apple iPad and, to a lesser extent, e-readers like the Amazon Kindle also host third-party apps. Consumer electronics manufacturers have also devised platforms that run on digital televisions, Blu-ray players, gaming consoles, and separate set-top boxes like the Boxee Box, with several manufacturers unveiling new or planned platforms for third-party apps at this year’s Consumer Electronics Show. Third-party applications may soon appear on almost any imaginable consumer electronics device.
The range of applications available on phones, tablets, and other devices is staggering. An iPhone user can soak up all types of media, from e-books and periodicals to music, movies, and games; consult information and reviews on restaurants and shows; keep track of calories, mileage, or stocks; check Facebook, Twitter, Tumblr, or Foursquare; and create, share, and mash up images, video, and music, among hundreds of thousands of iPhone apps. Apple’s App Store has notched billions of downloads. While the repertoires of apps available on Google’s Android Market, Nokia’s Ovi Store, RIM’s BlackBerry App World, and other systems are much smaller, these and other providers are racing to compete in the app marketplace.
This profusion of applications requires analysis of possible liability: Can the providers of operating systems and other systems software – the Apples, Googles, and RIMs of the world – be held liable for the content provided or transmitted on the many apps they make available to users? Some apps may raise more red flags than others: an iPhone-toting bar-crawler can employ his “Can I Drive Yet” application to determine whether his blood alcohol content is legal and then click on “Trapster” to avoid police speed traps and red light cameras on his way home. Yet the seemingly limitless ways in which content can be provided through more run-of-the-mill apps, e.g., through file-sharing, reviews, comments, mash-ups and the like, create the potential for intellectual property, libel, and privacy claims, among others.
Providers can and should minimize their exposure through their agreements with developers. They can set restrictions on types of content, require representations and warranties that developers will abide by such limits, require indemnification for liability based on forbidden uses, and set up an approval regime. Providers can also attempt to pass on or shield themselves from liability vis-à-vis users by incorporating content rules, indemnification provisions, and the like into their terms of service. While these mechanisms can reduce exposure to liability, they cannot prevent plaintiffs from attempting to assert claims – meaning that notwithstanding contractual language, courts will be asked to decide the liability of those who operate the platforms or operating systems on which apps exist for intellectual property infringement, invasion of privacy, libel, or other content-related torts.
This article focuses primarily on whether the two “safe harbors” that have been created to protect hosts of user-generated online content – in Section 512 of the Digital Millennium Copyright Act (DMCA) and Section 230 of the Communications Decency Act (CDA) – are likely to protect platform providers from liability for the content of apps. Test cases have been launched:
- In one case, a game developer claimed that another developer copied his game in the form of an app which ran on Facebook. He sued not just the other developer but Facebook, for copyright infringement and other claims. While the court initially dismissed the claims against Facebook for failure to state a claim, it subsequently permitted a beefed-up contributory infringement claim to go forward against Facebook. Facebook asserted the DMCA safe harbor defense, but the claims were dismissed pursuant to a settlement before the defense could be litigated.
- In another pending case, an individual who records nature sounds asserted a copyright claim (lumping together direct and indirect theories of liability) against the creator of a series of iPhone apps allegedly containing his bird sounds (the “iBird applications”) – and against Apple for distributing the app. Like Facebook, Apple has asserted DMCA safe-harbor defenses, which have yet to be litigated.
- In a third case, a putative class of iPhone and iPad users has sued Apple and a number of app developers, alleging that the developers paired unique device identification numbers with other information gleaned through the apps and transmitted the combined information to advertising networks, and that Apple is liable because it exerts significant control over the development process and thus aids and abets this illicit taking and transmission of personally identifying information. Apple has not yet answered the complaint, so it remains to be seen whether it will assert Section 230 as a defense.
What we know so far about Section 230 and DMCA § 512 suggests that online providers can screen submissions, make copies of third-party content for purposes of displaying and transmitting it to users, alter the format of such content, allow users to view or download it, sell advertisements against the content, or even sell the content itself to users (so long as their business model isn’t focused on profiting from the exploitation of bootleg or otherwise illegal copies) – and can even have generalized knowledge of infringing material on their services. But they risk losing some or all of these protections if their businesses are built on selling illegal content, if they are actively involved in creating or eliciting illegal third-party content, or if they look the other way upon receipt of specific notice of infringement. It remains to be seen whether and when setting development milestones and standards and otherwise substantially involving themselves in the app review process crosses that line. Some of the pending cases may further clarify the picture.
Copyright Infringement and DMCA § 512
As the cases mentioned above show, plaintiffs may seek to impose primary or secondary liability for copyright infringement based on content disseminated within apps (by their developers) or through apps (by device users). Putting aside traditional copyright defenses, which of course may apply, we focus here on the safe harbor established by DMCA § 512(c), which bars money damages and limits injunctive relief for copyright infringement claims against service providers based on “Information Residing on Systems or Networks At [the] Direction of Users” – or user-generated content.
To take advantage of this safe harbor, service providers must not be aware of the presence of infringing material on their systems or networks or know any facts or circumstances that would make infringing material apparent, or (upon obtaining such knowledge or awareness) must act expeditiously to remove the purportedly infringing material (§ 512(c)(1)(A)); must not receive a financial benefit directly attributable to infringing activity, where the provider has the right and ability to control such activity (§ 512(c)(1)(B)); and, upon notice of claimed infringement, must respond expeditiously to remove or disable access to the material (§ 512(c)(1)(C)). Service providers must also designate an agent to receive notices of claimed infringement on their websites (§ 512(c)(2)), and must comply with other provisions of § 512 requiring accommodation of “standard technical measures” and the termination of repeat infringers’ accounts (§ 512(i)). The application of this safe harbor to the app world raises several key issues, discussed below.
The Southern District of New York’s opinion in last year’s YouTube case is helpful to platform providers in several ways. Facing the Viacom entities’ claims of direct and vicarious infringement against YouTube and Google based on YouTube’s hosting of thousands of copyright-protected excerpts of Viacom television programs, the court granted summary judgment for defendants based on the § 512(c) safe harbor. As discussed below, although YouTube’s services can be distinguished from Apple’s, the court made several points – particularly regarding the knowledge of a provider necessary to overcome the safe harbor and the nature of the functions that a provider may engage in without losing the protection – which are helpful to platform providers. However, a number of issues remain.
Is the provider of a platform through which users can download and use apps a “service provider” under § 512(c)? Section 512 defines “service provider” as “a provider of online services or network access, or the operator of facilities therefor,” which has been construed to include almost any website or other Internet-based entity. The provision’s legislative history states that this includes, “for example, services such as providing Internet access, e-mail, chat room and web page hosting services,” and even educational institutions, to the extent they perform such functions. Aside from websites, the Perfect 10 decisions indicate that providers of payment processing services, age verification systems for adult websites, web hosting, and related Internet connectivity services all qualify for such protection. The legislative history states that a “broadcaster or cable television system or satellite television service would not qualify” unless it performs functions described by the statute, although as these entities have evolved, many now provide “online services or network access” – in which case they too would qualify as “service providers” eligible for the § 512(c) safe harbor. Apps are downloaded from online marketplaces like the Apple App Store and BlackBerry App World, but even after that most rely on continued Internet contact. For instance, apps that report news or other up-to-date information (Wall Street Journal, Weather Channel Max, MLB at Bat, Flight Track Pro) or rely on continuous access to an online database (Netflix, Yelp, Epicurious) would clearly seem to provide “online services” and thus be “service providers” under § 512.
It is not clear whether apps that, after being downloaded, require little or no Internet connectivity – these range from some games and novelties (Scrabble, Flight Control, Koi Pond) to art and music applications (Sketch Book Pro, Pianist Pro) – fall within the scope of the statute. (On the other hand, these self-contained apps may need the safe harbor far less; while they may appear on the platform “at the direction of a user” (as discussed in the next section), they do not permit device users to upload and exchange content – so the content that appears on these apps can be vetted by the provider and thus regulated much more easily). Providers may assert other defenses as to these apps, but if they are not online service providers they may not take refuge in the DMCA safe harbor.
Does an app reside on a platform “at the direction of a user”? Even if the operator of a platform qualifies as a service provider, apps must also reside on its system or network “at the direction of a user” in order to qualify for § 512(c) protection. An app created entirely by a developer without the provider’s involvement would fit this bill. However, if providers give direction to and/or work closely with developers on the creation of apps, the apps might be deemed outside the scope of this language; in other words, a court might hold that the platform is providing content rather than distributing content at a user’s direction. The case law makes clear both that providers can review, copy, and recast (in a different format) third-party content and that they can display, transmit, or otherwise facilitate user access to that content without exceeding the bounds of Section 512(c). Still, it is not yet clear at what point a provider’s involvement in the development of an app becomes substantial enough to remove it from this scope. Thus, it remains to be seen whether a provider that helps conceive of an app, sets development milestones or standards, reviews and gives feedback on plans or prototypes, and/or optimizes apps for its own system is merely hosting content “at the direction of a user” and is therefore eligible for the safe harbor of Section 512(c). Generally, the less involved providers are in conceptualizing or developing the content of apps, the more likely it is that the safe harbor will apply.
Do providers have actual knowledge of infringement or are they aware of “red flags”? One potential barrier to DMCA protection is whether platform providers have sufficient knowledge of infringing activity, which could take them outside of the statute’s protection. One court stated:
The DMCA was enacted both to preserve copyright enforcement on the Internet and to provide immunity to service providers from copyright infringement liability for passive, automatic actions in which a service provider’s system engages through a technological process initiated by another without the knowledge of the service provider. This immunity, however, is not presumptive, but granted only to innocent service providers who can prove they do not have actual or constructive knowledge of the infringement, as defined under any of the three prongs of 17 U.S.C. § 512(c)(1). The DMCA’s protection of an innocent service provider disappears at the moment the service provider loses its innocence, i.e. at the moment it becomes aware that a third party is using its system to infringe. At that point, the Act shifts responsibility to the service provider to disable the infringing matter….
Thus, as discussed above, service providers must have neither actual knowledge of infringement nor awareness of “red flags” indicating infringement, and if they do have such knowledge they must act expeditiously to address the problem.
The YouTube case addressed this issue thoroughly, holding that the requirement that there be either “actual knowledge” of infringing activity or “facts or circumstances” indicating such activity calls for “knowledge of specific and identifiable infringements of particular individual items” rather than knowledge of a generalized practice of infringement or a proclivity of users to post infringing materials. Nor, the court made clear, is the safe harbor conditioned on a provider monitoring the content placed on its service, seeking facts indicating infringing activity, or even investigating upon receipt of a generalized objection from a third party. The burden of policing infringement stays entirely with the copyright owner.
Even though trademark infringement is not covered by either of the statutes addressed here, the Second Circuit’s decision in Tiffany v. eBay creates something like a common-law equivalent to the DMCA safe harbor for trademark infringement. The court affirmed the dismissal of trademark claims against eBay based on eBay’s advertising and listing of counterfeit Tiffany merchandise, holding that a service provider must have knowledge of specific infringing activity rather than “a general knowledge or reason to know that its service is being used to sell counterfeit goods.” The YouTube court acknowledged that the DMCA applies the same principle “by a different technique.”
The fact that some platform providers have rigid approval mechanisms is both completely understandable and potentially detrimental on this score. For example, Apple reserves the right in its developer agreements to reject applications that contain obscene, pornographic, defamatory, or otherwise objectionable content, and has stated in response to an FCC inquiry that its 40-plus full-time reviewers “review every application…in order to protect consumer privacy, safeguard children from inappropriate content, and avoid applications that degrade the core experience of the iPhone.” While app providers like Apple may nonetheless benefit from the safe harbor – particularly with respect to user-generated content posted on apps – such public statements could make it more difficult to argue that the company lacked knowledge or missed red flags as to the developer-generated content that exists on an app from its inception. Freeman, the class action against Apple and its developers mentioned above, may test these waters.
Do providers have sufficient financial benefit and control that they lose the DMCA immunity? Another issue is whether providers receive a financial benefit directly attributable to infringing activity on their systems. Under Section 512(c)(1)(B), if a provider receives “a financial benefit directly attributable to the infringing activity” on its system or network and has the right and ability to control such activity, it loses the DMCA safe harbor for user-generated content.
It seems on its face that a platform provider like Apple (with significant involvement in the development process) would have the right and ability to control content provided on its apps, while less-involved providers like Google may not. Although the case law requires something more than the ability to block or remove access to posted materials, Apple and some other providers exert significant control over developers’ content through the contractual restrictions and screening processes they impose on developers. However, even as to Apple, YouTube suggests that content submitted onto or through apps by app users, rather than developers, would not satisfy this element absent specific notice by the copyright owner. The “right and ability to control” the activity requires item-specific knowledge of infringement.
Providers also clearly realize financial benefit: they typically take 30 percent of the revenue of paid apps on their systems, charge developers license fees, and sell their devices. (Developer fees and increased device sales accrue even from free apps.) The legal question is whether, if infringing activity takes place, this financial benefit is “directly attributable” to that infringement in the sense intended by § 512(c)(1)(B).
Legislative history counsels for “a common-sense, fact-based approach, not a formalistic one” in assessing whether a provider’s benefit is directly attributable to infringing activity. “In general, a service provider conducting a legitimate business would not be considered to receive a ‘financial benefit directly attributable to the infringing activity’ where the infringer makes the same kind of payment as non-infringing users of the provider’s service.” Thus, services requiring “a one-time set-up fee and flat, periodic payments for service” or “fees based on the length of the message…or by connect time” may be permissible, while services that charge more for access to infringing material would not. Nimmer suggests that charging much higher fees for access to bootlegged material would cross this line. Highlighting the availability of potentially infringing content in marketing or advertising materials or other communications would also risk liability. While these examples clearly cross the line, the line itself is nonetheless fuzzy. Where a provider realizes revenues from both infringing and noninfringing songs, articles, or images, will courts regard the resulting profits as “directly attributable to the infringing activity”? Despite the narrow interpretations cited above, courts may view “directly attributable” in a broader sense, raising the risk of providers’ liability for hosting infringing content.
Section 230 and Treating Platform Providers as Publishers of Apps
Plaintiffs may also assert libel, privacy, and other publication-based torts against providers of operating systems and other systems software based on content disseminated in or through apps on those systems. Section 230 of the CDA provides immunity to providers on most such claims.
Section 230(c)(1) establishes another safe harbor immunizing providers from liability for third-party content, providing in full: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Section 230 sets certain exceptions to this rule, stating that the safe harbor shall have no effect on, inter alia, any federal criminal statute, any consistent state law or any law pertaining to intellectual property. (CDA § 230(e)). The issues here are similar to those likely to come up in the DMCA context.
Are platform providers and app developers “interactive computer services” and “information content providers,” respectively? It seems certain that platforms and networks are generally “interactive computer services” and app developers are “content providers.” An interactive computer service is “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet….” (§ 230(f)(2)). One of the terms in that definition, “access software provider,” is further defined to mean “a provider of software…or enabling tools that do any one or more of the following: (A) filter, screen, allow, or disallow content; (B) pick, choose, analyze, or digest content; or (C) transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.” (§ 230(f)(4)). It would be difficult to imagine an app that does none of these things – they all transmit or display content or perform one of the other listed functions – although, again, the conclusion is less clear with respect to apps that simply function on a device without Internet contact.
An “information content provider” is “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” (§ 230(f)(3)). An app developer fits this definition because he or she creates or develops content provided through the Internet and/or the platform itself.
When is a provider being “treated as the publisher or speaker” of an app? Section 230’s immunity therefore bars any platform provider from being “treated as the publisher or speaker” of any content “provided by” an app creator. The critical issues in determining how the safe harbor applies in this context are (1) what types of claims treat the provider as a publisher or speaker of the app’s content, and (2) when is the content “provided by” the developer (rather than the provider), or more pertinently, when is the provider so involved in the creation of the app that it should be deemed the co-creator rather than merely the publisher of content created by another information content provider?
As to the first issue – which claims treat the provider as a publisher or speaker of third-party content – it is well established that Section 230’s safe harbor applies to libel and other torts arising out of the content of the published information, subject to certain narrow exceptions. Certain of the exceptions are unambiguous; e.g., it is clear that Section 230 may not immunize a provider from liability under a federal criminal statute (such as laws relating to obscenity or the sexual exploitation of children) or the Electronic Communications Privacy Act. However, there is considerably more room for argument over the exception for “any law pertaining to intellectual property.” While the intellectual property exception takes federal copyright, patent, and trademark claims out of the sweep of Section 230’s federal immunity (although copyright claims enjoy the benefit of the DMCA safe harbor), courts have disagreed over whether Section 230 provides immunity from intellectual property claims grounded in state law and, if it does, whether the right of publicity should be deemed intellectual property – in which case right of publicity claims would fall within the intellectual property exception (and thus outside the scope of the immunity, and thus be actionable). The nature of the claim and a plaintiff’s choice of jurisdiction can therefore affect the applicability of Section 230 significantly.
The other vexing issue influencing Section 230’s application to apps is whether third-party content is “provided by” not just the developer of the app but also the platform provider, where the provider takes an active role in shaping that content. Section 230 is broadly construed. Congress chose to grant the immunity “even where the interactive service provider has an active, even aggressive role in making available content prepared by others.” Thus, where a provider merely selects material for publication, or even where it engages in some editing, Section 230 nonetheless applies. On the other hand, if the provider’s involvement is extensive enough to make it the (or a) creator or developer of the content, then it does not enjoy immunity. Just as with the DMCA safe harbor, the more a provider involves itself in conceiving of, developing, or giving feedback to developers on apps, the less likely it is that there will be immunity. Providers should be aware that if they are deemed either to create illegal content themselves or to contribute materially to it, they will likely qualify as both service providers and content providers, and may be held liable.
The providers of platforms on which third-party apps are uploaded and downloaded should be aware of the potential liability and safe harbors for third-party content. Most will minimize their risk through developer agreements and careful screening, but none can anticipate the full range of uses and content that may appear on apps, or prevent plaintiffs from asserting claims based on such third-party content.
Providers may be eligible for protection from copyright liability under Section 512(c) of the DMCA, at least as to apps that provide online services and that exist on their system “at the direction of a user” rather than as a product of their own creative involvement, if they follow the various notice and takedown procedures that the regime requires. However, given providers’ typically rigorous screening of submissions, it may be difficult to establish the requisite lack of knowledge or awareness of infringement to qualify for the safe harbor. Furthermore, to avail themselves of this safe harbor, providers should take pains to avoid realizing financial benefit “directly attributable to” infringement.
Section 230 of the CDA will bar most libel and other publication-based claims, but it is important to note the exceptions to this statute. The intellectual property exception from this federal immunity is particularly ill-defined and hotly contested, creating uncertainty as to whether state intellectual property or right of publicity claims are protected. (Note that federal and possibly state trademark claims are expressly carved out of the immunity and – unlike copyright claims – do not enjoy the benefit of the DMCA either). Also, as with the DMCA safe harbor, providers of platforms that become involved in the development of apps beyond minor editing risk being seen not just as service providers but as content providers, and thereby losing the protection of Section 230.
In sum, active involvement in the development of apps could keep platform providers out of both safe harbors. Platform providers have broad latitude to review, screen, display, send, alter the format of, and sell (or sell ads against) user-generated content, so long as their business isn’t centered on making infringing content available for viewing, use, or sale and they are not directly eliciting or participating in the creation of illegal content (though the line between proper and improper involvement remains murky). Failure to respond to sufficiently specific notice may sacrifice the DMCA’s protection, although not Section 230’s. Excessive involvement in the development process or failure to respond appropriately to such notice may make for a truly killer app.
 A special wrinkle arises with government entities, which have in some cases negotiated exceptions to such TOS indemnities based on federal or state constitutional provisions which immunize them from liability. See, e.g., http://www.coloradoan.com/article/20110105/UPDATES01/110105014/Colorado-Facebook-reach-deal-over-legal-indemnity.
 Miller v. Facebook, No. 5:10-CV-00264-PVT, U.S. District Court, N.D. Cal.
 Stewart v. Apple, Inc., No. 2:10-cv-01012-RSL, U.S. District Court, W.D. Wash.
 Freeman v. Apple, Inc., No. 5:10-cv-05881-HRL, U.S. District Court, N.D. Cal.
 Viacom International Inc. et al. v. YouTube et al., 718 F. Supp. 2d 514 (S.D.N.Y. 2010) (on appeal).
 H.R. Rep. No. 105-551(II), at 64 (1998).
 Perfect 10, Inc. v. CCBill, LLC, 340 F. Supp. 2d 1077 (C.D. Cal. 2004); Perfect 10, Inc. v. CCBill, LLC, 488 F.3d 1102 (9th Cir. 2007).
 H.R. Rep. No. 105-551(II), at 64.
 See Costar Group Inc. v. Loopnet, Inc., 164 F. Supp. 2d 688, 701 (D. Md. 2001) (users’ real estate photos, screened by defendant’s employees, were nonetheless deemed stored “at the direction of a user”), aff’d, 373 F.3d 544 (4th Cir. 2004); Io Group, Inc. v. Veoh Networks, Inc., 586 F. Supp. 2d 1132, 1146-48 (N.D. Cal. 2008) (Internet television network “processe[d] user-submitted content and recast[ed] it in a format…readily accessible to viewers,” although it did not preview or select files; held stored “at the direction of a user”); UMG Recordings, Inc. v. Veoh Networks, Inc., 620 F. Supp. 2d 1081, 1086-92 (C.D. Cal. 2008) (Internet television network reproduced works in different formats and allowed users to access them by streaming or downloading; held stored “at the direction of a user”).
 See Viacom v. YouTube, 718 F. Supp. 2d at 526-27 (“the transmission, routing, or providing of connections for digital online communications” are within the safe harbor; quoting § 512(k)(1)(B) (definition of “service provider”)); Io Group, 586 F. Supp. 2d at 1148 (“means of facilitating user access to material” on a provider’s website do not forfeit safe harbor).
 See Costar, 164 F. Supp. 2d at 702 (“Although humans are involved rather than mere technology, they serve only as a gateway and are not involved in a selection process.”).
 Perfect 10, 340 F. Supp. 2d at 1086 (quoting legislative history; citations and quotations omitted; emphasis added).
 Viacom v. Youtube, 718 F. Supp. 2d at 523.
 Id. at 524, 525.
 Tiffany (NJ) Inc. v eBay Inc., 600 F.3d 93 (2d Cir. 2010).
 Id. at 107.
 718 F. Supp. 2d at 525.
 “Apple Answers the FCC’s Questions,” http://www.apple.com/hotnews/apple-answers-fcc-questions/ (printed January 26, 2010).
 See n.4 supra.
 E.g., Io Group, 586 F. Supp. 2d at 1151.
 Viacom v. YouTube, 718 F. Supp. 2d at 527.
 H.R. Rep. No. 105-551(II), at 54.
 Id. See also Costar Group, 164 F. Supp. 2d at 704-705; but see Perfect 10, 488 F.3d at 1117-18 (stating in dictum that relevant inquiry is “whether the infringing activity constitutes a draw for subscribers, not just an added benefit”).
 3 Nimmer § 12B.04[A][b] at 12B-55.
 See nn. 10-12, supra.
 See Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1101-02 (9th Cir. 2009) (“What matters is not the name of the cause of action – defamation versus negligence versus intentional infliction of emotional distress – what matters is whether the cause of action inherently requires the court to treat the defendant as the ‘publisher or speaker’ of content provided by another.”).
 47 U.S.C. §§ 230(e)(1), (4).
 47 U.S.C. § 230(e)(2).
 See Bruce E.H. Johnson and Kevin L. Vick, “Section 230 Immunity and State Right of Publicity Claims,” MLRC Bulletin 2008:4 (at http://www.dwt.com/portalresource/lookup/wosid/intelliun-1501-9202/media.pdf) at 69-76.
 Blumenthal v. Drudge, 992 F. Supp. 44, 52 (D.D.C. 1998).
 Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir. 2003) (provider’s “minor alterations” to defamatory material contributed by third party do not “rise to the level of ‘development’” necessary for liability); Blumenthal, 992 F.Supp. at 52 (provider’s exercise of “editorial control” over defamatory third-party content fell within immunity).
 See Fair Housing Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc) (where operator posted questionnaire requiring answers allegedly in violation of the Fair Housing Act and state laws, CDA did not immunize operator from liability for discriminatory responses). And see generally http://www.dwt.com/LearningCenter/Advisories?find=21902.
 Courts and legislatures are grappling with issues of provider liability everywhere. Last year, the French Supreme Court ruled in the “Tiscali Media/Dargaud Lombard, Lucky Comics” case that Tiscali, a media company, was liable for the posting of unauthorized copies of comics on personal Web pages stored by Tiscali online. However, that decision was made under law that has since been amended to comply with the EU Electronic Commerce Directive (Directive 2000/31/EC). Under the Directive, ISPs may not be held liable for information transmitted, stored, or cached by them, subject to conditions that resemble the DMCA safe harbor provisions but shield ISPs from liability not just for copyright infringement but for defamation and other content-related torts. E.g., where an ISP stores information “provided by a recipient of the service,” it may not be held liable for damages if it has no actual knowledge of illegal activity and “is not aware of facts or circumstances from which the illegal activity is apparent,” or when it acts expeditiously to remove or disable access to the information. (Id. Article 14).