In Patterson v. Meta Platforms, Inc., the New York Appellate Division, Fourth Department, held in a split decision that Section 230 of the Communications Decency Act ("Section 230"), a federal statute that prohibits treating providers of "interactive computer services" as publishers of third-party content, barred tort causes of action seeking to hold several social media companies liable for their algorithmic content recommendations.

The Patterson majority held that content-recommendation algorithms, which platforms use to sort and display third-party content, are (1) publishing activity immunized under Section 230 and (2) editorial decisions protected under the First Amendment, in light of the U.S. Supreme Court's recent decision in Moody v. NetChoice, LLC. The Fourth Department explained that holding to the contrary would "be inconsistent with the language of section 230," "eviscerate" its "expressed purpose," and "result in the end of the Internet as we know it."

Patterson adds to a growing list of decisions disagreeing with the U.S. Court of Appeals for the Third Circuit's ruling in Anderson v. TikTok, Inc., which held that platforms engage in first-party speech unprotected by Section 230 when they use "expressive algorithms" to curate third-party content.