Examining Court Backlogs and Section 230's Legal Implications
Raymond 22 - Courts are just recovering after the pandemic
https://www.reuters.com/legal/government/pandemic-fueled-federal-court-case-backlog-less-than-expected-study-2022-12-02/
But the report found those backlogs on a national level were offset by a comparable 29% decline in federal prosecutors charging new defendants during the first two years of the pandemic and litigants filing 6% fewer civil cases during that period. "Courts were able to effectively clear their dockets despite pandemic-related delays at least in part because fewer new criminal defendants and civil cases came on to their dockets during the pandemic," the report said. Researchers found that while about two-fifths of district courts emerged from the pandemic's second year with more pending cases than expected, on a national basis case backlogs had returned to pre-pandemic levels by March 2022. The study's authors noted their numbers for civil cases were ultimately just estimates, as they excluded the thousands of lawsuits consolidated in multidistrict litigation proceedings due to the lack of an established convention for counting them. MDLs comprised over 70% of the federal civil caseload in 2021.

Neschke et al 22 - Section 230 shields companies from lawsuits
https://bipartisanpolicy.org/blog/gonzalez-v-google/
On October 3, 2022, the Supreme Court announced that it would hear two cases that could fundamentally change the future of the modern internet. Gonzalez v. Google and Twitter, Inc. v. Taamneh involve both the Anti-Terrorism Act and Section 230 of the Communications Decency Act, which shields tech platforms from lawsuits for hosting and moderating user content. Section 230 is one of the most important laws in tech policy, and this will be the first time the Supreme Court has interpreted its scope. Back in 1996, at the dawn of the internet age, Section 230 was created to encourage the development of the internet while fostering a safe online environment where users can connect and civilly express themselves.

Nabil 21 - Section 230 repeal leads to a barrage of lawsuits
https://cei.org/blog/why-repealing-section-230-will-hurt-startups-and-medium-sized-online-businesses/
Proponents of Section 230 repeal seek to rein in Big Tech, but repeal would only add to the market leaders' advantage over startups and second-tier tech companies such as TripAdvisor, Reddit, and Yelp. If Section 230 is repealed altogether, any tech company can be subject to civil lawsuits for "any material posted by any user." As a result, repealing Section 230 is expected to result in a barrage of frivolous lawsuits against tech companies.

Terr 23 - more pretrial and discovery motions, longer court times
https://www.thefire.org/news/why-repealing-or-weakening-section-230-very-bad-idea
Section 230 doesn't just protect platforms from liability for unlawful content created by others: it also facilitates the prompt dismissal of frivolous lawsuits, often in cases that don't even involve unlawful speech. Without Section 230, many of these lawsuits would still cause platforms major headaches by requiring them to engage in extensive discovery and pretrial motions.
ABC 08
https://abcnews.go.com/TheLaw/story?id=5429227&page=1
Critics, however, contend that the increased number of cases strains an already burdened judicial system, depriving lawyers and judges of ample time to hear cases and denying defendants the right to a fair trial.

Impact: 1. Criminal Justice

Corral et al 22 - empirically, court clogs delay hundreds of thousands of cases
A CBS News investigation has uncovered a massive backlog of court cases that has delayed progress on hundreds of thousands of criminal cases across the United States. CBS News obtained and analyzed data from courts and district attorneys' offices in more than a dozen major American cities and found "pending" criminal cases jumped from 383,879 in 2019, just before the COVID-19 pandemic, to 546,727 in 2021. In California, New York, Florida and Michigan, the number of "pending" cases in 2021 totaled nearly 1.3 million. The backlog has resulted in delayed justice for crime victims and their families and threatens to deny the constitutional right to a speedy trial for the accused. It also raises concerns of a possible public safety threat, with thousands of convicted criminals remaining free as they await sentencing.

VC 22 - Justice Delayed is Justice Denied
https://victimscommissioner.org.uk/news/enormous-court-backlogs-mean-victims-of-crime-are-facing-years-of-unacceptable-delay-in-their-quest-for-justice/
These backlogs have real and tangible impacts on victims. Each one of these 59,000 cases is likely to have at least one victim. From when a case enters the crown court system, for cases where the defendant is going to trial and has pleaded not guilty, cases are now taking almost 15 months to be completed. This is 60% higher than two years earlier. Even before the pandemic, campaigners were lamenting the poor state of affairs. These latest numbers are wretched. Many victims and witnesses are simply opting out of the criminal justice process altogether, having gone into it to do their public duty and to seek justice. This leaves them with no resolution and the public with the risk of a guilty criminal free to offend again. This represents a fundamental threat to our justice system.
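For perspective, two figures implied by the numbers in the cards above (derived here; neither is stated explicitly in the sources):

\[
\frac{546{,}727 - 383{,}879}{383{,}879} \approx 0.42
\qquad\qquad
\frac{15\ \text{months}}{1.6} \approx 9.4\ \text{months}
\]

That is, pending criminal cases in the CBS News data rose roughly 42% from 2019 to 2021, and a completion time of almost 15 months that is "60% higher than two years earlier" implies a pre-pandemic baseline of about 9.4 months.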
Banning Section 230 means that companies would rather completely un-moderate content to avoid liability

Huddleston 22 (Jennifer Huddleston is policy counsel at NetChoice and an adjunct professor at George Mason University's Antonin Scalia Law School. Before joining NetChoice, Jennifer served as the Director of Technology and Innovation Policy for the American Action Forum and was a Research Fellow at the Mercatus Center, January 31 2022, "Competition and Content Moderation," CATO, https://www.cato.org/sites/cato.org/files/2022-01/policy-analysis-922.pdf, DOA: 12/5/23)//ivan
Instead of focusing on developing a product consumers desire, with appropriate content moderation and community standards, these platforms would default to designing their business model within the constraints of legal risk management and avoidance of the costs of needless litigation. In this way, removing or significantly modifying Section 230 would force platforms once again to face the so-called "moderator's dilemma," a situation that existed before Section 230 was enacted. Platforms would be forced to choose between two equally unattractive alternatives. They could attempt to minimize their liability by scrutinizing every user post for its potential risks (likely delaying all posts for substantial periods of time, and significantly increasing the cost of operating their site). Or they could engage in no moderation whatsoever, in which case the pre-Section 230 common law established that they would not be liable for their users' content, but in which case, also, their users' experience would be badly degraded.21 This is not a solution that most audiences would appreciate. While a few unmoderated platforms like the notorious 8chan exist, most users and platforms wish to avoid excessively graphic violence, pornography, online harassment, and equally offensive content. The most successful platforms featuring user-created content are patronized by people who want to visit for connecting with friends and family, learning a new skill, or reading helpful product and service reviews.22 Content moderation enables these platforms to respond to their users' wants and concerns in both large and small ways. In the service of community standards, platforms can remove off-topic content in a specific forum, limit or ban harassing users or content, and remove spam. As Reddit cofounder Alexis Ohanian tweeted, "What [platforms] all eventually learn is users WANT moderation."23 While the compliance costs associated with the many proposed changes to Section 230 will be most acutely felt by small businesses, this does not mean that consumers will not also bear higher costs. Since smaller internet players lack the ability to absorb steeply higher costs of capital and compliance staff, these additional costs will be passed along to consumers in the form of service fees or reduced services.24 For example, after the General Data Protection Regulation (GDPR) was implemented in Europe, which placed additional compliance burdens regarding data privacy and security on firms, investment in small ad-tech firms was sharply reduced and the market share of large ad-tech firms grew.25
Goodman and Whittington 19 continue (Ellen P. Goodman is a professor at Rutgers Law School and a co-director and co-founder of the Rutgers Institute for Information Policy & Law; Ryan Whittington is a Law Clerk at the German Marshall Fund Digital Innovation & Democracy Initiative. August 1, 2019, "Section 230 of the Communications Decency Act and the Future of Online Speech", Rutgers Law School, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3458442, DOA: 12/5/23) RWC
Early legislative efforts aimed at regulating the Internet were largely focused on the prevalence of pornography and otherwise unsavory content. The Communications Decency Act of 1996 marked the first attempt by Congress to limit the spread of pornographic content online. As drafted, it "made it a crime, punishable by up to two years and/or a $250,000 fine, for anyone to engage in online speech that is 'indecent' or 'patently offensive' if the speech could be viewed by a minor."7 The Supreme Court ultimately ruled that the legislation was an unconstitutionally vague and overbroad violation of the First Amendment. Like the rest of the Communications Decency Act, Section 230 reflected Congress' interest in limiting harmful speech online. But, unlike other provisions of the Communications Decency Act that were struck down, Section 230 passed constitutional muster. Representative Chris Cox and Senator Wyden drafted Section 230 in the shadow of two critical intermediary liability lower court cases from the 1990s. These early cases demonstrated the difficulties courts faced in assessing the liability of Internet platforms for user-generated content. In Cubby v. CompuServe, the U.S. District Court for the Southern District of New York held that the Internet firm CompuServe was not liable for defamatory content on a discussion forum because it was passive in the chain of communications. There was no evidence demonstrating that it knew or should have known about the defamation.8 In other words, the firm was treated as a mere distributor of content, like a bookstore, library, or newsstand. By contrast, in Stratton Oakmont v. Prodigy Services, the New York Supreme Court ruled that the firm Prodigy was liable for allegedly defamatory statements made by an anonymous individual who posted on a message board it hosted.9 Prodigy, which hosted many online bulletin boards, actively moderated content in an effort to make its platform fit for children and families. The court held that since it made efforts to moderate, Prodigy was effectively exercising editorial control and should, therefore, be held liable for unlawful content produced by users. Together these two rulings created a dilemma for Internet platforms. If an Internet firm wanted to clean up and moderate its platform for the sake of improving user experience (or of mollifying politicians), it would open itself to enormous legal risk.10 Consequently, the safest course of action would be to forego moderation and act as a passive conduit for information. This, however, would leave platforms open to misinformation, sexually explicit content, and harassment. Conservative lawmakers, including Representative Cox, were troubled by the fact that Prodigy was effectively punished for taking efforts to tamp down sexually explicit content. Other lawmakers, like Senator Wyden, were concerned that holding platforms liable for users' conduct would compel them to over-police user content, ultimately chilling online speech.

This means in an affirmative world, you're only going to see a proliferation of inflammatory content

Szabo 20 (Szabo, Carl. "Why Section 230 protects kids, and what its critics get wrong." Protocol, 21 December 2020, https://www.protocol.com/section-230-protects-kids. Accessed 4 January 2024.) // Esha B.
Despite current debate over harmful content online and Section 230 of the Communications Decency Act, the truth is that Section 230 is the law that makes our internet a better place. Section 230 is often blamed for all bad content and illegal activity on the internet, but under the law, any activity that's criminal offline is criminal online. In fact, Section 230 provides no shield to criminals from enforcement of state, local and federal laws, whether they commit their crimes on or off the internet. Take the horrific example of Backpage.com, an online platform that enabled sex trafficking online. In 2018, the federal government seized control of the website, shut it down and threw its owners in prison. The federal government swooped in and enforced federal criminal law. In fact, Section 230 was irrelevant in this case because the law provides no protection for platforms that contribute to criminal wrongdoing. The law also offers no protection for child exploitation or copyright violations. Similarly, Section 230 offered no protection for online platform Silk Road: an anything-goes marketplace where users could sell guns and drugs and even contract for murder. The government shut down this website and enforced criminal law on its owners because, again, Section 230 does not shield platforms from federal criminal law. These stories made headlines.
But today, critics incorrectly claim that Section 230 protects bad online platforms from the enforcement of major crimes. Platforms that host horrendous content like child pornography or sex trafficking should and can be dealt with by the law. When law enforcement fails to adequately police the proliferation of this content, we must discuss how to better ensure they do so. In a recent piece in Protocol, James Steyer and Bruce Reed claimed that Section 230 enables online harm. They argued that Section 230 protects bad platforms when they host content that exploits children. Thankfully, Steyer and Reed are entirely incorrect on this point. Section 230 offers no protection against such exploitation and, in fact, makes clear to judges that they should not grant immunity in such cases. Steyer and Reed do raise some forms of content that platforms would not be held liable for hosting under Section 230, but we should not conflate the severity of content that is inappropriate for children with content that exploits children, like child pornography. If content is available on online platforms that some consider inappropriate for children, then a discussion over how media of all types can do more to protect kids is necessary. But upending the internet as we know it by eliminating Section 230, as Steyer and Reed suggest, is not the nuanced approach for such important discussions. Indeed, online platforms are able to moderate and make themselves more appropriate for children because of Section 230. From YouTube Kids to forced opt-in tagging and screening for child-friendly content, YouTube works on new content moderation guidelines each day, dedicated to making our children safe online. Without Section 230, YouTube has no shield behind which to help build its family-friendly practices. Section 230 saves platforms like YouTube from a "moderator's dilemma," where they would need to refuse to moderate content on their sites to achieve court-awarded protections from conduit liability that predated Section 230, or end the practice of hosting user-created content easily and seamlessly. Groups like Common Sense Media, founded by Steyer, rely on Section 230 to protect the public. By providing parents with useful resources for identifying whether movies and shows are appropriate for their kids, Common Sense relies on parents and kids to add their own reviews and ratings. Without Section 230, they would risk assuming liability for all statements made in these reviews if any of them were found to be defamatory. The world is a better place when Common Sense Media can host reviews that empower parents, and that world is best realized when Section 230 remains the law of the land. Steyer and Reed also mistakenly justify their drastic changes by citing the only amendment to Section 230, a 2018 law called the Fight Online Sex Trafficking Act. FOSTA's supporters have provided little evidence of its efficacy; their main argument, that the law enabled the FBI to take down Backpage, is false because the FBI took down the site before FOSTA became law. FOSTA's author has also argued that it "eliminated 90 percent of sex-trafficking ads." But these arguments have been decisively debunked by the Washington Post, and the track record of the law is particularly controversial. Many women's advocates have raised concerns that the law has put women in greater danger, and news reports from San Francisco show that after the passage of FOSTA, law enforcement saw a surge in street-based sex work and sex trafficking. FOSTA shows that blaming the failures of law enforcement to stop criminals on Section 230 is not only a factual error: it also risks harming victims and decimating the widespread benefits of free speech online. Since the inception of the internet, and even before the enactment of Section 230, any platform that engaged in an "anything goes" approach to user-generated content has been immune from the content of its users. For example, bookstores are not legally liable for any law-breaking content in the books they sell: the book's author is. The same is true for internet platforms. But without Section 230, as was the case before its passage, if an internet platform engages in the type of content moderation required to protect children online, this protection evaporates and the platform suddenly becomes liable for every user's post. That leads to a clear disincentive for sites to moderate content. When Congress enacted Section 230, it empowered platforms to remove "lewd, lascivious, or otherwise objectionable" content without becoming liable for all content on their sites. Just like a good samaritan is not liable for harm caused by trying to save someone on the street, platforms are not liable for harm caused when they seek to clean up their corners of the internet.
In fact, the only websites that have little to lose from a repeal of Section 230 are those like 8chan and other content cesspools of the internet that neglect to moderate content. Websites that actually focus on family-friendly content, like YouTube, KidsReads or Common Sense Media, would likely not exist as they do today. If Section 230 were repealed, then what parents fear most would come to fruition: an increase in hate speech, violence, conspiracy videos and other harmful content online. Steyer and Reed are right to bring attention to corners of the internet where children are at risk, and we must look for ways to ensure law enforcement has the tools it needs to tackle child exploitation. But by looking at Section 230 rather than at an increase in funding for enforcement against child exploitation, they are advocating for a policy that would severely underserve their goal of protecting children. Section 230 is what has helped the internet become safer for children as it has grown and matured. As we search for ways to protect children from harm, we should all examine the true impacts of our policy prescriptions before presenting them as the golden ticket to a family-friendly future.

I: Increase in Child Sex Crimes

1. In child pornography, the majority of victims are young girls

Statista Research Department. 8 December 2022, https://www.statista.com/statistics/1246257/gender-children-abused-child-pornography-content/. Accessed 15 December 2023//ivan
This graph shows the distribution of children represented in child pornography content around the world in 2019, by gender. It can be seen that while a quarter of the victims were boys, the majority (70 percent) were girls.

Big Tech profits off of such exploitation; affirming would only be detrimental

Legal Information Institute. 2008, https://www.law.cornell.edu/uscode/text/18/2251. Accessed 13 December 2023//ivan
The importance of protecting children from repeat exploitation in child pornography:
"(A) The vast majority of child pornography prosecutions today involve images contained on computer hard drives, computer disks, and related media.
"(B) Child pornography is not entitled to protection under the First Amendment and thus may be prohibited.
"(C) The government has a compelling State interest in protecting children from those who sexually exploit them, and this interest extends to stamping out the vice of child pornography at all levels in the distribution chain.
"(D) Every instance of viewing images of child pornography represents a renewed violation of the privacy of the victims and a repetition of their abuse.
"(E) Child pornography constitutes prima facie contraband, and as such should not be distributed to, or copied by, child pornography defendants or their attorneys.
"(F) It is imperative to prohibit the reproduction of child pornography in criminal cases so as to avoid repeated violation and abuse of victims, so long as the government makes reasonable accommodations for the inspection, viewing, and examination of such material for the purposes of mounting a criminal defense."

2. This leads to further mistreatment of women

Women's Media Center. 2023, https://womensmediacenter.com/speech-project/online-abuse-101. Accessed 14 December 2023//ivan
Gender-based harassment is marked by the intent of the harasser to denigrate the target on the basis of sex. It is characterized by sexist vitriol and, frequently, the expression of violence. When men face online harassment and abuse, it is first and foremost designed to embarrass or shame them. When women are targeted, the abuse is more likely to be gendered, sustained, sexualized and linked to offline violence. Women, the majority of the targets of some of the most severe forms of online assault (rape videos, extortion, doxing with the intent to harm), experience abuse in multi-dimensional ways and to greater effect. They are the vast majority of the victims of nonconsensual pornography, stalking, electronic abuse and other forms of electronically-enhanced violence. In addition, women report higher rates of finding online harassment stressful. This is not because they "can't stand the heat," as is frequently suggested, but because the abuse online exists simultaneously with three facts: women have to be hyper-vigilant in daily life; a double-digit safety gap offline has an online corollary; and women are more likely to experience more gendered and consequential abuse. They are more frequently harassed, online and off, for sustained periods of time, in sexual ways and in ways that incorporate stalking and manipulation. They are more likely to be pornographically objectified and subjected to reputation-damaging public shaming. Sexual slurs toward women evoke the threat of real-life sexual violence; they are also perceived as intended to "put a woman in her place" and tell her that her opinion is worthless because she is a woman. The abuse women experience online is intersectional.

This is devastating as:

I: Cycle of violence

Abundez, Kelly. 2022, https://www.oneworldeducation.org/our-students-writing/sexualization-of-women-in-media/. Accessed 14 December 2023//ivan
Several sources draw a direct link between the sexualization of girls in the media and sexual violence. According to the CDC, one of the contributing factors to sexual violence is "exposure to sexually explicit media," and according to UNICEF USA, "When women and girls are repeatedly objectified and their bodies hypersexualized, the media contributes to harmful gender stereotypes that often trivialize violence against girls." So sexualization of women through the media will lead to more sexual violence being committed against them.