Abstract
In today’s digital age, social media platforms have become the “modern public square” for public discourse in America. Although users can express themselves online, every social media company moderates the content its users post. Content moderation, or “the organized practice of screening user-generated content,” is supposed to enrich discourse and keep users safe, yet a substantial body of research suggests otherwise. Whether through the silencing of minority voices or the rampant spread of misinformation, social media platforms have harmed their users. This project identifies the lack of accountability social media platforms face as the primary reason for these failings. I ask: for the sake of American democracy and free speech principles, is there a rational basis for social media corporations to be the sole arbiters of content moderation? By tracing the complex history of platform law and drawing on three case studies, I make the case for regulating social media platforms. Accordingly, I propose what I believe to be the most productive form of regulation for the content moderation industry: a federal regulatory board.