GW Center for Law and Technology - Speaker Series: Chinmayi Sharma
Schedule
Thu Apr 10 2025 at 12:00 pm to 01:00 pm (UTC-04:00)
Location
Faculty Conference Center, 5th Floor | The George Washington University Law School | Washington, DC

About this Event
Join the GW Center for Law and Technology for our second speaker series lecture of the Spring 2025 semester.
Chinmayi Sharma is an Associate Professor at Fordham Law School. Her research and teaching focus on open internet governance, cybersecurity, artificial intelligence accountability, computer crime, and software liability.
She is an advisor to the American Law Institute’s Principles of the Law, Civil Liability for Artificial Intelligence project and a member of the Microsoft Responsible AI Committee. She is a Non-Resident Fellow at the Strauss Center, the Center for Democracy and Technology, the Atlantic Council, and the Institute for Law & AI, and a member of the program committees for the ACM Symposium on Computer Science and Law and the ACM Conference on Fairness, Accountability, and Transparency (FAccT).
She is on the Lawfare masthead and has been quoted by the New York Times, NPR, ProPublica, Law360, Bloomberg, and Bloomberg Law. Her article calling for the professionalization of AI engineers, “AI’s Hippocratic Oath,” was featured in the New York Review and the Legal Theory Blog. Her article on open source software security, “Tragedy of the Digital Commons,” has been included in The Hague’s International Cyber Security Bibliography and featured in Schneier on Security. Before joining academia, Chinmayi worked at Harris, Wiltshire & Grannis LLP, a telecommunications law firm in Washington, D.C., clerked for Chief Judge Michael F. Urbanski of the Western District of Virginia, and co-founded a software development company.
Location: Faculty Conference Center. Lunch will be provided.
Paper abstract: "Brokering Safety"
"For victims of abuse, safety means hiding. Not just hiding themselves, but also hiding their contact details, their address, their workplace, their roommates, and any other information that could enable their abuser to target them. Yet today, no number of name changes and relocations can prevent data brokers from sharing a victim’s personal information online. Thanks to brokers, abusers can find what they need with a single search, a few clicks, and a few dollars. For many victims, then, the best hope for safety lies in obscurity—that is, making themselves and their information harder to find.
This Article exposes privacy law’s complicity in this phenomenon of “brokered abuse.” Today, victims seeking obscurity can ask data brokers to remove their online information. But a web of privacy laws props up a fragmented and opaque system that forces victims to navigate potentially hundreds of distinct opt-out processes, wait months for their information to be removed, and then repeat this process continuously to ensure their information doesn’t resurface. The status quo compels victims to manage their own privacy, placing the burden of maintaining obscurity on already-overburdened shoulders.
In response, this Article pitches a new regulatory regime premised on a transformative reallocation of responsibility. In short, it proposes a techno-legal system that would enable victims to obscure their information across all data brokers with a single request, redistributing the burden away from victims and onto brokers. Such a system is justified, feasible, and constitutional—despite what brokers might say. The industry is eager to assert that it has a First Amendment right to exploit people’s data, but this Article develops a trio of arguments to confront this controversial claim of corporate power. By blending theory, policy, and technical design, this Article charts a path toward meaningful privacy protections for victims and, ultimately, a more empathetic legal landscape for those most at risk."
Where is it happening?
Faculty Conference Center, 5th Floor | The George Washington University Law School, 716 20th Street Northwest, Washington, DC, United States
Admission: USD 0.00 (free)
