OAJ Hot Take: Gender-based political violence, amplified by unregulated AI, threatens political representation progress
Content warning: This article contains a discussion of political and gender-based violence.
The piece below is part of our weekly blog post series written by the Open-Air Journal team where we explore issues at Heller, current events, or whatever is presently on our minds.
Entering the world of American politics has become an increasingly dangerous game. According to the United States Capitol Police, there were 8,008 threats against members of Congress last year – double the number recorded in 2017. While this trend affects all elected officials, women, especially women of color, face heightened risks of harassment and violence. Social media and artificial intelligence (AI) technologies amplify the severity and frequency of that violence while introducing new dangers. To continue making progress on political representation, policymakers must step up efforts to address gender-based violence and strengthen accountability measures for perpetrators as well as for the technology companies that enable violence and harassment.
Threats and violence against national-level politicians have consistently made headlines in recent years. In 2022, a man broke into the home of former House Speaker Nancy Pelosi (D-CA) and assaulted her husband with a hammer while asking “Where’s Nancy?” That same year, a man appeared at the home of Congresswoman Pramila Jayapal (D-WA), shouting threats while armed with a handgun. Alexandria Ocasio-Cortez (D-NY), a frequent target of harassment, has resorted to 24-hour security and to changing where she sleeps. Several instances of gendered political violence in just the first month of 2024 indicate that this trend is not letting up.
It is worth noting that political violence is not limited to national politicians. A 2023 study by Princeton University’s Bridging Divides Initiative surveyed local elected officials and found that threats and harassment remained high between September 2022 and August 2023. The study also found that 40% of women (compared to less than one-third of men) experienced threats and harassment, and that elected officials of color faced levels similar to those of their white colleagues. Notably, the survey revealed that elected officials often worried about threats and harassment, indicating that even a small number of high-profile attacks can create a fearful environment.
While many social and political factors are driving the increase in political violence, technology plays a major role. For nearly a decade, online misinformation and radicalization have remained significant threats to our democracy. The Center for Democracy and Technology found that “women of color candidates were twice as likely as other candidates to be targeted with or the subject of mis- and disinformation” during the 2020 Congressional election. A follow-up survey of women of color candidates found that many believed the purpose of the targeted abuse was to push them to drop out of the race. Although online abuse has been recognized as a societal problem for years, social media companies and policymakers still allow rampant harassment to continue without meaningful regulation.
Recent developments in AI technology bring new dangers to online harassment campaigns. For example, deepfake photos and videos have become an increasingly common tactic for online trolls. “Deepfake” refers to the use of AI to create realistic images and videos, often by swapping one person’s face for another’s. While male politicians, including President Biden, have been targets of deepfake content, women are at heightened risk of being targeted with deepfake pornography intended to degrade and humiliate them. Although this technology has existed for some time, recent advances have made it far easier to use and more accessible, fueling a sharp increase in the number of deepfake pornographic videos; pornography now makes up 98% of all deepfake videos. Pop star Taylor Swift was a recent high-profile victim of deepfake pornography, but she will surely not be the last.
A multi-pronged issue such as gender-based political violence requires a bold and comprehensive response. Every year since 2020, a group of Democratic lawmakers has introduced a resolution that would recognize violence against women in politics as a global issue requiring dedicated research to understand and mitigate. Congress should pass the resolution to send a message that our nation’s leaders will not accept violence against women in politics as the norm. Policymakers should also consider passing legislation that explicitly prohibits and criminalizes political violence against women, as Bolivia has already done.
Policymakers must approach social media and AI policy with the unique needs of women (and others at elevated risk of harassment and violence) in mind. In January, Sen. Dick Durbin (D-IL) introduced the DEFIANCE Act, which would allow victims of deepfake pornography to sue publishers and distributors of the content. Ocasio-Cortez, herself a victim, has announced her support for the bill. Federal action is essential, since only a handful of states have passed legislation regarding deepfakes. Holding perpetrators accountable and building safe online spaces can help prevent an exodus of women, especially women of color and LGBTQ+ folks, from public political life.