Ted Cruz Takes on NSFW Deepfake Videos With the 'Take It Down' Act

Very little has been done to address the spread of non-consensual NSFW media produced with machine learning and generative AI, even though lawmakers on Capitol Hill have published a roadmap for addressing these emerging issues effectively and ethically. No one is safe from technologies like deepfakes, inpainting, and image generators, which let the inevitable creeps turn anyone, whether AOC, Taylor Swift, or a high school student, into an unwilling porn star.

According to a 2023 study by Home Security Heroes, a company that protects consumers from identity theft, 99% of deepfake videos on the Internet depict women, and 98% of all deepfake videos are pornographic. The number of NSFW deepfake videos and images featuring children is currently unknown, but in recent years, reports of fake photos, including fake nudes, circulating in high schools have drawn media attention.

The 'Take It Down' Act

Lawmakers recently introduced a bipartisan bill called the "Take It Down" Act to address the growing problem of AI-generated NSFW content. It uses commonsense measures to protect people, particularly children, from this crime of the digital age. A one-page summary of the bill is available online.

It is reasonable to be suspicious of anti-porn laws, particularly ones ostensibly designed to "protect children," since such laws frequently use children as a pretext to outlaw all forms of sex work and porn. So far, however, this law appears to be a rare exception. The surprising thing about this bill is not only that it hides no risks for adults, queer people, or sex workers, but also who is driving it.

Ted Cruz. Yes, that one. Ted Cruz, the widely disliked hard-line conservative who famously abandoned his constituents in a time of crisis, is actually doing something positive.

To better understand this perplexing phenomenon, let's examine the "Take It Down" Act in more detail and consider its implications for the future of AI porn.

What is the 'Take It Down' Act trying to address?

For the many people who seem unfamiliar with machine learning (ML) and generative artificial intelligence (generative AI), here is a brief overview of the technology and how it has been used to create Non-Consensual Intimate Imagery (NCII).

Contrary to popular assumption, what we now call AI is not Skynet or any other malevolent pop sci-fi villain. Think of it as something closer to the text-prediction feature that has been on every phone for years.

Machine learning models are trained on data mined from the Internet so they can give the user whatever they request in a prompt. Much like predictive text, which learns what you are most likely to type next and figures out that you aren't writing "There's nowhere to ducking lark," these models collect more and more data as they are used, and they get better.
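To make the predictive-text analogy concrete, here is a minimal, purely illustrative sketch of next-word prediction in Python. The tiny corpus and the predict_next helper are invented for this example; real generative models use neural networks trained on billions of documents rather than simple word counts, but the underlying idea of "guess the most likely next token" is the same.

```python
from collections import Counter, defaultdict

# A tiny "training corpus"; real models ingest billions of documents.
corpus = "there is no way there is no time there is nowhere to hide".split()

# Count which word follows which: a bigram model, the simplest
# possible version of "predict the next token."
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word: str) -> str | None:
    """Return the word most often seen after `word` during training."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("there"))  # -> 'is' (the word that follows 'there' every time above)
```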

When generative AI first became accessible to the general public in 2022, the quality of the images it produced was poor. Problems included strange-looking hands, eyes pointing in different directions, and face swaps that looked like Halloween masks.

Today, however, a malicious individual can create reasonably convincing fake porn from photos of anyone they want, whether a random stranger or a celebrity, in under a minute.

Even though many tools, particularly well-known ones like Midjourney, have safeguards, clever prompting can bypass them. And many free apps and websites openly advertise their lack of safeguards.

To be clear, porn itself is not the problem here. The problems are how easy it is to create fake pornographic images of real people and how reluctant social media platforms like Snapchat and X (formerly Twitter) are to remove them. That reluctance, in fact, is how Cruz got pulled into all of this in the first place.

Elliston Berry, a high school student, fell victim to this technology late last year. She was just 14 when she woke up to discover that fake nude photos of her, uploaded to Snapchat, were being shared around her school.

Appalled and terrified, Berry and her mother, Ana McAdams, fought to persuade Snapchat to remove the pictures, but the bureaucratic runaround they received meant the photos took almost eight months to come down. According to Cruz, Snapchat only acted after the mother and daughter approached him for help and he instructed his team to contact the company immediately.

Legislative solutions for NSFW deepfakes

Cruz, himself a father of teenage girls, has said that Elliston Berry's terrible experience was the impetus for this bill. He has been notably more passive about the widespread fake pornography made of Democratic Representative Alexandria Ocasio-Cortez in recent years, but whatever.

Cruz is not alone in fighting the evil of non-consensual fake porn. Democratic Senator Dick Durbin has been pushing a similar bipartisan bill, the DEFIANCE Act, which would let citizens sue the producers and distributors of this material. It should be noted that civil litigation is not the ideal approach: court proceedings are costly and time-consuming, expose private information, and place the burden of proof on the victim.

Republican Senator Cynthia Lummis opposed that bill, claiming it "stifled American technological innovation" and was "overly broad in scope," according to CNBC.

In an obvious bid to force Big Tech to act responsibly, the Take It Down Act specifically targets companies like Meta, Snapchat, and X. The proposed bill would hold platforms and websites accountable for hosting the content and mandate that they create a mechanism for victims to report fake porn and have it removed within 48 hours. The bill also urges platforms to do everything in their power to remove any copies of the image or video, even those circulated in private groups.
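For a sense of what such a reporting mechanism might involve in practice, here is a minimal, hypothetical sketch of a takedown-request record that tracks the bill's 48-hour removal window. Every name and field here is invented for illustration; none of it comes from the bill's text or from any platform's actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# 48-hour removal window mandated by the bill.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class TakedownRequest:
    """A victim-filed report of non-consensual intimate imagery (NCII)."""
    content_url: str
    filed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    resolved_at: datetime | None = None

    @property
    def deadline(self) -> datetime:
        # The platform must remove the content by this time.
        return self.filed_at + REMOVAL_WINDOW

    def is_overdue(self, now: datetime | None = None) -> bool:
        """True if the report is still unresolved past the 48-hour window."""
        now = now or datetime.now(timezone.utc)
        return self.resolved_at is None and now > self.deadline

# Example: a report filed 50 hours ago and still unresolved is out of compliance.
stale = TakedownRequest(
    content_url="https://example.com/post/123",
    filed_at=datetime.now(timezone.utc) - timedelta(hours=50),
)
print(stale.is_overdue())  # True
```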

These measures would be enforced by the Federal Trade Commission (FTC), which oversees consumer protection, and the people who create and knowingly publish these images would face criminal penalties.

In a statement to CNN, Cruz said that "our bill will protect and empower all victims of this heinous crime by creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images."

During his June 18th press conference this year, Cruz made the point that politicians and celebrities have the clout to get fake nudes of themselves removed from the Internet. Most ordinary people, especially children, have no such power.

"Everyone knows that Taylor Swift experienced this," Cruz remarked. Additionally, you can have the pictures taken down if you're a well-known music star. However, Big Tech disregards your cries for assistance whether you're just a teenager in Texas, New Jersey, or any other place.

The Take It Down Act is modeled on copyright law, specifically the existing Digital Millennium Copyright Act (DMCA).

Cruz remarked, "The Lion King can be taken down very soon if you put it up. However, the Big Tech companies pretend they don't know how to handle it if you post non-consensual personal photos. This is intended to make taking it down mandatory. Similar to the copyright side, we will observe instant compliance once this is passed, which will likely happen.

During the same press conference, Dawn Hawkins, CEO of the National Center on Sexual Exploitation, said that anyone can publish anything on sites like Reddit, X, and Pornhub without facing any repercussions.

While Reddit and Pornhub have their own assorted problems, it should be noted that both sites banned deepfake porn in 2018. X, on the other hand, has done little to combat fake AI porn.

Despite Hawkins' criticism, the bill is not meant to target all porn; its sole focus is AI-generated NCII.

One excellent feature of this bill is that it avoids infringing on legitimate free expression. "The bill is narrowly tailored to criminalize knowingly publishing NCII without chilling lawful speech," the bill's summary states. By requiring that computer-generated NCII pass a "reasonable person" test for appearing to genuinely depict an identifiable individual, the bill stays in line with current First Amendment jurisprudence.

Asked how companies like X or TikTok would react, Cruz responded, "I hope they do the right thing and recognize this is the right solution." After all, that's what they should have been doing all along.

Not like those other anti-porn bills

That the "Take It Down" Act is even necessary reflects the depressing state of AI. Generative artificial intelligence has opened a Pandora's box that cannot be closed.

The "Take It Down" Act, as currently drafted, would safeguard individuals who are harmed by non-consensual AI photographs and make those who profit and benefit from them liable. Furthermore, the "Take It Down" Act has no evil provisions that would benefit wealthy creeps or make life more difficult for adults who enjoy erotica, in contrast to many of the anti-porn laws that conservatives have attempted to enact.

Ted Cruz is the author of this good bill. Crazy, huh?