The word "spam," one of the most popular words the Internet introduced to our lives, has lost its thrown. Now, the new “digital plague” is the term “slop.” You should get used to it because it likely won't be the first, or last, time you hear it.
What's slop? The term refers to content generated automatically by AI tools. It’s not just any AI-generated content, but material churned out without human labor or oversight, intended solely to drive monetization, such as increasing visits to a website or inflating follower counts.
Unlike the output of popular AI chatbots, slop isn't interactive material and doesn't pretend to respond to anyone’s needs. Its sole purpose is to pass as human-made content in order to capture traffic and generate advertising revenue.
Why is this a problem? The issue is that no one wants to consume slop. However, the digital economy incentivizes the mass production of low-quality content, just as it did with spam. With generative AI, producing large amounts of text and images is cheap and easy, even when the results lack quality and usefulness.
The Guardian provides some examples:
- A Microsoft Travel article suggesting visiting a food bank as a tourist attraction in Ottawa.
- Books about mushrooms with potentially harmful advice being sold on Amazon.
- Viral memes on Facebook (a platform where the problem is particularly acute), like the one featuring a shrimp-bodied Jesus Christ.
At best, this content is laughable or simply a waste of time. But slop is also frustrating to sift through, making useful information harder to find. Moreover, it undermines trust in all types of media, legitimate content included, much as fake images have in the past.
The response from tech giants. Technology companies have attempted to address the issue of slop with mixed results. For example, Meta is requiring users to tag AI-generated content. TikTok, for its part, is automating the tagging process for AI content, while Google is implementing automatic summaries for its searches. However, these efforts may not effectively solve the problem and could make it even more difficult to differentiate between real and artificial content.
Simon Willison, the developer who coined the term “slop,” emphasized the importance of recognizing and labeling this threat, according to a report from Tech Times. “Before the term ‘spam’ entered general use it wasn’t necessarily clear to everyone that unwanted marketing messages were a bad way to behave. I’m hoping ‘slop’ has the same impact–it can make it clear to people that generating and publishing unreviewed AI-generated content is bad behavior,” Willison told The Guardian. He also warned that eradicating slop may prove to be more complicated than dealing with spam.
Zombie Internet. The Internet is like a chaotic ecosystem where bots, inactive accounts, and real people coexist. Sometimes it’s hard to tell what’s real and what’s slop, which relates to the Dead Internet theory (an online conspiracy theory that asserts that the Internet has been largely taken over by AI). Jason Koebler from 404 Media refers to it as the “Zombie Internet.” He describes it as a place where “bots, humans, and accounts that were once humans but aren’t anymore mix together to form a disastrous website where there is little social connection at all.”
That's the way things are, at least for now.
Image | Xataka using Midjourney