Black traffic: the corporate sabotage technique you've never heard of
AI-driven online disinformation methods designed to create fear and mistrust were perfected by nation-states. Now, they're coming to the business world.
In the cybersecurity world, deep-pocketed, government-sponsored hacking groups typically pioneer the most advanced attack techniques, which tend to trickle down over time to independent malicious hackers out to make a buck.
I think something like this is happening with disinformation as well.
Nation-states, especially Russia, China, and Iran, run widespread propaganda operations specifically aimed at making the citizens of other countries feel bad about their own nations.
Disinformation campaigns have other objectives as well. But one of their objectives is to get citizens to mistrust their own governments and hate or fear their fellow citizens.
Because using social networks, websites, bots, agents, and influencers to create fear and mistrust is so effective and easy to get away with, it’s only a matter of time before the practice becomes commonplace in the business world.
In fact, it’s already happening. Chinese authorities arrested suspects and charged them with launching a massive disinformation campaign, beginning in 2024, against the automotive division of Xiaomi. The practice is known as “black traffic.”
The attackers, probably bankrolled by rival companies, used AI chatbots to create massive quantities of disparaging false rumors against Xiaomi Auto, which were posted on nearly 10,000 fake social media accounts. The rumors mostly painted Xiaomi cars as unsafe.
It’s not 100% certain that the “black traffic” against Xiaomi was perpetrated by rival companies, but I think it’s only a matter of time before companies engage in this kind of “competition” on a much wider scale.
In the past, “black traffic” against enterprises has mainly come from activists and fan groups.
In 2023, online activists launched a “black traffic” campaign against Target Corporation and its CEO, Brian Cornell. The activists were angry because Target was selling clothing from transgender designer Erik Carnell as part of an LGBTQ-themed Pride Month rollout, which also included other items that generated outrage, such as “tuck-friendly” swimsuits.
The campaign falsely alleged that Target was deliberately selling “satanic” clothes to kids. Attackers used AI image tools (like Midjourney) to make fake, highly realistic photos showing kids’ clothing decorated with goat heads and pentagrams hanging on actual Target store racks, alongside statues of Satan. (Why anyone would think parents would buy “satanic” clothing for their kids is beyond me. I guess the devil made them do it?)
The resulting boycott temporarily cost Target an estimated $10 billion in market value.
Why “black traffic” campaigns are coming to the business world
Expect major companies to embrace “black traffic” to gain market advantages over rival companies.