A tweet I published recently about Cnet.com, a large tech publication owned by Red Ventures, drew quite a bit of press attention.
So much so that Cnet made a public statement this morning to explain the situation.
In a nutshell, a big site is, for the first time, being open about the fact that they are using AI to help them generate content about mundane topics.
They still ensure it’s edited and fact-checked by a human (AI is notorious for making facts up), but they’re upfront about some of the content being machine-generated in the author byline.
This is a small-scale test for now, with only 75 articles published since last November, but it’s a test we all want to keep our eyes on nonetheless, because it will probably determine the future of online publishing.
Many low-tier sites have been spamming low-quality AI content and have seen big swings with each Google update. But it’s the first time an internet household name has come out and openly shared that part of the content on the page you’re currently reading is not written by a breathing creature.
Now all eyes turn to Google as they have been sending mixed signals about AI content, and some openly AI-generated pages have been doing very well on the search engine.
So while we’re all holding our breath, waiting for Google to play its cards, I thought it would be a good idea to dig a bit deeper to try to understand the current state of AI in household publications and try to answer some of the burning questions that are in everyone’s head right now.
1 – Which Red Ventures sites use AI content?
First, I decided to investigate Red Ventures’ other websites, as they seemed the most likely place to find more AI-bylined content.
And actually, they do:
To get a bit deeper into this story, I’m going to use the help of Originality.ai, an AI content detection tool I recently tested on this blog.
To give you my review in a nutshell, you could compare it to Ghostbuster tech.
It’s first-gen, it’s hacked together, and it often doesn’t work the way you intend it to. But if you check a batch of content from a writer and it flags 40-50%+ AI on several of their articles, I’d be fairly confident saying AI tech has probably been used.
There is a good chance the tool will have triggered false positives below, leading to wrong analysis, so take the results with a grain of salt.
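To make the batch-checking idea above concrete, here’s a minimal sketch. The `flag_writer` function, the 40% threshold, and the scores are all my own hypothetical names and numbers for illustration — this is not Originality.ai’s actual API, just the decision rule I described:

```python
# Batch-check sketch: given per-article AI scores from a detector
# (values 0.0-1.0), flag a writer if several articles exceed a threshold.

def flag_writer(ai_scores, threshold=0.40, min_flagged=3):
    """Return True if at least `min_flagged` articles score at or
    above `threshold` AI probability."""
    flagged = [s for s in ai_scores if s >= threshold]
    return len(flagged) >= min_flagged

# Illustrative scores for one writer's recent articles.
scores = [0.09, 0.49, 0.62, 0.55, 0.12]
print(flag_writer(scores))  # three articles at 40%+ -> True
```

The point of requiring several high-scoring articles, rather than one, is exactly the false-positive problem: a single flagged piece proves nothing, but a consistent pattern across a writer’s output is harder to dismiss.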
So I put this article by bankrate.com into Originality.ai to get a baseline of what an “AI-bylined article” looks like in the tool.
And here is what I got:
Just as the byline says, it seems like the AI output was heavily human-edited. Again, this tool is spotty and false positives are possible.
For example, this article was only flagged with 9% AI content.
And this one was flagged with 49%.
From my experience using and reviewing AI writers, this is not surprising. For some topics, the AI is prolific, while for some others, you end up writing most of the article yourself, leading to a higher % of originality score.
So now that we’ve established the baseline with publishers that do disclose they’re using AI for their content, let’s jump onto the question everyone’s secretly asking.
2 – Are other publishers using AI without disclosing it?
To try to hunt down other big media outlets using AI, I first followed some hints from this Reddit thread pointing at CNN, whose content was so dull and boring that Redditors couldn’t believe it was written by a human.
So I went over to CNN’s website and started looking for the dullest content. I tested a bunch, like this one, on “To make money, avoid these common mistakes.”
And surprisingly, all of them came out as “original” rather than AI-generated.
So here I was, with no lead, looking for a needle in a haystack.
And this is when a quote from my wife came to my head. “Sometimes you can use your hate to do good things.” Don’t ask.
But immediately, this made me think of my SEO arch nemesis, the Wario to my Mario: Forbes.com
My goal was to find the most boring topics they’ve covered recently, the kind of content a writer would rather relegate to AI, knowing full well even the editor probably isn’t going to look too deep into it because it’s soooooo boring.
This is how I landed on… Internet provider reviews.
Specifically, Cox internet.
And when I put Forbes’ review into Originality.ai, here’s what came out:
This made me raise an eyebrow, but as I said earlier, Originality is NOT 100% reliable, and false positives can be frequent.
So I first opened the article and inspected it to see how it “flows.”
And needless to say, after reading the introduction, it flows like a conversation with my mother-in-law last Christmas.
But it could just be bad writing.
So I decided to dig up several other similar internet provider reviews to check their scores. Here’s what I got:
So with this consistency of finding a high AI % on similar pages, I think we can make the assumption that there was some AI assistance for these kinds of posts.
My guess is that, due to the highly templated nature of these posts and the chase for content uniqueness to rank better on Google, writers may be using an AI paraphrasing tool like Quillbot to rewrite the sections common to these articles, then insert the facts & data manually.
I didn’t dig deeper, but if I spent my days doing it, I’m 100% sure I could find similar examples of probable, undisclosed AI/automation use on many household web publishers’ websites.
3 – How well are these articles doing in search?
Now the question that’s on all our minds is: does Google do anything about it? Let’s look at the data and try to answer it.
Is increasing AI percentage correlated with lower Google traffic?
We took all the articles that were AI-bylined by Cnet, ran them through Originality.ai to determine the % of AI used in each, and then crossed that data with Ahrefs’ estimated traffic data. You can see the original data here.
Once you put all the data together, you get this graph:
An almost flat line that means one thing: Google does NOT seem to reduce organic traffic to a piece if you use more or less AI to help you write it.
Factors like keyword selection, competition, and other traditional SEO factors matter much more than the use of AI.
How does an AI-augmented writer compare to a non-AI-augmented writer?
For this test, I decided to head back to bankrate.com to mix things up a bit.
My goal is to get an idea of how competitive these AI articles are at bringing search traffic to the site against established writers for the brand.
For that, I used Ahrefs’ content explorer to find some of the most prolific writers for bankrate.com.
I decided to go for their #1 author: James Royal, who wrote 402 pages on the site and brings in 1.5+ million monthly visitors.
Then I also went for Brian Baker, who wrote 123 pages and generates 230,000+ monthly visitors.
I then analyzed the articles they each wrote in November and compared their traffic against the traffic generated by the AI content written that month.
Why November? Because we know SEO can take some time to kick in, so I wanted to give the articles time to settle in the SERPs.
First, let’s start with how many articles each of them produced between November 1st and November 30th, 2022.
Surprisingly, AI did not produce the most content in November; I guess it’s still an experimental operation for now.
But what’s more interesting is how much traffic per article, on average, each of them yields to bankrate.com.
That’s right, James Royal (Human writer #1) outperformed AI by 776% in November and Brian Baker (Human writer #2) outperformed it by 735%.
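The comparison behind those percentages is just average traffic per article for each byline. A quick sketch with illustrative numbers (not the actual Bankrate figures, which come from Ahrefs estimates):

```python
# Sketch: average monthly traffic per article, and how much one writer
# outperforms another in percentage terms. Numbers are illustrative.

def traffic_per_article(total_traffic, article_count):
    return total_traffic / article_count

def outperformance_pct(a, b):
    """How much larger a is than b, as a percentage."""
    return (a / b - 1) * 100

human = traffic_per_article(96_000, 12)  # 8,000 visits/article
ai = traffic_per_article(10_000, 10)     # 1,000 visits/article
print(f"{outperformance_pct(human, ai):.0f}%")  # prints "700%"
```

Normalizing per article matters here: since the AI and human writers published different volumes in November, raw totals would be misleading.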
Are we observing Google’s E-E-A-T / Anti AI factors in this difference? Or did the bankrate team go for smaller keywords during this test which would explain the difference?
It’s hard to tell, but if I were the CEO of Bankrate and saw this graph, I wouldn’t be firing my human writers just yet.
Things will be moving fast and there is no doubt using AI to augment your content creation abilities could become mandatory very soon if you want to stay competitive.