The Increasing Problem Of Link Retention When Using Auto Generated Content!

As a reader of my blog, you can save 15% on the retail price of any GSA tool! To get this discount, use any of the links below or in my posts, then enter the coupon code “shaunmarrs” on the checkout page.

GSA Search Engine Ranker
GSA Captcha Breaker
GSA Content Generator
GSA Platform Identifier
GSA Proxy Scraper

In this post, I will reveal the results of my case study on the retention rate of contextual links built with GSA Search Engine Ranker using auto-generated content. For clarification, I personally define auto-generated content as content created by software that scrapes the internet for articles and then essentially mashes them together into a spun article.

When users build links en masse with auto-generated content loaded into their tools, there has always been a percentage of unavoidable link loss. In the past, I have always tried to account for this and overcompensate accordingly during my link building phase as a way to preempt and counter the problem.

During the middle of 2016, I noticed that the link loss on my campaigns using auto-generated content had begun to spiral out of control. In early December 2016, I decided to start this case study in an attempt to gather as much data as possible so I could make better-informed decisions for my campaigns moving forward.

The Testing Process

I decided to create five contextual projects in GSA Search Engine Ranker to gather my data. After loading each project with auto-generated spun articles, I gave each project its own “Maximum post per account” setting in the project options tab. I did this because I initially thought the webmasters of the target domains might simply be checking the number of submissions per account on their domains and deleting any account that made more than, say, 10 submissions in a minute, as this is obviously automated behavior.

I decided on submission limits of 1, 10, 25, 50 and 100 for the five projects, as I felt this offered a broad spectrum of submissions per account and gave me the best chance of identifying any pattern linking the number of submissions per account to the number of links lost in each project.

I then set the projects to active and left them to run overnight. All projects pulled their targets from the exact same verified list and used the same catch-all email and semi-dedicated proxy batch. Come morning, each project had produced between 7,700 and 9,000 verified URLs. I then simply left the link batches to sit for the next four weeks without touching them.

My Expectations

Something has definitely changed recently with link retention on the domains GSA Search Engine Ranker is able to post to when using auto-generated content. Because the symptom is not engine specific, it cannot be a single content management system releasing a patch with better detection of auto-generated content.

I have already touched on my theory that the number of submissions per account, set in the project options tab as shown in the screenshot below, may be having an effect.

GSA Search Engine Ranker submissions per account

I personally believe that the higher the number of posts per account, the higher the link loss will be, as in theory it is easier for webmasters to single those accounts out and delete them along with all of their verified submissions.

The Results Are In!

Total link loss from contextual links created using GSA Search Engine Ranker

As you can see from the results above, for whatever reason, all projects in the case study ended up losing at least 75% of their initial links over the four-week test period, with one project peaking at 81% of its links lost.

The number of submissions per account on each domain does not seem to have an effect on link loss either, as the overall percentage for each project is so similar. This leads me to believe it is definitely something to do with the use of auto-generated content in the projects.
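For anyone who wants to sanity-check the maths, the loss percentage for each project is simply the difference between the initial and surviving verified counts, divided by the initial count. The counts below are hypothetical placeholders rather than my exact figures, but they show how percentages in this range are derived:

```python
# Hypothetical verified-URL counts per project (placeholder figures only),
# keyed by the "posts per account" limit used in each project.
initial_counts = {1: 8200, 10: 8900, 25: 7750, 50: 8400, 100: 8600}
surviving_counts = {1: 2050, 10: 2136, 25: 1782, 50: 1848, 100: 1634}

for limit, initial in initial_counts.items():
    lost = initial - surviving_counts[limit]
    loss_pct = 100 * lost / initial
    print(f"{limit:>3} posts/account: {lost} of {initial} links lost ({loss_pct:.1f}%)")
```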

What I Make Of It

I believe that auto-generated content still has a place in black hat search engine optimization. I still use it when using automated tools to create web 2.0 pages, and none of those domains are suffering from this problem. I also use auto-generated content when completing blog comment, guestbook and image comment blasts, and I have yet to see any problems in that field.

I have a similar case study currently running that uses the exact same projects, proxies, and target list but uses human-spun content rather than automatically generated content. Those pages are close to their four-week point, and I will release the post in my case study section once I have the required data, so we can see if there is any noticeable difference between the two.

It is easy for anyone to duplicate this test. Simply create a project with auto-generated content and have it submit URLs to your contextual article targets, making sure that automatic re-verification is turned off for the project. After a number of weeks, manually open the project’s verified URLs by right-clicking the project, selecting Show URLs and then selecting Verified. Then tell the project to re-verify the URLs using the verify button at the bottom of the window, and you will be able to see how many of your links have died over the time the project was left to sit.
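If you would rather check retention outside of GSA Search Engine Ranker, the same idea can be scripted. The sketch below is a minimal example, assuming you have exported the project’s verified URLs to a plain text file (one URL per line) and that “example.com” stands in for your money site; the file name, domain and timeout are placeholders, not anything GSA produces for you automatically.

```python
import urllib.request

# Assumptions: verified URLs exported to verified_urls.txt (one per line)
# and "example.com" is the money site the contextual links point to.
MONEY_DOMAIN = "example.com"

with open("verified_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

alive = 0
for url in urls:
    try:
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        html = urllib.request.urlopen(req, timeout=15).read().decode("utf-8", "ignore")
        if MONEY_DOMAIN in html:
            alive += 1  # page still exists and still carries the link
    except Exception:
        pass  # dead page, deleted account, timeout etc. all count as lost

lost_pct = 100 * (len(urls) - alive) / len(urls)
print(f"{alive}/{len(urls)} links still live, {lost_pct:.1f}% lost")
```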
