
A Little Role Play To Explain Why You Should Filter Your GSA SER Verified Lists.

As a reader of my blog, you can save 15% off the retail price of any GSA tool! To get this discount, use any of the links below or in my posts, then enter the coupon code “shaunmarrs” on the checkout page.

GSA Search Engine Ranker
GSA Captcha Breaker
GSA Content Generator
GSA Platform Identifier
GSA Proxy Scraper

This post is part of my GSA Search Engine Ranker list filtering series and should be read in the following order.

This Post

How To Filter Your List Correctly

Contextual Filtered List Case Study

Non-Contextual Filtered List Case Study

In this post, I will be going over some examples of why it is important to filter your list.

Time For A Little Role Play For A Better Understanding Of What Is Actually Happening In The Background

To start, I want you to imagine you are a webmaster who owns five domains running a contextual article content management system; in this example, it is the Drupal Blog engine from GSA Search Engine Ranker. For whatever reason, all five of your domains have ended up on a GSA SER premium list, which for this example has 100 total users. Each day the number of automated posts to your domains keeps increasing, resulting in higher bandwidth and hardware requirements for your hosting company. It doesn’t take long for them to email you saying they need more money from you to meet the new demand from your domains on their server.

I won’t go into much detail here, but there are a number of measures you, as a webmaster, can take to prevent SER from posting to your domains. Initially, in an attempt to lower the bandwidth requirements of your domains and hopefully stop your hosting company asking for more money, you assess the situation, realize you have no reason to keep two of the domains online, and disable them.

Initially, things are good, but over the following week the three domains you still have online receive an ever-increasing number of automated posts, and soon enough your hosting company emails you again asking for more money.

For whatever reason, you decide you want to keep these sites online and choose to take action! On one domain you edit the content management system slightly, meaning SER is unable to post to it with its default engines. For the other two domains, you decide that upgrading the captcha type is the best way to go: you switch one domain’s captcha service to ReCaptcha and the other’s to Mollom, and on this second domain you also change the outbound link type to no follow.

In theory, you have made it harder for SER to get verified URLs from your three remaining domains while removing its ability completely on the two you have taken offline.

Time To Change Roles And Move To The Dark Side

Now, I want you to imagine you are one of the one hundred users of the premium list, and you have these five domains in your verified folder. So many users take the attitude that because SER has made a verified post on a target at least once, there is always a chance it could post to it again, and they choose never to do any maintenance on their verified folder.

Technically this is possible, but in my opinion it is bad practice for the way most of us use SER. If you are anything like me and use GSA Search Engine Ranker for mass automated link building, then there is absolutely no reason not to filter your verified lists and keep the process as part of your regular maintenance of the tool.

I know there are, or at least used to be, some people out there who use SER with a quality-over-quantity approach; if you are one of them, then this process may not apply to you.

Now, with that out of the way, let’s say 50 of the 100 people using the premium list have the attitude I described above and never touch their verified list once a target has been added to it. This leaves all five of the domains in there forever.

Thirty of the remaining 50 people run SER maintenance with the built-in clean-up tool in the advanced options tab to clean their list. Although this is better than nothing, I still prefer my own system. To my understanding, this process first runs an alive check on the targets in your lists; if a target is alive, it then checks whether the target is running a content management system that SER can use. In our example, the built-in clean-up tool’s alive check detects that two of the domains are offline.

To my knowledge, when SER checks whether a target has a usable content management system, it simply searches the target for footprints that match any of its engines. Even though one of the domains had its content management system edited, its footprints remain, meaning SER detects a usable content management system on all three remaining domains and keeps them in the verified folder, leaving three of the five domains.
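To make that two-step check easier to picture, here is a minimal conceptual sketch of the idea, not SER’s actual code. The URLs and the Drupal-style footprint strings are made-up examples, and real engines use their own footprint definitions; the point is simply that a target only has to respond and still show a known footprint to survive this kind of clean-up.

```python
import requests

# Hypothetical example footprints; real engines define their own.
FOOTPRINTS = ["Powered by Drupal", "node/add/blog"]

def is_alive(url, timeout=10):
    """Alive check: does the target respond at all?"""
    try:
        return requests.get(url, timeout=timeout).status_code < 500
    except requests.RequestException:
        return False

def has_footprint(url, timeout=10):
    """Footprint check: does the page still look like a usable engine?"""
    try:
        html = requests.get(url, timeout=timeout).text
    except requests.RequestException:
        return False
    return any(fp in html for fp in FOOTPRINTS)

def clean_list(targets):
    """Keep only targets that are alive and still show a known footprint."""
    return [t for t in targets if is_alive(t) and has_footprint(t)]
```

Notice that a check like this can never tell you that posting will actually fail on the edited domain, or that a link will come out as no follow, which is exactly the gap my posting-based filtering method is meant to close.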

The remaining 20 people follow my process. Depending on their personal choices, they are each left with zero, one or two domains on their verified list.

The filtering process gets rid of the two offline domains without question, as well as the one with the modified content management system, as the edit prevents the GSA Search Engine Ranker script from posting to it. This leaves the two domains that had their captchas changed.

GSA Captcha Breaker Mollom Results

As you can see, at the time of writing the Mollom captcha type has a 54% solve rate with GSA Captcha Breaker, meaning there is a good chance SER will reverify this target and produce a verified URL for it.

However, if you remember, earlier in the example the webmaster changed this domain’s link type from do follow to no follow. In this instance, the user does not want to keep any no follow targets in their verified list, so they drop it. That said, some users will want to keep no follow targets, or they may want to segment their do follow and no follow verified targets into different folders for later use, in which case this target would be kept.

GSA Captcha Breaker ReCaptcha Success Rates
The final domain in the example is the one protected by ReCaptcha. As you can see, at the time of writing GSA Captcha Breaker claims a 12% solve rate for ReCaptcha. Although it is an excellent tool, I believe the actual solve rate for the ReCaptcha group is much lower due to the various types of ReCaptcha out there these days, so for this example let’s say GSA CB is unable to solve the captcha, meaning a user with a basic GSA SER setup will not be able to verify this target.

However, if you remember, earlier in the example we said that all of the domains are running a contextual article engine, and in this instance you decide that you value contextual article targets enough to spend some human-solved captcha credits on trying to keep them on your list. You give your contextual article filter project the ability to send ReCaptchas to your human captcha solving service, and GSA SER is now able to verify the target for you. Upon inspection, you discover it is a do follow link, and you save it back to your verified folder.

Time To Multiply The Link Count

Now let’s scale these numbers up to get a better understanding of what is happening at a realistic scale: instead of just 5 domains, let’s pretend it was 50,000.

This means the first 50 people still have 50,000 targets in their verified folder. These people are not using a human-solved captcha service, so they are not able to use the ReCaptcha-protected domains. That leaves them with only 10,000 targets in their verified folder that they can actually post to, all of which produce no follow links, and 40,000 unusable targets, 20,000 of which are completely offline and will never produce a verified link.

GSA Search Engine Ranker 50 Users Pie Chart Breakdown Of Link Stats

The group of 30 people who used the built-in clean-up tool are left with 30,000 targets, as it detected and removed the offline domains. This group also has no human-solved captcha service, so they have 10,000 usable targets that all produce no follow links, with 20,000 unusable targets still sitting in the verified folder.

GSA SER 30 Users Verified List Stats

Next, we have the group of 20 people who use my filtering method. Depending on their personal choices, they are left with one of the following target counts, but every outcome has 0 unusable targets (there is a short sketch of the arithmetic after this list).

  1. 0 targets.
  2. 10,000 no follow targets from the Mollom domain.
     GSA Search Engine Ranker 20 Users With Mollom Targets Only
  3. 10,000 do follow targets from the ReCaptcha domain.
     GSA SER 20 Users Using ReCaptcha Targets
  4. 20,000 links made up of the two previous options.
     GSA Search Engine Ranker 20 Users Using Mollom And ReCaptcha Targets
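To tie the arithmetic together, here is a minimal sketch of how each group’s usable and unusable counts fall out. It assumes, as the example does, that the 50,000 targets split evenly into 10,000 per original domain; the domain names and flags are just labels for the five example sites.

```python
# Example assumption: the 50,000 targets split evenly, 10,000 per original domain.
PER_DOMAIN = 10_000
domains = {
    "offline_1": {"online": False},
    "offline_2": {"online": False},
    "edited_cms": {"online": True, "postable": False},
    "mollom_nofollow": {"online": True, "postable": True, "dofollow": False},
    "recaptcha_dofollow": {"online": True, "postable": True, "dofollow": True,
                           "needs_human_captcha": True},
}

def usable(d, human_captcha=False):
    """A target is usable if it is online, SER can post to it, and any
    required human-solved captcha service is available."""
    return (d["online"] and d.get("postable", False)
            and (human_captcha or not d.get("needs_human_captcha", False)))

def breakdown(kept, human_captcha=False):
    """Return (total targets kept, usable targets, unusable targets)."""
    total = len(kept) * PER_DOMAIN
    good = sum(PER_DOMAIN for name in kept if usable(domains[name], human_captcha))
    return total, good, total - good

# Group of 50: keep everything, no human captcha service.
print(breakdown(list(domains)))                                  # (50000, 10000, 40000)
# Group of 30: clean-up tool drops only the offline domains.
print(breakdown([d for d in domains if domains[d]["online"]]))   # (30000, 10000, 20000)
# Group of 20: filtered list keeps only postable targets, human captcha enabled.
print(breakdown(["mollom_nofollow", "recaptcha_dofollow"], human_captcha=True))  # (20000, 20000, 0)
```

The last line shows the fourth outcome from the list above; a user who drops no follow targets or skips paid ReCaptcha solving keeps only one of those two domains, or neither, which gives the other three outcomes.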

Time To Build Some Links!

Now, to further this example, say these user groups have set up a basic three-tier link pyramid to try to rank one of their pages. As mentioned earlier, all of the targets are running the content management system known as the Drupal – Blog engine in GSA Search Engine Ranker. All three user groups now set their tiered link building pyramid up as in the diagram below.

Basic Tiered Link Building Pyramid Of Three Tiers

The group of 50 people set their projects up and press start. They are expecting to see lightning-fast link building, but little do they know that 80% of their verified targets are useless to them.

In reality, there is no exact way to translate this directly into wasted time and system resources, as SER takes different amounts of time and resources to conclude that a target is offline, has an edited content management system, has hard-to-solve captchas, or can actually accept a submission. There are other variables to take into account too, such as the speed of the target website and the time SER spends verifying links, but as you can see, this group of users is wasting a great deal of time and resources.

The group of 30 people set their campaign up and press start. They are confident their list is clean and are excited to see what happens.

Then finally we have the group of 20 people who filtered their list with my method. We are going to disregard those who would have dropped all the links and been left with zero targets in their verified folder, although in some situations that could be the correct choice to make.

So we will be sticking with the people who decided to keep some targets, with any of the three breakdowns from above: 10,000 Mollom targets, 10,000 ReCaptcha targets, or the 20,000 made up of both. They activate their project and…

A Few Additional Factors

Now, all of this is made up of example stats to try to show you the kind of things happening in the background that you have little control over, other than actually filtering these domains out to keep your speed up.

There are a number of other steps webmasters can take on top of these that I did not want to go into in this post for various reasons, but one of them is adding human moderation before approving submissions. I purposely left this out of all the examples so I could stick with the links-per-minute metric, as I felt it was an easier metric to track across the three user groups.

Essentially, if a webmaster does take this step, you will still be able to submit to the target, but the link will rarely become verified. This is another thing the built-in clean-up tool is unable to detect but my list filtering method will.

That brings me to the end of this tutorial. I know it is a long read for the point I am trying to get across, but I hope it has given you a better understanding of the importance of filtering your verified lists.
