
Essential GSA Search Engine Ranker Maintenance To Keep Your Rig Running Smoothly!

GSA Search Engine Ranker, like most tools, requires regular maintenance to prevent its performance from degrading. The level of maintenance required varies depending on what you are using the tool for as well as your own personal setup. Variables such as whether you are using a premium list, catch-all emails or an indexing service all come into play.

In this GSA SER maintenance tutorial, I will cover a number of maintenance tasks that I feel are essential, explain why I believe you should be completing them and describe the situations in which you may need to complete them.

Daily Servicing Tasks

To begin with, we will be looking at the tasks that I suggest become part of your daily scheduled maintenance for the tool.

Software Updates

Starting with a basic one, I consider updating GSA Captcha Breaker as soon as an update is available to be an essential task. Updates to this tool are usually based around adding newly supported captcha types, improving the solve rate of currently supported captchas or improving the speed of the tool. All three of these mean better link output from GSA Search Engine Ranker, as it will, in theory at least, have a higher success rate across a larger captcha pool as well as run quicker.

If you use an optical character recognition (OCR) or human-solved captcha service as your secondary captcha service, keeping GSA Captcha Breaker updated will help lower the thread count you need with the OCR service, as the tool should be failing on fewer captchas, and will save you money on wasted captcha credits from your human-solved captcha service.

With regards to updating GSA Search Engine Ranker itself, I have two trains of thought. The first is to update as soon as a release is available. In theory, an update is released either to add a new feature, update engine scripts or fix a bug. The scripts do require maintenance by Sven to keep them working when a content management system changes, and an update of this type will help increase your link count.

My second train of thought is to wait a few days after an update has been pushed and check the forums for bug reports from other users related to the update. There have been a number of times where an update has accidentally broken something else and a hotfix has had to be released to correct the mistake. If you follow the first train of thought, update instantly and there is a bug affecting the main link types you build, you could, in theory, lose that engine or engines until the hotfix is pushed.

A few months back I was kicking out millions of links per day, so to minimise the risk of losing engines I used the second method to keep GSA Search Engine Ranker up to date. I have recently downsized my SER operation to minimise costs while developing new methods, and my link output is tiny in comparison to what it used to be, so I have now started to update the tool as soon as I see an update is available. Which method you use to update your copy of SER is down to personal preference.

One thing I will quickly add is that SER often has many updates released in quick succession. You may end up in a situation where an update is released and you plan to wait three days for any bug reports to appear on the forum, but another update is pushed on the second day, so your wait time resets. Two days later another update is pushed, meaning that if you insist on waiting three days for bug reports before updating, you could potentially be waiting a week before running any update at all.

Rebooting Your VPS or Server

As with all computers, sometimes a reboot is just what is needed to clear any caches or processes that have randomly sprung up and are hogging resources. I have listed this as daily servicing as I am currently rebooting daily, but in the past I have completed it as a weekly servicing task. In my opinion, the timeframe between reboots of your VPS or dedicated server depends on what you are doing with the tool.

If you are running a smaller operation with a low daily link build then it is easy to reboot every day. If, on the other hand, you are trying to build out a massive tier three campaign to get your tier one and tier two links indexed, then for me personally getting that tier built would take priority over rebooting the VPS, and the reboot may slip by a few days or drop back to weekly servicing.
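If you do decide to reboot on a schedule, the restart itself is easy to automate. Below is a minimal sketch, assuming a Windows VPS, that you could point Windows Task Scheduler at during an off-peak hour; the log path and delay are illustrative choices of mine rather than anything from SER itself.

```python
# reboot_vps.py - a minimal sketch for automating the daily VPS reboot.
# The log path and delay are illustrative, not part of SER; adjust to taste.
import os
import subprocess
from datetime import datetime

LOG_FILE = r"C:\maintenance\reboot_log.txt"  # assumed log location
DELAY_SECONDS = 120  # grace period so SER can finish writing data

def reboot_with_delay() -> None:
    """Record the reboot time, then ask Windows to restart the machine."""
    os.makedirs(os.path.dirname(LOG_FILE), exist_ok=True)
    with open(LOG_FILE, "a", encoding="utf-8") as log:
        log.write(f"Reboot triggered at {datetime.now().isoformat()}\n")
    # shutdown /r restarts the machine, /t sets the delay in seconds.
    subprocess.run(["shutdown", "/r", "/t", str(DELAY_SECONDS)], check=True)

if __name__ == "__main__":
    reboot_with_delay()
```

Setting Task Scheduler to run a script like this once a day, at a time when your projects are quietest, means the reboot still happens on the days you forget about it.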

On a side note, I have published this guide on how I choose, set up and optimize my VPS machines to maximize the output from SER. Although it is not exactly servicing, I thought I would add the link to this post as it is good practice to go over those steps on a fresh VPS or server.

Verify Or Re-Verify Contextual Links

Depending on the size of your contextual link tiers and the number of daily links they are required to put out, it may be beneficial to turn verification off completely for the project. This allows the projects to focus on nothing but submitting as many links as possible to reach your required link count as fast as possible, rather than dropping into verification mode every so often and wasting time.

If you do choose to use this process then I highly recommend you change these GSA Search Engine Ranker projects from the active status to the active – verify only status once a day. This forces the projects to try to verify all of their submissions at once, minimizing project downtime.

If you are using this method then, as soon as the verification run has completed, I highly recommend you re-verify the contextual links that have been built in the past. To do this, right click the required project or projects, select show URLs and then select verified. You will be presented with the verified URL window, where you press the verify button at the bottom and then select all, as shown in the screenshot below.

GSA Search Engine Ranker re-verification of project URLs

Once this has completed you will have the option to purge all links from the project that either failed re-verification or timed out, purge only the links that failed, purge only the links that timed out, or purge none of the links, as shown in the screenshot below.

GSA Search Engine Ranker link purge window.

I recommend you do this as daily servicing because these are usually the links in your link building pyramid that you will be building additional links to. If, for example, you have 1,000 links in your tier one and 500 of them are deleted instantly, but you do not re-verify the links and let your tier two build links automatically, then you are wasting system resources building new links to 500 links that have been deleted.

Reset Data On Your Tier Three Blaster Projects

In the past, I have used massive tier three projects to build out huge link profiles of blog comments, guestbooks, image comments and trackbacks to my tier two projects. The goal was for the tier three projects to get my tier two projects indexed, which would in turn index and pass link juice on to my tier one projects.

The problem is that you may set link limits per target URL that are quickly met by the project, effectively stopping additional links from being built on that domain. Running the reset data option on the project removes the target URL history along with a number of other settings for the project, letting it start over from scratch.

Before doing this on live projects, I recommend you first run it on a burner project set up to build links to a dummy URL such as http://resetdatatest.com to see whether it offers any improvement to the links-per-minute count on the VPS or server you are running. It may have no effect on lower-specced systems, as GSA Search Engine Ranker will already be hitting their resource limits, effectively giving you a hard cap on the links per minute you are able to build with the tool.

To run the reset data option simply select the project or projects you wish to run the process on, set them to inactive and complete the navigation shown in the screenshot below.

GSA Search Engine Ranker Reset Project Data Tool

Add New Emails Into Projects

If you are not currently using a catch-all email service then, depending on what you are doing with your projects, you will have to refill their email accounts on a regular basis. I have previously used this service from Fiverr and the accounts provided have a very high retention rate for SER usage, but at $5 ($5.50 when you add on the current processing fees) per 200 accounts, against unlimited catch-all accounts starting from only $6.50 at current prices, it is a no-brainer to move over to a catch-all service.

[Update] I have released this post where I explain why I use a catch-all email service and how making the switch reduced my email account cost by over 90% while increasing my daily link output.

I currently use catch-all email services, so I have no need for this task on my scheduled maintenance, but if you have a large number of contextual projects then you will need to replace your email accounts on a daily basis, as each account can only be used once per contextual domain.

When you add your new email accounts to the project email window, be sure to click the test emails button on the right-hand side of the window. There is nothing worse than adding a fresh batch of email accounts to your project, letting it run, then realizing a few days later that all of the email accounts had been deleted before you even added them to the project.
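If you are adding accounts in bulk, you can also sanity check them outside of SER before importing. The sketch below is just an illustration of that idea: it assumes a plain text file of email:password pairs and an IMAP host from your provider, both of which are placeholders rather than anything from this tutorial.

```python
# check_emails.py - a rough sketch for bulk-testing email accounts before
# importing them into SER. The accounts.txt format (email:password per line)
# and the IMAP host are assumptions; use the details your provider gives you.
import imaplib

IMAP_HOST = "imap.example.com"  # placeholder - replace with your provider's host
ACCOUNTS_FILE = "accounts.txt"

def account_is_alive(email: str, password: str) -> bool:
    """Return True if the mailbox accepts an IMAP over SSL login."""
    try:
        conn = imaplib.IMAP4_SSL(IMAP_HOST)
        conn.login(email, password)
        conn.logout()
        return True
    except (imaplib.IMAP4.error, OSError):
        return False

if __name__ == "__main__":
    with open(ACCOUNTS_FILE, encoding="utf-8") as handle:
        for line in handle:
            line = line.strip()
            if not line or ":" not in line:
                continue
            email, password = line.split(":", 1)
            print(("OK  " if account_is_alive(email, password) else "DEAD") + f" {email}")
```

Anything flagged as dead can be binned before it ever reaches a project, which is exactly the situation the test emails button is there to prevent.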

Filtering Your Verified List

It doesn't matter whether you are scraping your own list or using a premium service: I consider filtering your list essential maintenance. There are a number of ways you can run the process, including doing it daily, weekly, or running a constant filter and pushing the filtered URLs to your verified list as you need them.

I explain the full process of how to filter your lists in this post and share a case study here where I take a list from 75.77 links per minute and filter it, bringing it up to 763.33 links per minute.

Weekly Servicing Tasks

We are now moving on to what I suggest becomes part of your weekly scheduled maintenance for GSA Search Engine Ranker. However, as you may have guessed already, what you are doing with the tool can affect the timeframe in which you complete these tasks.

Project Backups

Backing up your project data should be of the utmost importance to you! This is just one example I saved to my Evernote notebook of a user who had not backed up their GSA SER projects for months. There was a problem with an update that had been pushed out (see the importance of the second strategy above of waiting a few days before updating now?) and it prevented users from opening their SER instances once they had updated.

The user could have lost months of project data due to not taking the time to complete a simple servicing task.

I have always taken weekly backups. How you incorporate them is down to personal preference, but I suggest they go on either your daily or weekly servicing schedule rather than monthly!

To do a full project backup, simply press the stop button in SER, wait for the active thread count in the bottom left of the tool to drop to zero, right click on the project pane and complete the navigation shown in the screenshot below. Once the backup is complete, I save it to Dropbox.

GSA Search Engine Ranker Full Project Backup.
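Because the backup ends up as a file on disk, copying it off the machine is easy to script. The sketch below is one way of doing it under my own assumptions: both paths and the backup file pattern are placeholders for wherever you save the backup and wherever your Dropbox client syncs, so adjust them to match your setup.

```python
# archive_backup.py - a small sketch for copying the latest SER project backup
# into a Dropbox-synced folder with a dated filename. The paths and the
# backup file pattern are placeholders; match them to your own setup.
import shutil
from datetime import date
from pathlib import Path

BACKUP_DIR = Path(r"C:\ser_backups")            # where you save the backup file
BACKUP_PATTERN = "*.sl"                          # assumed extension - adjust if yours differs
DROPBOX_DIR = Path(r"C:\Users\me\Dropbox\ser")   # Dropbox-synced destination

def archive_latest_backup() -> Path:
    """Copy the newest backup file into Dropbox with today's date in the name."""
    backups = sorted(BACKUP_DIR.glob(BACKUP_PATTERN), key=lambda p: p.stat().st_mtime)
    if not backups:
        raise FileNotFoundError(f"No backup files found in {BACKUP_DIR}")
    latest = backups[-1]
    DROPBOX_DIR.mkdir(parents=True, exist_ok=True)
    target = DROPBOX_DIR / f"{date.today().isoformat()}_{latest.name}"
    shutil.copy2(latest, target)
    return target

if __name__ == "__main__":
    print(f"Archived backup to {archive_latest_backup()}")
```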

Verified List Backups

Similar to project backups, I highly recommend that, as a minimum, you back up the folders containing your verified list data so you have something to go back to if your current lists are lost for whatever reason. It doesn't matter whether you are using my method to filter your lists as explained here, using a premium list for targets or scraping your own list with GSA Search Engine Ranker: backing your lists up should be part of your weekly servicing schedule. Save the backups to Dropbox, as you never know when you will need them next.

To back up your folders in SER simply press the options button, go to the advanced tab and complete the navigation below.

GSA Search Engine Ranker back up list folders.
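If you would rather keep dated copies of the list folders themselves, a few lines of scripting will zip them up for you. This is only a sketch under my own assumptions: the folder paths are placeholders for whatever is shown next to your site lists on the advanced tab, and the destination is simply a Dropbox-synced folder.

```python
# backup_site_lists.py - a minimal sketch for zipping SER's site list folders
# as a weekly backup. The folder paths are placeholders; use the locations
# shown next to your site lists on the advanced tab.
import shutil
from datetime import date
from pathlib import Path

SITE_LIST_DIRS = [
    Path(r"C:\GSA\site_lists\verified"),    # placeholder path
    Path(r"C:\GSA\site_lists\identified"),  # placeholder path
]
ARCHIVE_DIR = Path(r"C:\Users\me\Dropbox\ser_lists")  # Dropbox-synced destination

def backup_folders() -> None:
    """Create one dated zip archive per site list folder."""
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    stamp = date.today().isoformat()
    for folder in SITE_LIST_DIRS:
        if not folder.is_dir():
            print(f"Skipping missing folder: {folder}")
            continue
        archive_name = ARCHIVE_DIR / f"{folder.name}_{stamp}"
        # make_archive appends .zip and compresses the whole folder tree.
        shutil.make_archive(str(archive_name), "zip", root_dir=folder)
        print(f"Created {archive_name}.zip")

if __name__ == "__main__":
    backup_folders()
```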

Remove Duplicates If Self Scraping Targets

With both footprint scraping and link extraction, there will be a large amount of duplicate data. Unfortunately, this is unavoidable and the only thing we are able to do is remove this duplicate data from our folders in an attempt to minimize the number of total targets your projects have to process.

For example, say you use list extraction and pull 10,000 target URLs to process but 6,000 of them are duplicates. You can either process the full 10,000 and waste resources or you can purge the 6,000 duplicates so GSA Search Engine Ranker only has to process the 4,000 unique targets.
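The logic behind the purge is nothing more than keeping the first copy of each URL and throwing the rest away. The sketch below illustrates that on a raw scrape file before it ever reaches SER; the filenames are placeholders, and SER's own remove duplicates option, covered next, does the same job on its site list folders.

```python
# dedupe_targets.py - an illustration of the deduplication step on a raw
# scrape file before importing it into SER. Filenames are placeholders.
def dedupe_file(source: str, destination: str) -> tuple[int, int]:
    """Write only the first occurrence of each URL, preserving order."""
    seen = set()
    total = kept = 0
    with open(source, encoding="utf-8", errors="ignore") as src, \
         open(destination, "w", encoding="utf-8") as dst:
        for line in src:
            total += 1
            url = line.strip()
            if not url or url in seen:
                continue
            seen.add(url)
            dst.write(url + "\n")
            kept += 1
    return total, kept

if __name__ == "__main__":
    total, kept = dedupe_file("scraped_targets.txt", "unique_targets.txt")
    print(f"{total} scraped URLs -> {kept} unique targets, {total - kept} duplicates removed")
```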

Both your scraping method and the system resources you use for scraping will dictate how often you have to do this. I have put the task down as weekly maintenance for this tutorial, but in the past I completed this process multiple times per day, back when I used to build my own lists and gathered millions of targets per day to process.

To remove duplicates, click the options button, select the advanced tab and complete the following navigation.

GSA Search Engine Ranker Remove Duplicates

I highly recommend you filter your lists using the process I explain here, as this removes the need for this servicing task: the filtering process removes duplicates for you and provides a healthier overall list.

Monthly Servicing Tasks

I currently have no servicing that I complete on a monthly basis but if I find some I will add it here.

Wrapping It Up

I hope this has helped a few readers understand how best to maintain their tools and helps them run a faster, safer link building program.
