Anyone who has been involved in any black hat search engine optimization forum over the past few years has probably seen the term “kitchen sink” used when discussing tiered link building. Usually, it refers to the bottom tier of the campaign pyramid, but the term offers no real information about what that project is actually building. I absolutely hate this term and honestly believe it causes more harm than good!
I hate the term so much that I decided to do a case study to find out what link types these projects actually create and how useful they could potentially be with my current thinking and methods. In my opinion, the term “kitchen sink” came from the widely used phrase “everything but the kitchen sink” and is used to describe projects set up in tools such as GSA Search Engine Ranker with all engines, platforms, and link types enabled, essentially giving the tool free rein to build everything it is capable of, though this definition does vary from user to user.
In the past, I have assisted countless users over on the GSA Forum via private messages and TeamViewer sessions where the user was complaining of either low links per minute or low indexing rates, and the main cause almost always turned out to be a “kitchen sink” project used as tier three.
At the time of writing, I am still recovering from the Penguin 4 rollout and developing my new methods. I have a few keywords climbing nicely, and it seems I have also managed to recover one of the domains I believe was affected by the rollout. All that being said, the opinions shared in this post are accurate at the time of writing but may change in the future.
How Do I Plan To Run The Case Study
I plan to create a brand new project and enable every engine, platform, and link type, meaning the tool will have free rein over its capabilities. It will use its internal logic to prioritize its link types and create links as it decides.
All of the users I have helped who were running a “kitchen sink” tier in their campaign have either been pulling their targets directly from an unfiltered premium list or letting the tool scrape for itself. If you are using either of these methods, I highly recommend you read my tutorial on filtering your list here, as it can provide a massive increase in link output. For this test I will use a premium list to provide targets to the project, as self scraping takes much longer.
I plan to leave the project set to active for two full days. Although in my usual tier three campaigns I have automatic link verification turned off in the project options, for this test I will turn it on to give me some data to judge the project by. Re-verification will be turned off to prevent the tool from removing links that drop over time, as this test is focusing on the link yield of a kitchen sink project, not link retention.
Time For A Prediction!
From what I have seen when helping users with this method, GSA Search Engine Ranker seems to prioritize the weaker link types and create a large number of them rather than anything of real value. As I am pulling targets from an unfiltered list, I expect the link yield to be very low and dominated by no follow links.
The Results Are In!
As you can see, my kitchen sink project only managed to build 14973 URLs over two full days! That means the project was running at just over 5 links per minute. To put things in perspective, I had a tier three project running at 763 links per minute for a case study a few weeks back. That project would have generated the same link yield in around 20 minutes while having a much better effect on the overall campaign, as it is specifically designed for indexing.
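The rate comparison above is easy to sanity-check. Here is a quick back-of-the-envelope sketch in Python using the figures quoted in the text (14973 URLs over two days, and the 763 links-per-minute rate from my earlier case study):

```python
# Back-of-the-envelope check of the link rates quoted above.
total_links = 14973      # verified URLs from the kitchen sink project
minutes = 2 * 24 * 60    # two full days, in minutes

lpm = total_links / minutes
print(f"kitchen sink rate: {lpm:.1f} links per minute")  # ~5.2 LPM

# The earlier tier three project ran at 763 LPM; at that rate the
# same yield would take roughly:
fast_lpm = 763
print(f"time at {fast_lpm} LPM: {total_links / fast_lpm:.0f} minutes")  # ~20 minutes
```

So the kitchen sink project spent 48 hours producing what a focused tier three project can produce in about 20 minutes.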
The screenshot above shows what GSA Search Engine Ranker creates for the indexer platform. The project created 6360 similar links, accounting for just over 42% of the total link yield. The page is very thin with minimal content and no media, and unlike a blog comment or guestbook post, it does not have a high chance of being in the Google index by default. It offers no real help in directing the Google crawlers through your tiers unless you spend premium indexing credits on the links, and in my opinion it is a totally useless link.
Above is a screenshot of a blog comment produced by the project. Essentially, the page can be broken down into two parts. The upper part is an article created by the webmaster or publisher on the domain. It is usually text-based with some form of media and, more often than not, is already indexed in Google.
Then we have part two at the bottom of the page: the comment section. This is the area where we are able to add our link. Although the tool does offer the ability to add relevant content here, I usually just use a generic “great blog post” type reply, as the amount of unrelated content made up of the webmaster’s post and other comments on the page means our post will have little to no overall relevance anyway.
As the post is usually already indexed in Google, our little comment attaches to the bottom of the post and by default is also indexed by Google. This means that if the link is do follow, it can be used to direct search engine spiders through our tiers, helping to index our links.
The screenshot above shows that the project created 3087 total blog comment links, almost 21% of the project’s total link yield, with only 150 of them being do follow. This means that, in my opinion, only 150 of the links created on the blog comment platform are usable on live projects, reducing the percentage of usable links down to only 1%.
The screenshot above shows the type of page produced by the exploit platform. I am not a coder or network administrator and have no idea what this data is, but it is not niche relevant or already held in the Google index like a blog comment, so I personally see no reason to be building them. The project created 1858 of these exploit pages, meaning they made up just over 12% of the total link yield.
Above is a screenshot of a page created by the article platform when told to use the article link type in the project options tab. You can read my post on setting the article platforms up to generate either articles or profiles here to help increase your chances of the tool producing pages like the one in the above screenshot rather than a profile.
That being said, during some recent testing I came up with a theory, and Sven confirmed it is possible using the built-in macros in this thread on the GSA Forum. If my theory is correct, I can see the Article Profile, Forum Profile, and Forum Post link types becoming much more useful to me in the future, and my method moving back towards the one I explained in this thread over on BlackHatWorld. Currently, that is all just theory, so until it is proven correct I will still class them as lesser link types compared to the article link type.
As you can see, the article in the screenshot above contains a good few hundred words. This particular one does not have any automatically included media such as images or videos, but some of the platform scripts can still include this by default depending on the project settings. The article has been created on a fresh new page on the domain, and provided the webmaster has blog comments turned off for their domain, no one else can post to it. That makes it a relatively safe platform to have on your tier one directly touching your money site, even when made using automatically generated content as shown in the screenshot.
The project created 1506 articles, making up 10% of the total link yield; as you can see from the screenshot above, 1366 of the links created are do follow, making up 9% of the total link yield. Unfortunately, the users I have seen use this strategy seem to use their “kitchen sink” projects as the last tier in their pyramid, meaning these articles will not index without the use of a premium indexing service and therefore offer little to no benefit to your tiers. For more information on natural indexing rates of links, read my case study here.
As no other single platform’s link yield makes up more than 5% of the project’s total, I will skip their full breakdown and move on to what I consider to be the useful links created by the project.
Out Of All Those Links, What Was Worth It?
The screenshot above shows the portion of the project’s link yield made up of platforms that I currently consider to be useful. I have a future post planned to cover each of the platforms: what they build, whether I think they are useful, potential uses for each link type, and which links I consider safe for tier one. For now, you are just going to have to take my word for it.
I currently consider articles, social networks, and wikis to be the A Grade platforms, with blog comments, guestbooks, and image comments being B Grade platforms provided their domains produce a do follow link. I currently see trackbacks as a gray area; I used to use them in my tier three campaigns along with the rest of the B Grade platforms but decided to drop them a few months back after a friend said he traced a potential problem in his pyramid to their use.
As you can see in the screenshot above, only 1815 of the links are do follow, meaning I consider just over 12% of the project’s total link yield to be useful by my current standards. That being said, this does presume you are using a premium indexing service on the article, wiki, and social network links created in the tier; otherwise I would drop them too.
What I Make Of It All
As the users I have seen run kitchen sink projects seem to use them as a tier three, I can only presume they are trying to index their tier one and two links by sending crawlers through the pyramid. If they wanted to add additional power to their tier two, they would be much better off using another contextual tier and sending those links to an indexing service.
The screenshot above shows the link yield from the “kitchen sink” project after removing all link platforms other than the ones I would currently leave in my own tier three projects. As you can see, there are only 407 links remaining; that’s under 3% of the project’s total link yield, and keep in mind it took the project 48 hours to create them!
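For anyone who wants to double-check the percentages quoted throughout this post, here is a small sketch that recomputes them from the raw link counts taken from my screenshots:

```python
# Raw verified-link counts from the kitchen sink project's screenshots.
total = 14973
counts = {
    "indexer": 6360,
    "blog comments (all)": 3087,
    "blog comments (do follow)": 150,
    "exploit": 1858,
    "articles (all)": 1506,
    "articles (do follow)": 1366,
    "useful do follow links": 1815,
    "platforms kept in my tier three": 407,
}

# Print each count's share of the total link yield.
for name, n in counts.items():
    print(f"{name}: {n} links, {n / total:.1%} of total")
```

Running this reproduces the figures above: the indexer platform at just over 42%, blog comments at almost 21% (with do follow comments at 1%), exploit pages at just over 12%, articles at 10%, the useful do follow links at just over 12%, and the platforms I would actually keep in a tier three at under 3%.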
The screenshot above shows the verified link yield for a different project I created, one that I currently use on live campaigns to index links. It uses a list filtered using the method I explain in this post, and the project is set up as I explain in this post. It took less than three minutes to reach 532 verified do follow links, beating the link yield of the kitchen sink campaign that took two full days.
I hope this post has helped some readers understand my frustration with the term kitchen sink, as well as with people actually using these types of campaigns on their live projects. If you are managing to rank with them then great, but if not, perhaps it is time to try something new. If you are currently running a kitchen sink campaign, imagine if you had the project above running for 48 hours instead of the “kitchen sink” project; imagine the difference it could make to your whole campaign!