Welcome to my introduction to my Black Hat SEO case study. If you are on this blog then I hope you are familiar with the concept of black hat search engine optimization, but on the off-chance you are not, it is essentially attempting to game the search engines by going against their guidelines to get ahead. Although it can be profitable at times, I have previously spoken about how Penguin 4 and a number of unannounced updates have smashed my money site network to pieces.
Since taking these hits and deciding to take a break from black hat SEO I have started a white hat SEO site to see how it performs. I have also started planning for a YouTube-based traffic leaking project that I am extremely excited about.
As some of you may know, Google recently rolled out their Fred update and, true to form, one of my sites took a hit, as seen in the screenshot above. Now let me quickly confirm: I am under no illusion that Google owe me anything. I know I am going against their guidelines and run the risk of having my sites hit in doing so.
The main thing that annoyed me about the Fred update was that a large number of people were hit by it and to this day Google have not released any official guidance on what to do if you were affected or how to prevent your future sites from being hit. There are a large number of affected people in this thread over on Blackhatworld, as well as this thread over on the GSA forum, who are currently attempting to troubleshoot the rollout.
Due to this lack of information from Google about what Fred targets, I have decided to cut my break from black hat SEO short and start a new project. Now, I have no idea if this is going to work, as I have previously explained that all of my black hat sites have been negatively affected in some way since Penguin 4. On the flip side, I have managed to tweak each site when moving forward to try and work out the cause, so with any luck this project will work, start earning, and allow me to scale back into the black hat SEO space.
How I Plan To Do This
I plan to break the majority of the guidelines I am attempting to stick to in my white hat case study in an attempt to decrease the time it takes for my site to rank and begin to earn money. As many of you know, private blog networks are not my thing for a number of reasons so I plan to use nothing but automated tools in this case study.
Here are a few of the Google guidelines as well as how this project will be breaking them.
Automatically Generated Content
Going against Google's guidelines, I will be using nothing but spun content, with the majority, if not all of it, automatically generated, spun, and unreadable to a human. I plan to use an automated tool to meet my content needs for this project, although I may use some manually spun, human-readable content for some tier one properties.
With GSA releasing their content generation tool soon, I will probably invest in that, as Sven has always been very responsive to suggestions from the users of his tools, whereas the Kontent Machine team have only ever replied to my emails saying my suggestion is not currently supported, and a few years later…it still isn't.
Participating In Link Schemes And Using Automated Tools
GSA Search Engine Ranker will be doing the vast majority of the heavy lifting for the project's link requirements, with one of the automated web 2.0 tools I have access to helping it out in a few areas as required, although I am considering using a manual web 2.0 service for tier one.
Although I am still toying with how I want the tiers to play out, as well as whether or not I will use non-contextual link types for this project, the diagram below gives a very rough indication of my current thinking. It uses a traffic light system of green, yellow, and red to show how safe/stable I currently believe each of the link types to be.
The automated content generation tools I plan to use for the project essentially scrape various search engines and article directories for relevant content and then automatically spin it. The backlink creation tools will then take this content and post it online at various locations to artificially create link juice towards my website. As you can see from the tiered diagram above, there will be plenty of scraped content being used.
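To give you an idea of what the spinning step actually does under the hood, here is a minimal sketch of expanding spintax (the `{option1|option2|option3}` syntax these tools output). This is my own illustration, not the implementation of GSA's tool or Kontent Machine:

```python
import random
import re

def spin(text: str) -> str:
    """Expand spintax like {a|b|c} by picking one option per group.

    Resolves the innermost {...} group first, so nested spintax
    such as {x{1|2}|y} also works.
    """
    pattern = re.compile(r"\{([^{}]*)\}")  # innermost brace group
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

# Each call produces one "unique" variation of the template
spun = spin("{Great|Top|Best} {vacuum cleaners|vacuums} for {2017|this year}")
```

A tool run thousands of times over the same scraped template produces thousands of variations, which is why the output is rarely readable to a human.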
Participating In Affiliate Programs Without Adding Sufficient Value
As this money site will be using the Amazon affiliate program, my money site content will be human-written and will add value to the reader, as I want to try and push people to Amazon who read past my initial comparison table.
That being said, the pages will be very aggressive in attempting to get readers to Amazon. I counted the links on a page of one of my other sites and a 2,300-word article contains 73 Amazon affiliate links; I expect this project to be similar.
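If you want to run the same count on your own pages, a short stdlib-only script does the job. The `tag=` check below assumes standard Amazon affiliate URLs, which carry a tracking `tag=` query parameter:

```python
from html.parser import HTMLParser

class AffiliateLinkCounter(HTMLParser):
    """Count <a> tags whose href looks like an Amazon affiliate URL."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            # Amazon affiliate links include a tag= tracking parameter
            if "amazon." in href and "tag=" in href:
                self.count += 1

page = """
<p>Check the <a href="https://www.amazon.com/dp/B000X?tag=mysite-20">price</a>
and this <a href="https://example.com">unrelated link</a>.</p>
"""
counter = AffiliateLinkCounter()
counter.feed(page)
print(counter.count)  # → 1
```

Dividing that count by the article's word count gives a rough link density figure you can compare across pages.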
I have read a number of people reporting that Fred targets over-optimization of ads and affiliate links, but I posted some of my own findings over on the GSA forum and I am leaning more towards it being based around over-optimization of anchor text ratios.
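To make the anchor text ratio idea concrete, here is a sketch of how you might check a link profile for a dominant exact-match anchor. The anchor list and the 40% threshold are purely illustrative assumptions, not a documented Google limit:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink crawl or project export
anchors = [
    "best vacuum cleaner", "best vacuum cleaner", "best vacuum cleaner",
    "click here", "example.com", "best vacuum cleaner", "vacuum reviews",
]

counts = Counter(anchors)
ratios = {anchor: n / len(anchors) for anchor, n in counts.items()}

# Flag any anchor that dominates the profile; 0.40 is an arbitrary
# illustration of "over-optimized", not a known ranking threshold
over_optimized = {a: r for a, r in ratios.items() if r > 0.40}
```

In this toy profile the exact-match anchor makes up over half of all links, which is the kind of skew I suspect Fred reacts to.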
Time For A Prediction
In all honesty, I don’t think this case study is going to work in the traditional way of ranking a site in the top three for a keyword to pull traffic. I have tried a number of things since Penguin 4 and have never been able to get a site’s traffic to lift off, but I have learned a fair bit from those test sites and changed a fair few things in my strategy over the past few months.
Instead, I have been looking into packing a large number of secondary and LSI keywords into articles to try and pull enough traffic to earn from the site without it being ranked in the top three for any of its main keywords.
In an attempt to maximize my chances on both fronts, I have provided my writer for this project with templates that target a specific keyword while including a large number of secondary, LSI, and buzz keywords such as “cheap” and “review” throughout the article, to see how it goes.
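Checking a draft against the template is easy to automate. Here is a tiny sketch that reports which target keywords actually made it into an article; the draft text and keyword list are made-up examples:

```python
def keyword_coverage(article: str, keywords: list[str]) -> dict[str, bool]:
    """Report which target keywords appear in the article text."""
    text = article.lower()
    return {kw: kw.lower() in text for kw in keywords}

# Hypothetical draft and target list, for illustration only
draft = "Our cheap vacuum review covers the best cordless vacuum for pet hair."
targets = ["cheap", "review", "cordless vacuum", "robot vacuum"]
coverage = keyword_coverage(draft, targets)
```

A quick pass like this before publishing catches templates where the writer dropped a secondary keyword entirely.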