How to tackle the Google Mayday Update?

Sometime between 28 April and 3 May 2010, many webmasters noticed an unusual change in the SERPs. This has been named the Google Mayday Update. As always, the ranking algorithm was changed to make search results more relevant, but what was this update all about? The change is most noticeable in long tail keyword rankings. Many large sites saw a drop in traffic because they were no longer ranking for long tail keywords (or had dropped several places in the SERPs), and their traffic consequently plummeted.

Before the Google Mayday Update, a website was able to rank for long tail keywords based on domain authority. So if you had a finance website with a high PR homepage, domain trust and good rankings for your primary keywords, you could pick up long tail traffic easily (e.g. rank for "best car insurance for young drivers") if one of your finance articles mentioned a few of the long tail keywords being searched for. This was true even if the article sat quite far from the homepage (many clicks to get to it) and had few or no links pointing to it.

Solutions for the Mayday Update

Many ecommerce websites use generic product descriptions from manufacturers and, as you can guess, this gives rise to duplication. The problem is that a customer who wants to buy a particular product doesn't want to click on each of the search results and end up seeing the same description for the product everywhere. You want to see something unique, something that's going to help you in your decision-making process. This is why Google has started to filter out similar pages with the Mayday Update. It's supposed to improve the user experience and bring more relevance to searches.

Your first step if you have been hit by the Mayday Update is to look at the pages on your site and analyse them. If you have the same content that's on practically everyone else's site, then you have a problem. You need unique and original content, but that might be hard if you have thousands of pages. User generated content is one way to add uniqueness to your product pages: you can give users the facility to review the products on your website. Don't just show the good reviews and filter out the bad ones though! There needs to be a balance of reviews, both good and bad, otherwise people are going to find it unnatural and you'll eventually lose potential customers (more on that some other time).

The other thing you should do is create more links to the inner pages of your site. Many people focus on getting links to their homepage. That's good if you want your homepage to rank, but if your other pages are buried deep in your site, say five clicks from the homepage, chances are Google will not index them, and even if it does, it may not rank them well. The closer to the homepage the better, but if you can't manage that, you need external links pointing to these deep pages. This shows Google that the page contains worthy content.

Conclusion

What many people fail to realise is that you do not need to stay ahead of Google to ensure long term rankings. Google's aim is to provide the most relevant and accurate results for a given search query. If you align your website with that aim, you've already secured its future. So instead of changing your website every time Google makes an algorithm change, focus on creating a better user experience for your visitors and delivering better content and services to help them out.

Should you update your website/blog regularly to increase your rankings?

Many people believe that updating your website regularly is the way to achieve higher rankings in the search engines. This is not entirely true and I'll explain why. As the web index grows bigger and bigger, search engines like Google and Bing are looking for better ways to determine which content is worth including in their index and which to drop. Duplicate content is obviously not going to work for rankings, and if you're rewriting content by saying the same thing in a different way, that will not work either. You need to add something original to your articles to make them stand out from all the other articles on the same subject. This is the long term strategy and will protect you against future algorithm changes such as Google Caffeine.

Now coming back to why updating your website/blog regularly is not worth it if you're just recycling content, you need to understand how rankings come into play. Links play an important role in improving your SERP positions but they're not everything. Search engines try to provide the best results for any given query; that's their aim, and hence they make algorithm changes every now and then to refine the results you see into a closer match. If you were to have cosmetic surgery for some reason or another, you would go to a professional, wouldn't you? You would choose a surgeon who is known to deliver results, and this can be checked by looking at success rates, past customer experiences and the like. In the search engine world, the same principle applies. If you own a website about "cosmetic surgery", then you would need pages which talk about the different kinds of surgery available, for example liposuction, nose jobs, botox etc. By having multiple pages about the different aspects of "cosmetic surgery", you demonstrate your expertise in this area. You will be trusted in the "cosmetic surgery" industry and search engines will love you for that. This is how domain authority is achieved.

But wait, it's not as simple as it looks. If you're not providing compelling and original content which is of value to the visitors of your site, then you won't move up the rankings. Each new page that you put up on your site can boost the ranking of your keywords only if it is on-topic and original. Google needs to think it's good enough to keep in its index, and by targeting topics related to your keywords, you increase your power to rank for those keywords.

Creating new content on a daily/weekly/monthly basis can bring you extra traffic provided your content is unique. If these articles address terms related to the theme of your website, you will move up the SERPs over time as you gain more trust from the search engines through the targeting of secondary and long tail keywords. However, if you're adding articles just for the sake of keeping your website updated frequently, then you will not see any benefit from it.

New content will definitely bring GoogleBot to your site more often, but after Google has crawled your new content, it won't index it if it offers nothing of value. Therefore it is better to take the time required to write one good quality article rather than rushing to write dozens of articles which people are not going to appreciate. Don't write for the sake of writing; write to address problems that your visitors may encounter and give information that will be helpful to them, and in time you'll reap the benefits of this long term strategy.

Creating multisites with WordPress 3.0 is not for everyone!

A lot of people want to be able to run as many sites as they want with just one installation of WordPress. Before WordPress 3.0, you had to use WordPress MU (Multi User), which was not in line with the core WordPress codebase. You could run any number of sites on WordPress MU, but you might miss out on some functionality. This was not such a big deal for the many people who liked the idea of running several websites/blogs from a single installation, but with the release of WordPress 3.0, this has been made even simpler.

The greatest benefit of having many sites on a single WordPress installation is maintenance. You only need to update your files in one place rather than in several places whenever a new version comes out, and it is advisable to upgrade to the latest stable version of WordPress as soon as it's made available. So this is a big time saver in terms of updating as well as launching new sites.

When I first tried WordPress MU, I was very keen to get my sites running, but I eventually found out that to run different domains on WordPress MU, you need to set up wildcard mapping for subdomains. Because WordPress MU has been merged into WordPress 3.0, you have the same problem. I remember how difficult it was to set up and I eventually gave up.

So if you have your main blog at www.example.com and want www.myotherdomain.com to be hosted on the same WordPress 3.0 installation, you'll have to set up wildcard mapping. If you've got a dedicated server, you can get it done quite easily, but if you're on shared hosting this may be a problem because some hosting companies don't allow wildcard mapping. Even if you manage to do it, there's another problem: if you had an existing subdomain like subdomain.example.com, it will stop working because of the wildcard, so you'll have to come up with a way to handle that.
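
For reference, here is roughly what the WordPress side of the official setup looks like once the wildcard is in place. This is only a sketch of the wp-config.php constants the WordPress 3.0 network setup asks for, assuming a subdomain network on example.com; your host still has to add the wildcard DNS/Apache entry for *.example.com separately.

```php
<?php
// wp-config.php (excerpt) – a sketch of the WordPress 3.0 multisite constants,
// assuming a subdomain network on example.com.

// Step 1: enables the Tools > Network screen in the admin.
define('WP_ALLOW_MULTISITE', true);

// Step 2: after running the network setup, WordPress asks you to add these
// (it also gives you rewrite rules to paste into .htaccess).
define('MULTISITE', true);
define('SUBDOMAIN_INSTALL', true);            // false would give /site1/, /site2/ paths instead
define('DOMAIN_CURRENT_SITE', 'example.com');
define('PATH_CURRENT_SITE', '/');
define('SITE_ID_CURRENT_SITE', 1);
define('BLOG_ID_CURRENT_SITE', 1);
```

And as far as I know, mapping a completely separate domain such as www.myotherdomain.com onto one of the network sites needs a domain mapping plugin on top of all this.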

I think the idea of hosting multiple sites/blogs is very appealing, especially if you want to create niche websites with just a couple of pages. However, WordPress 3.0 is not ideal for that purpose, nor is WordPress MU. On the other hand, WP Hive, a plugin for WordPress, works brilliantly.

When I heard that WordPress 3.0 allows multisite creation, my first thought was that WP Hive would become redundant, but that's not the case. The official WordPress 3.0 documentation shows that you need wildcard mapping, whereas with WP Hive you only need to add the domain you want to run WordPress on as an addon domain.

I like simplicity above everything else. For those on shared hosting, I suggest you have a look at WP Hive if having multiple sites/blogs on one WordPress installation is what you need.

Website hacked, now what?

When you have a website, there's always the possibility that someone will try to hack into it sooner or later. If you operate a successful site, jealous people will want to bring it down, but that's not the only reason: your website could also be hacked if there's a security hole. Although I try my best to stay on top of the security of the sites I operate, one of my websites got hacked on 30 April 2010. The purpose of the attack was for the hacker to put some links on my website.

It might seem that the attack was not catastrophic since only links were uploaded, but if you look at it from my point of view, you will see how bad this hack was. First of all, the hacker was able to delete files from my server and create new directories and files. This was all done through a script which he managed to upload to my site. He deleted only one file, but that was enough to bring pretty much the whole website down, because everything is controlled by the .htaccess file, which is what was deleted. This resulted in 404 (page not found) errors, which were bad for my traffic (it plummeted to nearly zero), for user experience and for the almighty Google. Google crawls my site every day without fail, and it found so many 404s that day that I was scared it was going to remove all these pages from its index. Fortunately this didn't happen.

However, the hacker also created a few directories on my site where he uploaded pages with loads of links. My site had nothing to do with recipes, but the hacker uploaded recipe links and pinged about a hundred servers including Google, Yahoo and other search engines. Most of the services which were pinged followed the request and started crawling the recipe pages on my site. This is the worst kind of attack because it marks your site out as a spam site. A few days after the attack, Google Webmaster Tools had a message for me stating that the pages with recipe links it had crawled looked spammy and would be removed from the index, which was absolutely fine with me. I was relieved that they didn't penalise me in any way for having these spammy links on my site.

To make things worse, I was on holiday when my website got hacked. Fortunately for me, I like checking my website stats every day, and when I saw a drop in traffic, I first thought it had something to do with my hosting company (the server being down). I was really annoyed at them because my site had never been down for such a long time before, but that turned out not to be the cause. My second thought was that Google had stopped sending traffic to me (some penalty, a loss of organic rankings, etc.). But while I was investigating the problem, I saw that my website was returning 404 pages for nearly every piece of content on the site. That's when it clicked. I looked for the reason and realised my .htaccess file was no longer there. After getting the hosting company to restore the file from a backup, the website was operating fine again. I then changed all my passwords and analysed what went wrong.

I saw that files and directories had been created, and that could only mean one thing – my password had been compromised. I don't know how it happened, but it could have been worse, and fortunately I was able to get the site up and running the following day. If I hadn't noticed the problem just after it happened, I would have lost all traffic for at least a week along with the income the site makes, but most importantly, it would have taken the site a lot longer to rank again afterwards.
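
One precaution worth considering after an incident like this (not something I had in place at the time) is a simple checksum baseline of your files, so that new, modified or deleted files show up straight away. A rough sketch in PHP – the paths and file names are purely illustrative:

```php
<?php
// integrity-check.php – compare the current files against a saved hash baseline.
// Run "php integrity-check.php baseline" once to record the hashes, then run it
// periodically (e.g. from cron) to spot unexpected changes.
// All paths and names here are illustrative.

$root     = dirname(__FILE__);            // directory to watch
$baseline = $root . '/file-hashes.json';  // where the baseline is stored

function hash_tree($dir, $skip) {
    $hashes = array();
    $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($dir));
    foreach ($it as $file) {
        if ($file->isFile() && $file->getPathname() !== $skip) {
            $hashes[$file->getPathname()] = md5_file($file->getPathname());
        }
    }
    return $hashes;
}

if (isset($argv[1]) && $argv[1] === 'baseline') {
    file_put_contents($baseline, json_encode(hash_tree($root, $baseline)));
    echo "Baseline saved.\n";
    exit;
}

$old = json_decode(file_get_contents($baseline), true);
$new = hash_tree($root, $baseline);

foreach ($new as $path => $hash) {
    if (!isset($old[$path])) {
        echo "NEW FILE:  $path\n";   // e.g. an uploaded hacker script
    } elseif ($old[$path] !== $hash) {
        echo "MODIFIED:  $path\n";
    }
}
foreach (array_diff_key($old, $new) as $path => $hash) {
    echo "DELETED:   $path\n";       // e.g. a missing .htaccess
}
```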

Tracking PSA ads for maximum AdSense revenue

When creating AdSense ads, I usually tend to omit the option which allows you to specify an alternate ad URL in case AdSense cannot find any adverts in its inventory to display. This can be a costly mistake if your website shows Public Service Ads (PSAs) quite a lot.

The options you have in AdSense are to show a solid colour instead of the Google ads if no matching ads can be found, to use a collapsing ad unit, or to supply an alternate ad to display. The solid colour can be made to match your background, but the problem is that it leaves an empty space which is not going to earn you any money and looks rather odd from a layout point of view. The collapsing ad unit is quite clever and avoids the blank space, although there's a bit of setup to be done and it will nevertheless hurt your earnings. Supplying an alternate URL for another advert is the best choice – you can show your own banners to promote your own services/products, or show banners through affiliates, which can earn you some money rather than nothing.

I opted to supply my own alternate ad URL and decided to log the number of times PSAs were shown on my website. This would help me understand how much they were impacting my revenue, but I also wanted to know which pages were serving these nasty PSAs. So I coded the alternate ad URL to record each time a PSA was shown on my site, as well as the page which was showing it. Within 12 hours, I had already logged 6 PSAs, all of them coming from the same page. I guess the problem is that no matching ads could be found for that particular page because of the content (text copy), so I probably need to rework the keywords for that page.
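
For anyone wanting to do something similar, here is a minimal sketch of the kind of script you can point the alternate ad URL at – the log file and banner are placeholders rather than my actual setup:

```php
<?php
// psa-fallback.php – set this page as the AdSense "alternate ad URL".
// AdSense loads the alternate URL inside the ad iframe whenever it has no
// matching adverts, so every request here corresponds to one unfilled/PSA slot.

// The page that contains the ad unit shows up as the HTTP referrer.
$page = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : 'unknown';
$line = date('Y-m-d H:i:s') . "\t" . $page . "\n";

// Append to a simple tab-separated log (path is illustrative).
file_put_contents(dirname(__FILE__) . '/psa-log.txt', $line, FILE_APPEND | LOCK_EX);

// Serve my own banner instead of a blank space or PSA.
?>
<a href="http://www.example.com/my-service/" target="_top">
  <img src="http://www.example.com/images/house-banner.png" alt="House banner" width="300" height="250" />
</a>
```

The banner needs to match the dimensions of the ad unit it is standing in for, otherwise it will be cropped by the iframe.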

I'm quite happy that I implemented this solution, because now I can see where PSAs are being displayed and how often they appear, and on top of that I'm not losing out, because an alternate ad is being shown instead of the PSA!

And the rankings are back again

Over the last few weeks, I've been observing the fickleness of the Google UK SERPs, and to call them volatile is probably fair. If you are enjoying top positions for your keywords, you may want to keep doing the good things you're doing to stay ahead in the SERPs, whether that's providing good content or good marketing (link building).

The keywords I've been talking about are now at #8 on the first page after disappearing completely from the SERPs. It's not such good news when you consider that the page was at #4 a week ago, but then again, being on the first page among 55 million competing pages is still good and better than being nowhere in the SERPs. I must admit that I was really tempted to make some changes to the text copy of that webpage because it was gone for quite a while. It took 7 days to come back in the SERPs: it disappeared on 22 March 2010 and came back today, 29 March 2010. Let's see what happens next!

By the way, the number of pages indexed in Google seems to be going up and down a lot lately.

The importance of checking website server logs

Very rarely do I find the time to check the raw access logs for my websites. This is mainly because I've got Google Analytics installed, and since it provides pretty much all of the information I require, checking my server logs is not really necessary.

However, Google Analytics does not show you which search engine bots have been crawling your website, because bots do not execute the JavaScript which Analytics uses to track visitors. I had to download my server logs to check whether a particular webpage had been crawled. I did a search for "Googlebot" and got an audit trail of which pages had been crawled over the past month or so, but I also noticed a completely different IP for Googlebot. I know the standard range starts with 66 (e.g. 66.249.65.49), but this one was 114.152.179.5. So I did a reverse DNS check on that IP and, guess what, the IP which claimed to be Googlebot is not Googlebot at all. It belongs to someone in Japan who wrote a program to leech content off my website.
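
If you want to run the same kind of check yourself, a reverse DNS lookup followed by a forward lookup is enough: genuine Googlebot IPs reverse-resolve to a *.googlebot.com or *.google.com hostname, and that hostname resolves back to the same IP. A quick sketch in PHP (not the exact commands I used):

```php
<?php
// Check whether an IP claiming to be Googlebot really belongs to Google.
function is_real_googlebot($ip) {
    $host = gethostbyaddr($ip);                        // reverse DNS lookup
    if ($host === false || $host === $ip) {
        return false;                                  // no reverse record at all
    }
    if (!preg_match('/\.(googlebot|google)\.com$/i', $host)) {
        return false;                                  // not a Google hostname
    }
    return gethostbyname($host) === $ip;               // forward-confirm the hostname
}

var_dump(is_real_googlebot('66.249.65.49'));   // a 66.249.x.x address should pass
var_dump(is_real_googlebot('114.152.179.5'));  // the suspicious IP should fail
```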

The server logs revealed that it had requested a significant number of pages. At first I thought that Google was really loving my website and had decided to do a deep crawl, but in reality somebody was stealing content off my site. You can imagine my frustration! I have now blocked the IP address through .htaccess, and I will be checking my raw access logs more often in an attempt to ban future leechers. If you're a leecher and reading this, beware, because I'm on to you now!

Choosing between .uk or .com for domain name

For my next project, I wanted to start a website targeting an international audience. The idea behind this is that the more people I can get to the website, the better it is for me, and there's no reason why I can't do this, because English is spoken in many countries such as the UK and the USA.

The population of the UK is around 61,000,000 and that of the USA is around 300,000,000, which means the United States has roughly five times as many people as the UK. It makes sense to target the USA as well, and since the website is going to be mostly informative, that shouldn't really cause a problem because I wouldn't have to target services locally.

However, people in the UK tend to click on .uk extensions more than any other. Research carried out by Nominet showed that 77% of British users are more likely to favour .uk domains over .com, which means that if I register a .com website and manage to get a first page Google ranking, I am still likely to miss out on clicks from people searching in the UK. In America, people favour .com, .us, .net or even .org domains, and a .uk domain is not really appealing to them. So even if a country level domain benefits you for SEO purposes, you will not do well in terms of clickthroughs in the other market, because people are biased towards domain name extensions. I am part of that group as well: when I search on google.co.uk and find .com websites listed, I usually skip them until I reach a .co.uk or .org.uk listing. This normally happens when I'm looking for something specific to the UK.

So I was faced with the difficult decision of choosing the best domain extension for my website. I ran a search for the number of searches by country (UK and USA) and found that the keywords received 40,500 local searches in the UK for the month of February compared to 1,600 in the USA. That is a significant difference and the conclusion becomes obvious: I'd rather focus on the UK market than the USA because of the traffic volume, which means a .co.uk domain is the way forward.

Research is an important part of SEO, and targeting your local market is made easier with a country specific TLD. It is pointless to have a well ranking website with an extremely low clickthrough rate because people assume your website is not relevant to them due to the wrong domain name extension.

Google UK SERPs change once again

After much enjoyment of being at #4 in the SERPs for a popular phrase, I found out today that my webpage has disappeared once again. It was only a week ago that Google reinstated my rankings (actually improved them, because I was ranked #5 or #6 before), and that was after a really long wait of 4 weeks. Now everything seems back to how it was. I believe there are changes coming our way, and that's why the UK SERPs are not stable for the time being. It is really upsetting not to be able to find the webpage in 60 result pages of 10 results each – the ranking for that webpage has simply vanished once more.

I'm not going to do anything for now and hope the rankings come back by themselves again. I shall wait, and if things do not pick up after 4-6 weeks, then I'll start changing the things which I believe could be the cause.

Dropped out of SERPs completely for some keywords

In an earlier post about disappearing from the SERPs completely, I thought that too many internal links in the sitewide navigation of my site were causing a penalty that removed that particular page from the SERPs without a trace. I was hasty in removing the link to that webpage in an attempt to regain my previous rankings, but then it occurred to me that the other pages in the sitewide navigation would have shown similar penalties if that were the case. Five days after removing the link, I re-included it with the same anchor text as before, because I was convinced that was not the problem.

I then thought that it must have been a temporary glitch in the system which had dropped the webpage from the rankings for its keywords. The webpage was ranking among 50 million pages without any external links (or maybe just one link from an article directory), which shows that it had domain and page level trust from Google, and that alone was enough to rank it on the first page of search engine results. I decided to wait for a couple of weeks and see if the rankings would come back.

After 3 weeks, I was beginning to get really nervous as the rankings were not coming back. I wondered whether there was an Over Optimisation Penalty (OOP) on that page or whether something else was causing the problem. I ran the webpage through a spam checker tool and there was nothing much going on there, nothing spammy at all. I also validated the webpage against W3C standards, and there were some errors but nothing major enough to cause a disappearance from the SERPs. So the next thing was to look at the over optimisation filter.

I found that the keywords were underlined, bolded and italicised as well. Now that looks too spammy, right? I concluded that I must have tripped the OOP filter and that's why the webpage was not ranking. Mind you, the webpage in question was still indexed and still had a PageRank of 1, so the problem was only with ranking for the keywords it was optimised for. I removed the underlining, bold and italics from the content, changed my signature on an active forum to link to that webpage, and made a post hoping that it would get crawled quickly and Google would re-evaluate and re-rank the page.

The next morning when I checked the SERPs, I had my ranking back. Wow, I thought, things have worked out in the end! But wait a minute – was the penalty really caused by the over optimisation of that webpage? I checked my server logs to see if Google had fetched (crawled) the webpage the previous day after I made the changes, and I was shocked to find out that it hadn't. That means none of the changes I made had any effect on getting my rankings back, and this was confirmed when I viewed a cached copy of the webpage in Google.

I couldn't leave the changes in place, because on the next crawl Google would re-evaluate them and adjust the rankings accordingly, and since the changes were not what brought my rankings back, I decided I was better off removing them and returning the webpage to its original state. That is what I did, and now I'll be checking the SERPs over the next few days to see whether the rankings disappear again.

The conclusion is that it took nearly 4 weeks for Google to reinstate my rankings without me having to do anything. So if you see your rankings disappear for any reason, wait a few weeks (up to 6 weeks) before you start making changes and you might see them come back on their own. Otherwise you might end up messing everything up and losing your rankings forever!