Tracking PSA ads for maximum AdSense revenue

When creating AdSense ads, I tend to skip the option that lets you specify an alternate ad URL for when AdSense cannot find any matching adverts in its inventory. This can be a costly mistake if your website is showing Public Service Ads (PSAs) a lot.

The options you have in AdSense are to show a solid colour instead of the Google ads when no matching ads can be found, to use a collapsing ad unit, or to supply an alternate ad to display. The solid colour can be matched to your background, but the problem is that it leaves an empty space which earns you nothing and looks rather odd from a layout point of view. The collapsing ad unit is quite clever, and although it avoids the blank space, it takes a bit of setup and will still hurt your earnings. Supplying an alternate URL for another advert is the best choice – you can show your own banners to promote your own services/products, or run affiliate banners which can earn you something rather than nothing.

I opted to supply my own alternate ad URL and decided to log the number of times PSAs were showing on my website. This would help me understand how much they were impacting my revenue, but I also wanted to know which pages were serving these nasty PSAs. So I coded the alternate ad URL to record each time a PSA was shown on my site, along with the page that was showing it. Within 12 hours, I had already logged 6 PSAs, and all of them came from the same page. I guess the problem is that no matching ads could be found for that particular page because of its content (text copy), so I probably need to rework the keywords for that page.
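For anyone wanting to do something similar, the idea is simple: the alternate ad URL points at a page you host, so that page can log each request before serving a house banner. Below is a minimal sketch in Python (the port, log file name and banner markup are made up for illustration, and it assumes the Referer header carries the page that displayed the PSA – if your setup strips it, pass the page URL as a query parameter from the ad code instead).

```python
from datetime import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

LOG_FILE = "psa_log.txt"  # hypothetical log file for PSA hits

# Hypothetical house banner served whenever AdSense falls back to the alternate URL.
FALLBACK_HTML = b"""<html><body>
<a href="https://example.com/my-product"><img src="/banners/house-ad.png" alt="house ad"></a>
</body></html>"""

class PSAHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Record when the fallback was requested and (via Referer) which page showed the PSA.
        page = self.headers.get("Referer", "unknown")
        with open(LOG_FILE, "a", encoding="utf-8") as log:
            log.write(f"{datetime.now().isoformat()}\t{page}\n")
        # Serve the house banner so the ad slot is never left empty.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(FALLBACK_HTML)

if __name__ == "__main__":
    HTTPServer(("", 8000), PSAHandler).serve_forever()
```

Tallying the log file by page then shows exactly which URLs keep triggering PSAs and roughly how much inventory is going unfilled.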

I’m quite happy, though, that I implemented this solution, because now I can see where PSAs are being displayed and how many times they appear, and on top of that I’m not losing out, because an alternate ad is shown instead of the PSA!

And the rankings are back again

Over the last few weeks, I’ve been observing the fickleness of the Google UK SERPs, and to call them volatile is putting it mildly. If you’re enjoying top positions for your keywords, you’ll want to keep doing the things that got you there – providing good content and good marketing (link building) – to stay ahead in the SERPs.

The keywords I’ve been talking about are now at #8 on the first page after disappearing completely from the SERPs. It’s not such good news when you consider the page was at #4 a week ago, but being on the first page among 55 million competing pages is still good, and better than being nowhere in the SERPs. I must admit I was really tempted to make changes to the text copy for that webpage because it was gone for quite a while. It took 7 days to come back: it disappeared on 22 March 2010 and returned today, 29 March 2010. Let’s see what happens next!

By the way, the number of indexed pages in Google seems to be going up and down a lot lately.

The importance of checking website server logs

Very rarely do I find the time to check the raw access logs for my websites. This is mainly because I’ve got Google Analytics installed, and since it provides pretty much all of the information I require, checking my server logs rarely feels necessary.

However, Google Analytics does not show you which search engine bots have been crawling your website, because bots do not execute the JavaScript that Analytics uses to track visitors. I had to download my server logs to check whether a particular webpage had been crawled. I searched for “Googlebot” and got an audit trail of which pages had been crawled over the past month or so, but I also noticed a completely different IP claiming to be Googlebot. I know the standard Googlebot range starts with 66 (e.g. 66.249.65.49), but this one was 114.152.179.5. So I did a reverse DNS check on that IP and, guess what, the IP that claimed to be Googlebot is not Googlebot at all. It belongs to someone in Japan who has written a program to leech content off my website.
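If you want to run the same check yourself, here’s a rough Python sketch of the usual reverse-DNS-plus-forward-confirm test: a genuine Googlebot IP reverse-resolves to a hostname ending in googlebot.com or google.com, and that hostname resolves back to the same IP. The two IPs below are just the examples from above.

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Return True only if the IP reverse-resolves to a Google hostname
    and that hostname forward-resolves back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]              # reverse DNS (PTR) lookup
    except socket.herror:
        return False                                    # no PTR record at all
    if not host.endswith((".googlebot.com", ".google.com")):
        return False                                    # hostname is not Google's
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward-confirm the hostname
    except socket.gaierror:
        return False
    return ip in forward_ips

if __name__ == "__main__":
    for ip in ("66.249.65.49", "114.152.179.5"):
        print(ip, "genuine Googlebot" if is_real_googlebot(ip) else "impostor")
```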

The server logs revealed that it had requested a significant number of pages. At first I thought Google really loved my website and had decided to do a deep crawl, but in reality somebody was stealing content off my site. You can guess my frustration! I have now blocked the IP address through .htaccess, but I will be checking my raw access logs more often in an attempt to ban future leechers. If you’re a leecher and you’re reading this, beware, because I’m on to you now!
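Checking the raw logs regularly is a lot less tedious with a small script. Here’s a quick sketch (assuming the standard Apache common/combined log format, where the client IP is the first field, and a file name of access.log – adjust both for your own setup) that counts requests per IP so unusually heavy clients stand out:

```python
from collections import Counter
from pathlib import Path

LOG_FILE = Path("access.log")  # hypothetical path to the raw access log
TOP_N = 10

counts = Counter()
with LOG_FILE.open(encoding="utf-8", errors="replace") as log:
    for line in log:
        # In the Apache common/combined format the client IP is the first field.
        ip = line.split(" ", 1)[0]
        counts[ip] += 1

print(f"Top {TOP_N} requesters:")
for ip, hits in counts.most_common(TOP_N):
    print(f"{hits:8d}  {ip}")
```

Any IP near the top of that list that fails the Googlebot check above is a good candidate for an .htaccess deny rule.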

Choosing between .uk and .com for a domain name

On to my next project: I wanted to start a website targeting an international audience. The idea behind this is that the more people I can get to the website, the better it is for me, and there’s no reason I can’t do this, because English is spoken in many countries, such as the UK and the USA.

The population of the UK is around 61,000,000 and that of the USA is around 300,000,000, which means the United States has roughly five times as many people as the UK. It makes sense to target the USA as well, and since the website is going to be mostly informative, that shouldn’t cause a problem, because I wouldn’t have to target services locally.

However, people in the UK tend to click on .uk domains more than any other extension. Research carried out by Nominet showed that 77% of British users are more likely to favour .uk domains over .com, which means that if I register a .com website and manage to get a first-page Google ranking, I’m still likely to miss out on clicks from people searching in the UK. In America, people favour .com, .us, .net or even .org domains, and a .uk domain is not really appealing to them. So even if a country-level domain benefits you for SEO purposes, you will not do well in terms of clickthroughs from the other market, because people are biased towards domain name extensions. I’m part of that group as well: when I search on google.co.uk and find .com websites listed, I usually skip them until I reach a .co.uk or .org.uk listing. This normally happens when I’m looking for something specific to the UK.

So I was faced with the difficult decision of choosing the best domain extension for my website. I ran a search for the number of searches by country (UK and USA) and found that the keywords received 40,500 local searches for the month of February in the UK compared to 1,600 in the USA. That is a significant difference and the conclusion becomes obvious: I’d rather focus on the UK market than the USA because of the traffic volume, which means a .co.uk domain is the way forward.

Research is an important part of SEO, and targeting your local market is made easier with a country-specific TLD. There is no point having a well-ranking website with an extremely low clickthrough rate because people assume your website is not relevant to them on account of its domain extension.

Google UK SERPs change once again

After much enjoyment of being at #4 in the SERPs for a popular phrase, today I found out that my webpage has disappeared once again. It was only a week ago that Google reinstated my rankings (actually improved them, because I was ranked #5 or #6 before), and that was after a really long wait of 4 weeks. Now everything seems back to how it was. I believe changes are coming our way and that’s why the UK SERPs are not stable for the time being. It is really upsetting not being able to find the webpage anywhere in 60 result pages of 10 results each – the ranking has simply vanished once more.

I’m not going to do anything for now and hope the rankings come back by themselves again. I shall wait, and if things do not pick up after 4-6 weeks, then I’ll start changing the things I believe could be causing the problem.

Dropped out of SERPs completely for some keywords

In an earlier post about disappearing from the SERPs completely, I thought that too many internal links in the sitewide navigation of my site were causing a penalty that removed that particular page from the SERPs without a trace. I hastily removed the link to that webpage in an attempt to regain my previous rankings, but then it occurred to me that the other pages in the sitewide navigation would have shown similar penalties if that were the case. Five days after removing the link, I re-included it with the same anchor text as before, because I was convinced that wasn’t the cause.

I then thought it must have been a temporary glitch in the system that had dropped the webpage from the rankings for its keywords. The webpage was ranking among 50 million pages without any external links (or maybe just one link from an article directory), which shows it had domain- and page-level trust from Google, and that alone was enough to rank it on the first page of search engine results. I decided to wait a couple of weeks and see if the rankings would come back.

After 3 weeks, I was beginning to get really nervous as the rankings were not coming back. I wondered whether there was an Over Optimisation Penalty (OOP) on that page or whether something else was causing the problem. I ran the webpage through a spam checker tool and there was nothing much going on there, nothing spammy at all. I validated the webpage against W3C standards and there were some errors, but nothing major enough to cause a disappearance from the SERPs. So the next thing was to look at the over-optimisation filter.

I found that the keywords were underlined, bolded and italicised as well. Now that looks too spammy, right? I concluded that I must have tripped the OOP filter and that’s why the webpage was not ranking. Mind you, the webpage in question was still indexed and still had a PageRank of 1, so the problem was only with ranking for the keywords it was optimised for. I removed the underlining, bold and italics from the content, changed my signature on an active forum to link to that webpage and made a post, hoping it would get crawled quickly and Google would re-evaluate and re-rank the webpage.

The next morning, when I checked the SERPs, I had my ranking back. Wow, I thought, things have worked out in the end – but wait a minute, was the penalty really caused by the over-optimisation of that webpage? I checked my server logs to see whether Google had fetched (crawled) the webpage after I made the changes the previous day, and I was shocked to find it hadn’t. That means none of the changes I made had any effect on getting my rankings back, which was confirmed when I viewed the cached copy of the webpage in Google.

I couldn’t leave the changes in place, because on the next crawl Google would re-evaluate them and adjust the rankings accordingly, and since the changes were not what brought my rankings back, I decided I was better off removing them and restoring the webpage to its original state. That’s what I did, and I’ll be checking the SERPs over the next few days to see whether the rankings disappear again.

The conclusion is that it took nearly 4 weeks for Google to reinstate my rankings without me having to do anything. So if you see your rankings disappear for any reason, wait a few weeks (as long as 6) before you start making changes, and you might see your rankings come back on their own. Otherwise you might end up messing everything up and losing your rankings for good!