Connecting to SQL Server 2005 from another computer at home

I have a computer (PC) in my home office and a laptop. I have installed SQL Server 2005 Developer Edition on the PC and created the main database that I’ll be using for development work. I’ve also got SQL Server installed on my laptop, but I’ll only be using SQL Server Management Studio on the laptop to keep things simple. I don’t want multiple copies of the database everywhere, as it would be hard to keep them in sync. The idea is that I will be programming from both my PC and my laptop while accessing just the database on my PC. However, I ran into problems when I tried connecting to SQL Server 2005 on my PC from my laptop: I couldn’t see the database engine in either the local or network server lists.

Here’s what you need to do if you’re having the same problem. First, make sure the instance of SQL Server you’re trying to connect to has remote connections enabled: open SQL Server Surface Area Configuration, click ‘Remote Connections’, and select ‘Local and remote connections’ along with ‘TCP/IP’ (or ‘TCP/IP and named pipes’). If that doesn’t work, temporarily disable Windows Firewall and test again. If the connection now succeeds, re-enable the firewall and create an exception to allow connections to SQL Server, opening TCP port 1433 (and UDP port 1434 if you’re connecting to a named instance, as that’s the port the SQL Server Browser service uses).
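A quick way to test whether the port itself is reachable from the laptop, before digging any further into SQL Server settings, is a plain TCP check. Here’s a minimal sketch in Python; the host name MYPC is a placeholder for your own PC’s name:

import socket
# Placeholder values: substitute your own PC's name (or IP address).
HOST = "MYPC"
PORT = 1433
# Open a raw TCP connection; if this fails, the firewall or the
# remote connections setting is blocking you before SQL Server
# authentication even comes into play.
try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"Port {PORT} on {HOST} is reachable.")
except OSError as err:
    print(f"Cannot reach {HOST}:{PORT} - check the firewall ({err}).")

If this connects while SSMS still fails, the network path is fine and the problem lies in the SQL Server configuration instead.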

If you still can’t find your database engine listed in the connection box, then type in the following:

tcp:<computer_name>,1433

This explicitly tells SSMS to connect over the TCP/IP protocol through port 1433. The database engine on my PC still doesn’t appear in the server list on my laptop, but when I type the name above into ‘Server name’, it connects successfully. I’m also connecting through SQL Server Authentication rather than Windows Authentication to keep things simple.
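The same server string works outside SSMS too. If you ever need to connect from code, here’s a rough sketch using Python’s pyodbc library with SQL Server Authentication; the machine name, database and login below are placeholders, not real values:

import pyodbc
# Placeholder connection details - substitute your own.
conn = pyodbc.connect(
    "DRIVER={SQL Server};"
    "SERVER=tcp:MYPC,1433;"    # same tcp:<computer_name>,1433 form as in SSMS
    "DATABASE=DevDB;"
    "UID=devuser;PWD=secret;"  # SQL Server Authentication login
)
cursor = conn.cursor()
cursor.execute("SELECT @@VERSION")  # quick sanity check that we're connected
print(cursor.fetchone()[0])
conn.close()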

Source control using SVN

A good programmer always starts a project with source control in mind. Not only is this good programming practice, but it also ensures that the hard work you’re putting in doesn’t go to waste. If you’re working for a reputable company, chances are they already have source control guidelines in place. If you tend to work on personal projects, however, you’re more likely to get carried away and not use any source control at all.

It’s plain laziness, nothing more! The consequences, however, can be devastating. For example, I was working on a project for a week and spent probably four days on a single PHP page, coding and testing it. When it was time to upload it to my server, I accidentally overwrote my local copy with the outdated one from the server. I went crazy trying to find a backup somewhere, but I couldn’t find anything containing my latest code.

Lesson learned. I don’t want to spend time rewriting what I’ve already coded, so now, before I start a new project, I make sure I’ve got source control in place. For the new project I’m tackling at the moment, I’m using unfuddle.com’s free service, which gives me 200MB of storage for my repository. I’ll be the only person making changes to the codebase, but after what happened, having source control matters to me. I’ve downloaded TortoiseSVN so I can easily check out from and commit to the repository through the Windows Explorer interface, and I might try the AnkhSVN plugin for Visual Studio as well.
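TortoiseSVN drives all of this from the right-click menu, but underneath it’s just the standard svn commands. As a rough illustration of the checkout/commit cycle, here’s the same workflow scripted with Python’s subprocess around the command-line client; the repository URL is a made-up placeholder:

import subprocess
# Hypothetical repository URL - substitute your own unfuddle.com repository.
REPO = "https://myaccount.unfuddle.com/svn/myaccount_myproject/"
def svn(*args):
    """Run the svn command-line client, raising if the command fails."""
    subprocess.run(["svn", *args], check=True)
svn("checkout", REPO, "myproject")               # get a working copy
with open("myproject/new_page.php", "w") as f:   # create a file to version
    f.write("<?php // new page ?>\n")
svn("add", "myproject/new_page.php")             # put it under version control
svn("commit", "myproject", "-m", "Add new page") # check it in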

So far everything is going well, and the next thing I need to sort out is learning Fluent NHibernate before the project can kick off.

Website hacked, now what?

When you have a website, there’s always the possibility that someone will try to hack into it sooner or later. If you run a successful site, jealous people will want to bring it down, but that’s not the only reason: your website can also be hacked if there’s a security hole. Although I try my best to stay on top of the security of the sites I operate, one of my websites got hacked on 30 April 2010. The purpose of the attack was to plant links on my website.

It might seem that the attack was not catastrophic, since only links were uploaded, but look at it from my point of view and you’ll see how bad it was. First of all, the hacker was able to delete files from my server and create new directories and files, all through a script he managed to upload to my site. He deleted only one file, but that was enough to bring down pretty much the whole website, because everything is controlled by the .htaccess file, which is what he deleted. The result was 404 (page not found) errors, which was bad for my traffic (it plummeted to nearly zero), bad for the user experience, and bad in the eyes of the almighty Google. Google crawls my site every day without fail, and it found so many 404s that day that I was scared it was going to remove all those pages from its index. Fortunately, that didn’t happen.

The hacker also created a few directories on my site where he uploaded pages full of links. My site had nothing to do with recipes, but the hacker uploaded recipe links and pinged about a hundred servers, including Google, Yahoo and other search engines. Most of the pinged services followed the request and started crawling the recipe pages on my site. This is the worst kind of attack because it marks your site as a spam site. A few days after the attack, Google Webmaster Tools had a message for me stating that the crawled pages with recipe links looked spammy and would be removed from the index, which was absolutely fine with me. I was relieved they didn’t penalise me in any way for having these spammy links on my site.

To make things worse, I was on holiday when my website got hacked. Fortunately, I like checking my website stats every day, and when I saw a drop in traffic, I first thought it was my hosting company’s fault (that the server was down). I was really annoyed at them, since my site had never been down for such a long time before, but that turned out not to be the case. My second thought was that Google had stopped sending traffic my way (some penalty, a loss of organic rankings, etc.). While investigating, though, I saw that my website was returning 404 pages for nearly every URL on the site. That’s when it clicked: I looked for the cause and realised my .htaccess file was no longer there. After getting the hosting company to restore the file from a backup, the website was operating fine again. I then changed all my passwords and analysed what had gone wrong.
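That analysis would have been much quicker with a baseline to compare against. One cheap defence is a file-integrity check: hash everything under the web root on a known-good day, then diff against that baseline regularly. A minimal sketch, assuming you can run a script over the site’s files (the paths below are placeholders):

import hashlib, json, os
WEB_ROOT = "/var/www/mysite"   # placeholder path to the site's files
MANIFEST = "manifest.json"     # baseline recorded on a known-good day
def snapshot(root):
    """Map every file under root to the SHA-256 hash of its contents."""
    hashes = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                hashes[path] = hashlib.sha256(f.read()).hexdigest()
    return hashes
if not os.path.exists(MANIFEST):
    with open(MANIFEST, "w") as f:
        json.dump(snapshot(WEB_ROOT), f)  # first run: record the baseline
    raise SystemExit("Baseline recorded; run again later to compare.")
current = snapshot(WEB_ROOT)
with open(MANIFEST) as f:
    baseline = json.load(f)
# New files (like the hacker's recipe directories) and deleted files
# (like the missing .htaccess) both show up as simple set differences.
for path in sorted(set(current) - set(baseline)):
    print("NEW:", path)
for path in sorted(set(baseline) - set(current)):
    print("DELETED:", path)
for path in sorted(set(current) & set(baseline)):
    if current[path] != baseline[path]:
        print("MODIFIED:", path)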

Files and directories had been created, which could only mean one thing: my password had been compromised. I don’t know how it happened, and it could have been worse, but fortunately I was able to get the site up and running the following day. If I hadn’t noticed the problem just after it happened, I would have lost all traffic for at least a week, along with the income the site makes, and, most importantly, it would have taken the site a lot longer to recover its rankings afterwards.
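The takeaway is that you have to notice this sort of thing within hours, not days. Even a tiny script that fetches a few key pages and reports anything that isn’t a 200 would have flagged the missing .htaccess immediately. A minimal sketch with placeholder URLs:

from urllib.request import urlopen
from urllib.error import HTTPError, URLError
# Placeholder URLs - list a handful of pages that should always resolve.
URLS = [
    "http://www.example.com/",
    "http://www.example.com/popular-page/",
    "http://www.example.com/contact/",
]
for url in URLS:
    try:
        with urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status)
    except HTTPError as err:
        print(url, "-> HTTP", err.code)  # a 404 here means trouble
    except URLError as err:
        print(url, "-> unreachable:", err.reason)

Run from a scheduled task or cron job, something this simple turns a week-long outage into an alert on the same day.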