Integrating Paypal button with IPN and PDT in ASP.net or MVC website

I was working on a project where a subscription is needed to access parts of the website (drivingtest.cleverdodo.mu). The idea was to have a Paypal button which would allow users to pay for a 1 month subscription at a time and give them instant access. The standard button (the one you get from Paypal’s website) allows you to receive money, but you’d have to manually check the payment and update the customer’s record to grant him access. If you want your system to be notified instantly, you’ll have to dive into Instant Payment Notification (IPN) and Payment Data Transfer (PDT).

Before I show you the C# code for Paypal integration, you’ll need to understand what IPN and PDT are. IPN is a message service: when someone has made a payment to you, Paypal sends a message to your listener. It is the most reliable way to know whether you have truly received money, because the message will be delivered at some point even if the user does not return to your website after completing the payment, loses his internet connection, or the notification is delayed by congestion. However, you’ll need a listener to capture the message. PDT, on the other hand, happens when the user either clicks on the “Go back to merchant website” link (your return URL) or is redirected to your website immediately after purchase. If for some reason the customer does not return to your website, you will receive no notification of a successful payment.

Therefore it is important that you implement both IPN and PDT, but the good news is that the code for the two is not that different after all.
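To make the IPN handshake concrete before the full C# later in this post, here is a minimal sketch in Python (illustrative only — the function names are mine, but the `cmd=_notify-validate` echo and the VERIFIED/INVALID reply are how Paypal’s IPN verification works): Paypal POSTs a notification to your listener, you echo the exact body back with the validate command prepended, and Paypal answers VERIFIED or INVALID.

```python
# Sketch of the IPN verification handshake (illustrative; the real listener
# in this post is written in C#). PayPal POSTs a notification to your
# listener; you POST the exact same body back to .../cgi-bin/webscr with
# cmd=_notify-validate prepended, and PayPal replies VERIFIED or INVALID.

PAYPAL_SANDBOX = "https://www.sandbox.paypal.com/cgi-bin/webscr"

def build_verification_body(raw_ipn_body: str) -> str:
    """Echo the notification back exactly as received, with the validate command."""
    return "cmd=_notify-validate&" + raw_ipn_body

def is_verified(paypal_reply: str) -> bool:
    """PayPal answers with the literal string VERIFIED for a genuine notification."""
    return paypal_reply.strip() == "VERIFIED"
```

Only after seeing VERIFIED (and checking the amount, currency and receiver email yourself) should you treat the payment as real.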

Paypal Sandbox and the Developer account

You do not want to be testing against your live business/personal account. Therefore you’ll need to register for a developer account first to get access to the Paypal Sandbox, which allows you to test to your heart’s content.

Developer : https://developer.paypal.com/

Once registered, just create a test business account and a test personal account so that you can check that when you buy from the personal account, the money is transferred to your business account.

You’ll also need to sign in with your test business account at https://www.sandbox.paypal.com and go to Profile -> Website Payment Preferences and turn Auto Return and Payment Data Transfer on so that you get an Identity Token to use for PDT. I had trouble finding my token because I was looking on the Developer site rather than the Sandbox one.

How it works in my project

When a user is logged in, he’s able to see a Paypal button in his dashboard (my account area). When he clicks the button, it takes him to an intermediate page which contains a form with all the necessary values to send off to Paypal. When this page is loaded, a record is created in the order table and the order id is inserted as a hidden form field. I need this value back from Paypal to update the corresponding user subscription record.

So this is an MVC view, but the core principle remains the same for any other programming language.


@{
 ViewBag.Title = "Paypal Checkout";
}

@section TestHeadContent {
<meta name="robots" content="noindex" />
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.6.1/jquery.min.js"></script>
<script src="/scripts/jquery.countdown.min.js"></script>
<script type="text/javascript">
    $(document).ready(function () {
        $('#paypal-timer').countdown({ until: '+2s', format: 'MS', layout: '{sn}', onExpiry: SubmitTheForm });
    });

    function SubmitTheForm() {
        $('#paypalform').submit();
    }
</script>
}

<div id="area">
<div style="min-height:400px;">
<strong>Redirecting you to the Paypal website in <span id="paypal-timer"></span> second(s)...</strong>
<form name="paypalform" id="paypalform" action="@ViewData["PaypalUrl"]" method="post" enctype="application/x-www-form-urlencoded">
 <input type="hidden" name="cmd" value="_xclick" />
 <input type="hidden" name="business" value="@System.Configuration.ConfigurationManager.AppSettings["Business"]" />
 <input type="hidden" name="amount" value="4" />
 <input type="hidden" name="currency_code" value="USD" />
 <input type="hidden" name="notify_url" value="@String.Concat(System.Configuration.ConfigurationManager.AppSettings["WebsiteUrl"], "paypal/ipn")" />
 <input type="hidden" name="cancel_return" value="@String.Concat(System.Configuration.ConfigurationManager.AppSettings["WebsiteUrl"], "paypal/cancel")" />
 <input type="hidden" name="return" value="@String.Concat(System.Configuration.ConfigurationManager.AppSettings["WebsiteUrl"], "paypal/pdt")" />
 <input type="hidden" name="item_name" value="1 Month Access" />
 <input type="hidden" name="custom" value="@ViewData["OrderId"]" />
</form>
</div>
</div>

The form is populated with values from AppSettings: Business is your business email address and notify_url is the URL of the IPN listener. The form action is set to either the sandbox or live Paypal URL depending on a setting in the web.config file. After 2 seconds, the form is automatically submitted.

Note: you cannot issue the POST from the code-behind (i.e. programmatically), because the user’s browser has to end up on Paypal’s site to enter his details; if you posted server-side and then redirected the user, the POST data would be lost. That is why we use a form to post the values and send the user to Paypal.

Code for IPN

Usually this code would go into your business logic layer (thin controllers, fat models), but it lives in the controller here just to simplify things.


public ActionResult IPN()
{
    log.Info("IPN listener invoked");

    try
    {
        var formVals = new Dictionary<string, string>();
        formVals.Add("cmd", "_notify-validate");

        string response = GetPayPalResponse(formVals);

        if (response == "VERIFIED")
        {
            string transactionId = Request["txn_id"];
            string sAmountPaid = Request["mc_gross"];
            string orderId = Request["custom"];

            // validate the order
            Decimal amountPaid = 0;
            Decimal.TryParse(sAmountPaid, out amountPaid);

            Order order = new Gis.Quiz.Core.Repository.Repository<Order>(_session).GetById(Convert.ToInt32(orderId));

            // check these first
            bool verified = true;

            string businessEmail = HttpUtility.UrlDecode(Request["business"]);
            if (String.Compare(businessEmail, System.Configuration.ConfigurationManager.AppSettings["Business"], true) != 0)
                verified = false;

            string currencyCode = HttpUtility.UrlDecode(Request["mc_currency"]);
            if (String.Compare(currencyCode, "USD", true) != 0)
                verified = false;

            string paymentStatus = HttpUtility.UrlDecode(Request["payment_status"]);
            if (String.Compare(paymentStatus, "Completed", true) != 0)
                verified = false;

            if (!AmountPaidIsValid(order, amountPaid))
                verified = false;

            log.Info("Business : " + businessEmail);
            log.Info("currency : " + currencyCode);
            log.Info("payment status : " + paymentStatus);
            log.Info("amount valid : " + AmountPaidIsValid(order, amountPaid).ToString());

            // process the transaction
            if (verified)
            {
                log.Info("Trying to process ipn transaction...");

                // check that we have not already processed this transaction
                if (String.Compare(order.TransactionId, transactionId, true) != 0)
                {
                    order.Firstname = HttpUtility.UrlDecode(Request["first_name"]);
                    order.Lastname = HttpUtility.UrlDecode(Request["last_name"]);
                    order.Email = HttpUtility.UrlDecode(Request["payer_email"]);
                    order.Street = HttpUtility.UrlDecode(Request["address_street"]);
                    order.City = HttpUtility.UrlDecode(Request["address_city"]);
                    order.State = HttpUtility.UrlDecode(Request["address_state"]);
                    order.Country = HttpUtility.UrlDecode(Request["address_country"]);
                    order.Zip = HttpUtility.UrlDecode(Request["address_zip"]);
                    order.TransactionId = transactionId;
                    order.StatusId = 1;
                    order.UpdateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);

                    UserSubscription us = new Gis.Quiz.Core.Repository.Repository<UserSubscription>(_session).GetById(order.UserId);
                    if (us == null)
                    {
                        // create new subscription record
                        us = new UserSubscription();
                        us.Id = order.UserId;
                        us.CreateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);
                        us.UpdateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);
                        us.SubscriptionEndDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now.AddDays(30));
                    }
                    else
                    {
                        // update subscription record
                        us.UpdateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);
                        us.SubscriptionEndDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now.AddDays(30));
                    }

                    using (ITransaction tx = _session.BeginTransaction())
                    {
                        _session.Save(order); // save order
                        _session.Save(us);    // save subscription

                        tx.Commit();
                    }
                }
            }
            else
            {
                // let it fail - this is the IPN listener so there is no viewer
                log.Info("ipn is not valid...");
            }
        }
        else
            log.Info("ipn not verified...");
    }
    catch (Exception ex)
    {
        log.Error("Error caught in IPN : " + ex.ToString());
    }

    return View();
}

You need to check whether you have already processed the transaction, because you could end up doing it twice: Paypal can notify you of the same payment through both PDT and IPN.
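That duplicate check is just an idempotency guard keyed on the Paypal transaction id. A tiny sketch of the idea (Python, illustrative; the C# above does the same thing with a case-insensitive String.Compare):

```python
def already_processed(stored_txn_id, incoming_txn_id):
    """True if this Paypal transaction id was already recorded on the order.

    Both IPN and PDT can deliver the same transaction, so whichever channel
    arrives second must be ignored. The comparison is case-insensitive,
    mirroring String.Compare(..., true) in the C# above.
    """
    if stored_txn_id is None or incoming_txn_id is None:
        return False
    return stored_txn_id.lower() == incoming_txn_id.lower()
```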

The PDT Code


public ActionResult PDT()
{
    log.Info("PDT called...");

    try
    {
        string transactionId = Request.QueryString["tx"];
        string sAmountPaid = Request.QueryString["amt"];
        string orderId = Request.QueryString["cm"];

        Dictionary<string, string> formVals = new Dictionary<string, string>();
        formVals.Add("cmd", "_notify-synch");
        formVals.Add("at", System.Configuration.ConfigurationManager.AppSettings["PayPalPDTToken"]);
        formVals.Add("tx", transactionId);

        string response = GetPayPalResponse(formVals);

        if (response.StartsWith("SUCCESS"))
        {
            // validate the order
            Decimal amountPaid = 0;
            Decimal.TryParse(sAmountPaid, out amountPaid);

            Order order = new Gis.Quiz.Core.Repository.Repository<Order>(_session).GetById(Convert.ToInt32(orderId));

            // check the amount paid
            if (AmountPaidIsValid(order, amountPaid))
            {
                // check that we have not already processed this transaction
                if (String.Compare(order.TransactionId, transactionId, true) != 0)
                {
                    order.Firstname = GetPDTValue(response, "first_name");
                    order.Lastname = GetPDTValue(response, "last_name");
                    order.Email = GetPDTValue(response, "payer_email");
                    order.Street = GetPDTValue(response, "address_street");
                    order.City = GetPDTValue(response, "address_city");
                    order.State = GetPDTValue(response, "address_state");
                    order.Country = GetPDTValue(response, "address_country");
                    order.Zip = GetPDTValue(response, "address_zip");
                    order.TransactionId = transactionId;
                    order.StatusId = 1;
                    order.UpdateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);

                    UserSubscription us = new Gis.Quiz.Core.Repository.Repository<UserSubscription>(_session).GetById(order.UserId);
                    if (us == null)
                    {
                        // create new subscription record
                        us = new UserSubscription();
                        us.Id = order.UserId;
                        us.CreateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);
                        us.UpdateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);
                        us.SubscriptionEndDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now.AddDays(30));
                    }
                    else
                    {
                        // update subscription record
                        us.UpdateDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now);
                        us.SubscriptionEndDate = Gis.Quiz.Core.Utilities.Helper.FormatNHibernateDateTime(DateTime.Now.AddDays(30));
                    }

                    using (ITransaction tx = _session.BeginTransaction())
                    {
                        _session.Save(order); // save order
                        _session.Save(us);    // save subscription

                        tx.Commit();
                    }

                    ViewData["fb"] = "Thank you for your payment. Your transaction has been completed and a receipt for your purchase has been emailed to you. You can now access the paid members area of our website by <a href=\"/myaccount\">clicking here</a>.";
                }
                else
                {
                    ViewData["fb"] = "This order has already been processed.";
                }
            }
            else
            {
                // Payment amount is off; this can happen if you have a gift certificate at PayPal, be careful of this!
                ViewData["fb"] = "Payment amount received from paypal does not match order total.";
            }
        }
        else
        {
            ViewData["fb"] = "Your payment was not successful with PayPal";
        }
    }
    catch (Exception ex)
    {
        log.Error("Error caught in PDT : " + ex.ToString());
    }

    return View();
}

PayPalPDTToken is the Identity Token you got in an earlier step above.
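For reference, the PDT exchange is just another POST to the same webscr endpoint: you send cmd=_notify-synch together with your identity token (at) and the transaction id (tx), and Paypal replies with a SUCCESS line followed by URL-encoded key=value lines. A Python sketch of building that request body and parsing the reply (illustrative only; the function names are mine):

```python
from urllib.parse import unquote_plus, urlencode

def build_pdt_body(identity_token: str, txn_id: str) -> str:
    """Request body for the PDT synch call to .../cgi-bin/webscr."""
    return urlencode({"cmd": "_notify-synch", "at": identity_token, "tx": txn_id})

def parse_pdt_reply(reply: str) -> dict:
    """Parse a PDT reply (a SUCCESS line, then key=value lines) into a dict.

    Returns an empty dict if the first line is not SUCCESS (e.g. FAIL).
    """
    lines = reply.strip().splitlines()
    if not lines or lines[0] != "SUCCESS":
        return {}
    fields = {}
    for line in lines[1:]:
        if "=" in line:
            key, _, value = line.partition("=")
            fields[key] = unquote_plus(value)  # values are URL-encoded
    return fields
```

This is the same splitting and URL-decoding the C# GetPDTValue helper below performs one key at a time.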

The helper methods


/// <summary>
/// Utility method for handling PayPal responses
/// </summary>
string GetPayPalResponse(Dictionary<string, string> formVals)
{
    bool useSandbox = Convert.ToBoolean(System.Configuration.ConfigurationManager.AppSettings["IsLocal"]);

    string paypalUrl = useSandbox ? "https://www.sandbox.paypal.com/cgi-bin/webscr"
                                  : "https://www.paypal.com/cgi-bin/webscr";

    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(paypalUrl);

    // Set values for the request back
    req.Method = "POST";
    req.ContentType = "application/x-www-form-urlencoded";

    // Start from the raw body of the incoming request (the full notification
    // for IPN, empty for PDT) and append our own values to it
    byte[] param = Request.BinaryRead(Request.ContentLength);
    string strRequest = Encoding.ASCII.GetString(param);

    StringBuilder sb = new StringBuilder();
    sb.Append(strRequest);

    foreach (string key in formVals.Keys)
    {
        sb.AppendFormat("&{0}={1}", key, formVals[key]);
    }
    strRequest = sb.ToString(); // assign, not +=, which would duplicate the original body
    req.ContentLength = strRequest.Length;

    // for proxy
    //WebProxy proxy = new WebProxy(new Uri("http://proxyurl:port/"));
    //req.Proxy = proxy;

    // Send the request to PayPal and get the response
    string response = "";
    using (StreamWriter streamOut = new StreamWriter(req.GetRequestStream(), System.Text.Encoding.ASCII))
    {
        streamOut.Write(strRequest);
        streamOut.Close();
        using (StreamReader streamIn = new StreamReader(req.GetResponse().GetResponseStream()))
        {
            response = streamIn.ReadToEnd();
        }
    }

    return response;
}

bool AmountPaidIsValid(Order order, decimal amountPaid)
{
    bool result = true;

    if (order != null)
    {
        if (order.Total > amountPaid)
        {
            // amount received is less than the order total
            result = false;
        }
    }
    else
    {
        // invalid order id passed to PDT/IPN
    }
    return result;
}

string GetPDTValue(string pdt, string key)
{
    // the PDT response is a SUCCESS line followed by key=value lines;
    // values are urlencoded, not htmlencoded
    foreach (string line in pdt.Split('\n'))
    {
        string[] bits = line.Split(new[] { '=' }, 2);
        if (bits.Length > 1 && bits[0].Equals(key, StringComparison.InvariantCultureIgnoreCase))
            return HttpUtility.UrlDecode(bits[1]);
    }
    return ""; // key not found
}

Important things to remember

Here are some things worth noting:

  • You need to be logged into your sandbox account in the same browser from which you’re sending the request to Paypal to make a successful transaction in testing mode.
  • You can only test IPN on a URL that’s publicly accessible. So localhost will not work.

Paypal integration is not complicated, but I had to read a lot on different forums/blogs and StackOverflow, and I’m grateful to Kona.Web, from which I’ve borrowed a lot of code.

Displaying your recent posts in Tumblr

Tumblr does not provide an easy way to show your latest posts on your blog. The infinite scrolling is nice, but you want your visitors to be able to see, say, the last 5 posts you’ve made so they can jump straight to a specific post instead of having to read through all posts until they reach something worth reading. So to achieve this functionality, I’ve used jQuery/JavaScript.

There are 3 steps to accomplish this:

  1. If you’re not already using the jQuery library, you need to include it in your theme.
  2. Insert an html ul tag where you want the posts to render.
  3. Add the jQuery code to your theme.

Some themes already come with jQuery, so there’s no need to add it again. However, the library needs to appear before the jQuery code for it to work. Search for the word jquery in your theme’s HTML and if you can’t find any instance of it, add it in the head section, that is within these tags <head></head>. Then right after it, place the jQuery code. So you will have the following:


<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.0/jquery.min.js"></script>

<script type="text/javascript">
$(function() {
    var url = 'http://staff.tumblr.com/api/read?start=0&num=5';
    var $list = $('#recent-posts');
    $.ajax({
        url: url,
        type: 'GET',
        dataType: 'xml',
        success: function(data) {
            var posts = $(data).find('post');
            posts.each(function() {
                var post = $(this);
                var link = post.attr('url-with-slug');
                var title = post.children('regular-title').text();
                if (link && title) {
                    $list.append('<li><a href="' + link + '">' + title + '</a></li>');
                }
            });
        }
    });
});
</script>

Replace staff.tumblr.com with the hostname of your blog, and if you want to change how many posts are displayed, change the number 5 at the end of the URL, e.g. http://staff.tumblr.com/api/read?start=0&num=20 will retrieve the last 20 blog posts.

Now you need to include the following somewhere in your theme where you want the posts to be displayed:


<ul id="recent-posts"></ul>

You might want to place that in the sidebar so that it’s prominent. And that’s it really.

Any problems, let me know 🙂

Educating Mauritians to shop online

While the Government of Mauritius has managed to portray the country as a Cyber Island, the truth is far from that. A lot of development has been made in recent years, and looking at the infrastructure, you’d surely be convinced the government services are very well advanced and organised. Contrary to popular belief, a lot of paperwork is involved in nearly everything you do in this country. Go to the police station to file a case and you’ll be asked to sign a paper book in which the officer has handwritten your statement. The hospitals, the births & deaths registrations, the pension schemes and all other services seem to be run on paper. The concept of a central data network seems to be missing. Too few services are available through the Internet to mention, yet they talk about being the leader in ICT amongst the African countries. Of course, the one-eyed man is king amongst the blind.

With all that said, you’d surely wonder how often you’d need government services such as applying for a permit or making an application for your driving licence. You can live knowing that only once in a while you’d have to experience the frustration of all those manual tasks. However, as we move towards a more sophisticated lifestyle where technology surrounds us, surely we’d want all the facilities that come along with it, and one thing that’s bound to make life easier is online shopping. Who’s got time outside their busy schedule (work or other responsibilities) to actually go shopping for the things they need nowadays?

It is unfortunate that not many e-commerce websites are available in Mauritius yet. However, it’s the attitude of the people which is the main problem. No one likes change, and when you’ve been used to buying your things in person and paying by cash, you’d be apprehensive of any other way of shopping. Take for example the Clever Dodo Shop, which allows people to buy things through the internet and have their orders shipped to them. The idea is very nice, but people use the website as an online catalogue instead. They are happy that they can see the product photos and descriptions and get their questions about the products answered quickly, but when it comes to actually making the purchase online, they seem to get scared. They want a phone number instead and prefer to meet in person, hand over the cash and collect their items.

To an extent, it seems to be a backwards mentality, as in they don’t want to evolve, but at the same time I think they need to be educated about buying online. Putting your credit card details online is a risk, but this has been mitigated by multiple prevention schemes and Mauritians need to be aware of that. There are countermeasures such as Verified by Visa, Paypal (if you do not want to enter your credit/debit card details on every website where you make a purchase), or the password-based transactions offered by MCB (Mauritius Commercial Bank). You also need to look at the credibility of the website before entering into a contract with them, so checking the terms & conditions, delivery information and other details relevant to you is a must. There’s also the padlock symbol in the web browser, displayed when a site uses SSL (Secure Sockets Layer) to encrypt whatever you’re inputting on the website, which gives you reassurance that nobody will be able to intercept your data.

I think the Mauritian government should promote buying online and raise awareness in that field because that will benefit the economy of the country as well as making shopping easier for the residents. If the attitude doesn’t change in the coming years, we’re sure to have a country stuck in the enhanced barter system.

Installing OpenCart on Windows 2008 server

After deciding not to go ahead with NopCommerce, I was left with 2 choices – either I host the eCommerce website on a Linux server or get it to run on my Windows server. I thought it would be a waste of money to buy separate hosting and therefore I tried to install OpenCart on my Windows 2008 box. Although I managed to install it successfully, there were a few problems I encountered along the way.

The Basics First

OpenCart requires a MySQL database and PHP to work. If you don’t have MySQL installed on your Windows server, just download the latest version of MySQL Server and run the wizard. PHP was already installed in IIS 7.5 for me as a FastCGI extension. I checked that it was working properly by installing WordPress (the popular blogging platform) on the Windows server; it worked flawlessly, and the SEO URLs in the .htaccess file were automatically translated to their Web.Config equivalent (more on that later).

The Install Process

I downloaded the zip file for OpenCart directly on my server and extracted the files there instead of using FTP, as this was faster. If the domain for your eCommerce shop has not been added to your DNS yet, now might be the time to do it. Then configure the domain in IIS and copy the files from OpenCart’s “upload” folder to the physical directory you want your website to run from in IIS. Now all you have to do is visit your shop’s URL (e.g. myshop.com) and it will direct you to the installation wizard. If you don’t see the installation wizard, your PHP configuration in IIS is not done properly.

You will need to tick the box to accept OpenCart’s terms, and the following screen will show you what dependencies are needed and which files/folders need to be writeable. I found that I was missing a PHP extension (mcrypt) and installed it before proceeding. I refreshed the page after mcrypt was installed and was given the green light to proceed. The third and final screen is a form where you enter database information (db name, user and password) and admin details (email/password). Once you hit submit, the installation process begins, and at the end it tells you to delete the “install” folder as soon as possible, otherwise someone might trigger the installation again and overwrite your files and database.

However, although it seemed that the installation went through fine, when I tried going to the shop homepage I was redirected back to the installation screen. This was because the files which were shown as writeable were actually not writeable. The config.php file was zero bytes, which means no configuration was written to it, and therefore the index.php file redirects you to the installation page.

To overcome this problem, you need to give read/write/modify/execute access to the IUSR account and to IIS AppPool\YourAppPoolName for the files/folders mentioned during the installation wizard. I gave the parent directory (the folder where I uploaded all the OpenCart files) these permissions first to see whether it worked. Once it was working, I deleted the directory, recreated it and assigned the privileges only to the specific files/folders that needed them. It was good to see the open source ecommerce software working properly on a Windows machine.

Configuring Friendly Urls for SEO

On a Windows box, .htaccess does not have any meaning. You need to translate the rules into something Windows understands, and for that you will need the URL Rewrite module and the Web.Config file.


<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="OpenCart" patternSyntax="Wildcard">
          <match url="*"/>
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true"/>
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true"/>
          </conditions>
          <action type="Rewrite" url="index.php"/>
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

The above web.config is the one generated for WordPress to use friendly URLs. You just need to edit the rules to give OpenCart search engine friendly URLs. Don’t forget to enable “Use SEO URL’s” in the OpenCart administrative area (System -> Settings -> Edit -> Server).

Say you wanted products to be available at myshop.com/products/product-1. You would need a rule like this:


<rule name="Products" stopProcessing="true">
  <match url="^products/[\w-]+-(\d+)$"/>
  <action type="Rewrite" url="index.php?route=product/product&amp;product_id={R:1}"/>
</rule>

UPDATE
Scrap what I said above regarding the configuration of seo links. You can just put the following in your Web.Config file:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="opencart" stopProcessing="true">
          <match url="^([^?]*)" />
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true"/>
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true"/>
            <add input="{REQUEST_URI}" pattern=".*\.(ico|gif|jpg|jpeg|png|js|css)" negate="true"/>
          </conditions>
          <action type="Rewrite" url="index.php?_route_={R:1}" appendQueryString="true"/>
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Then, once you have set OpenCart to use SEO links, products, categories and information pages are displayed with friendly URLs. These are exactly the rules OpenCart ships in its .htaccess file.

Creating an event in MySql

If you want an action to happen at a specific time or at regular intervals, events in MySQL are the way to go. Triggers only work as a reaction to something happening to a table in your database, e.g. a row being inserted. In my case, I needed to clear off session data and I decided to create an event as follows:


CREATE EVENT SessionRemover
 ON SCHEDULE
 EVERY 5 MINUTE
 DO
 DELETE FROM Session WHERE Expires < NOW();

The statement failed for me because I did not have the EVENT privilege. So I had to grant that to the user I was using to connect to the database:


GRANT EVENT ON *.* TO someone@"%" IDENTIFIED BY 'password' WITH GRANT OPTION;

The event was created but it never ran. When I investigated the problem, it turned out that the event scheduler was not running. So I had to run this sql as well:

SET GLOBAL event_scheduler = ON;

Everything worked as expected until the server got restarted and my session count went up to 8000! I checked whether the event scheduler was running with this SQL:

SHOW PROCESSLIST

But I did not find a user with the name ‘event_scheduler’ in the list. To make the event scheduler run even when the server restarts, you have to add it to the configuration file. I’m using Windows 2008 and I had to add the following line to the ini file located at YourMySqlInstallationDir/MySql Server 5.1/my.ini:

event_scheduler=ON

I couldn’t find any existing entry for this setting, so I added it to the end of the my.ini file.

Why I chose Tumblr over Posterous

For a site that I was working on, I wanted a blogging platform that was easy to use and had a social aspect to it. For example this blog, which is powered by WordPress, remains disconnected from the rest of the world because I don’t promote my posts at all. The only visitors that I get here are people who have been referred from search engines, Google most importantly. This means I’m at the whims of Google for people to know what I’ve written.

Tumblr on the other hand has a very big user base who like to engage with other tumblers. If this blog was on Tumblr and I was using specific tags to describe my posts, I’m sure that I would get a good following and engagement from other people.

I decided to opt for a micro blogging platform for the following reasons:

Easier to get fans/followers and for people to engage with your blog

By joining a network with millions of users, it is easier to attract people interested in your content and turn them into loyal readers of your blog. Take Facebook users, for example: they are very willing to like and comment as long as they stay on Facebook.

No need to worry about disk space (can upload any amount of images at will)

Using a self-hosted CMS means that you need to monitor how much disk space you’re using. I usually optimise images before putting them on the web for 2 reasons – one, to reduce their size so that they do not occupy more disk space than they should on my server, and two, because that decreases page load time and increases page speed. With an account on Tumblr or Posterous, I don’t need to worry about these things (although optimising is still recommended). I can now put as many images in blog posts as I want, no matter how silly they are.

Can write as little as I want without worrying about SEO

As an SEO person, you know that you need to produce content of great quality to rank well in search engines. It makes sense, but this type of content is difficult to produce and I’ve noticed that I’ve become less productive as a result. I can no longer write a quick post or a post with just an image. People say that an image speaks a thousand words, but if you want organic traffic, you need to write those thousand words along with the image to describe it.

However, looking at Tumblr for instance, it seems that it’s not necessary for a post to be of a certain length (a minimum number of words, etc.). You can have a short phrase, an image or anything else and still get lots of comments, likes and so on, as long as your post is seen by many people and is appealing. It’s all to do with the fans that you have. Some people can write a post of 200 words and get 50 comments on it. That solves the problem of SEO, right? The UGC, user generated content?

Tumblr is better because

  • there are more users than on Posterous
  • the users like to engage, like your content and reblog your posts (maybe because they are the younger generation, whereas Posterous users are oldies; above 55, I think, the majority)
  • there are more themes to choose from
  • it’s easier to use (from signing up to blogging, Tumblr is much, much simpler than Posterous)

The only thing I don’t like about Tumblr is that you cannot get contributors to your main site. Oh well, one vote down against so many thumbs up…

Why my rankings dropped because of the Google Panda update

This website is a personal blogging space for me where I write mostly on SEO & Programming. I also tend to write about things which I’d like to remember, whether personal things or something that I think would be useful (reference) in the future. What I’m trying to say is that I shouldn’t have to worry whether people come on my site to read the stuff I blog about or whether Google sends me traffic because it thinks I have good content that addresses the queries of other users. However all that changed when they released the Google Panda algorithm. Not exactly since the beginning which was April 2011 but sometime in March 2012.

I was getting a steady flow of visitors to my site because the number of posts on the site kept increasing, and I’ve been doing this for at least the last 5 years. I’m in no way a regular blogger, but I tend to write especially when I’ve struggled to find a solution to a programming problem, so that I can reference it later and help other people out there. It’s a good feeling when you see people coming to your site; it makes you feel appreciated (even better when they leave nice comments!).

As I said before, I lost rankings for a few keywords (some blog posts have stopped ranking where they were) because of the penalty imposed by the Panda update. I know it’s that particular algorithmic update, not Penguin, because I’ve cross-referenced the dates. Traffic to my website is now one-fifth of what it used to be. This affects just my pride, to be honest, because I do not make any money from this website (I’ve kept it ads-free since its birth). It just makes you feel sad that your once prized content has now lost all its merit.

I’ve never bought any links for this site (why would I anyway, it’s not like boosting my SERPs will make me richer) and the backlinks to it are purely editorial. Some people have found my website (blog posts) helpful and decided to cite me on their site (or forum). I write from my personal experience, so it cannot be duplicate/rehashed content. It’s all unique and original!

So why was I penalised?

After reviewing my blog, I found lots of thin content on the site. Sometimes I would write a small paragraph in a blog post because that’s all that was needed. I am not writing as a researcher or to provide complete and accurate information for someone else. I am writing for myself, for my own use primarily, and even a short note is enough to describe something fully for my needs.

However, this is not how Panda works. It wants the best possible information on a given topic. Obviously a couple of lines is not enough, and you cannot judge quality from that. From what I gather, you need to exceed a threshold of low-quality pages relative to quality ones before you get trapped in the Panda penalty.

There are two reasons why I was not penalised as soon as Panda came out. The first theory is that I did not have enough thin pages back then to tip the balance. As more short blog posts were introduced, the overall quality of my blog decreased and thus I got pandalised. The second reason is the crawl rate of my site. As I said before, I do not blog often, so it could be that Panda caught me out much later because Google does not crawl my blog often.

My personal opinion about it all

On one hand, the whole concept of Panda makes sense because it takes the weeds out of the garden. You will get fewer sites with inadequate results when you search on Google, and that’s a good thing. On the other hand, it makes me frustrated that I have to change the way I blog if I want to be on the good side of Google.

Of course, I don’t need to please Google; it’s my website and I can do whatever I want with it. Nevertheless, blogging solely for your own satisfaction is boring because you don’t get the interaction with other people. You do not want to become a lonely person on a web of billions of users. Therefore I’ve decided to do something about it.

Fortunately, there was an easy option for me. I could mark all thin posts as “noindex” so that Google would ignore them. I did not want to unpublish the posts; marking them noindex struck a nice balance that suits me. All I had to do was go through all the posts on the site and mark the ones with a very low word count as noindex.
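If your blogging platform doesn’t have a built-in noindex option, the standard way to achieve the same thing is a robots meta tag in the page’s head (the “follow” value is optional; it just tells crawlers they may still follow the links on the page):

```html
<!-- Tell search engines not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">
```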

I’m quite certain that the changes will bring about a positive change next time Panda runs (well actually, after Google has recrawled all of my pages and Panda is updated). I just need to sit back and wait now…zZzZzZzzz

UPDATE : So Panda has updated at least 3 times since I wrote this blog post, but there has been no positive effect on my site. The Panda algo is a much bigger beast than I originally thought, and you need to significantly improve the content of your webpages to get out of the penalty, which is not something I have time for (especially for this personal website).

Using PHPList to send bulk emails for newsletter

Instead of developing a newsletter system, I thought it would be easier to find a ready-made solution which I could use with minimal effort. PHPList was the first name that came up for this sort of functionality. However, I was not very pleased to have every email I send carry an image of PHPList at the bottom; not because I did not want to give credit to the developers, but because I thought it would look like paid advertising in my newsletter.

I thought that I needed a very basic system in place. I already have the email addresses of my subscribers; I just need to mass email them and give them the option to unsubscribe as well. It sounds easy, but then again I didn’t want a rushed solution, so I decided to go ahead with PHPList, opting for a text link instead of the image credit to the authors.

I installed PHPList through Fantastico on Linux cPanel, and I’m sure you can install it with Softaculous as well. When the installation was complete, I quickly navigated to the interface but was met with a dead end. These are the problems I encountered:

500 internal server error

To fix this problem, I had to remove the line for PHP magic quotes from the .htaccess file. You can also just comment it out by placing the hash symbol (#) at the beginning of the line, as follows:

#php_flag magic_quotes_gpc on

Could not load xml for FCK Editor

When going into messages to compose a newsletter, I couldn’t type anything into the rich text box (FCK Editor) because there was a problem with loading the xml for that file. I had to modify the htaccess file to include the fckstyles.xml file in the allowed list as follows:

<FilesMatch "(index.php|dl.php|ut.php|lt.php|download.php|fckstyles.xml)$">
Order allow,deny
allow from all
</FilesMatch>

Users are set to receive TEXT instead of HTML by default when importing

If the users in your lists are set to receive text, your HTML newsletter will be converted to text when sent to them. This is really bad, especially if you know the users can receive HTML email. You can go to the main page in PHPList, where you’ll find the option to reconcile users, and from there you can “mark all users to receive html”. You can also add a column to your CSV file with the name “Send this user HTML emails” and a value of 1 so that all imported users default to receiving HTML emails.
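A minimal import file with that column would look something like this (the email addresses below are made up; any other columns are whatever your list already uses):

```csv
email,Send this user HTML emails
alice@example.com,1
bob@example.com,1
```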

Cron jobs not working

With the command-line cron job, the queue was not being processed because a login username and password were required, so I was getting emails asking me to log in. Mind you, the emails were text based and I had to copy the source to a new file and save it as HTML to view them properly. Anyway, I fixed the problem by using cURL and providing the username and password as follows:

curl 'http://www.domain.com/phplist/admin/index.php?page=processqueue&login=username&password=password'

My cron job runs every 10 minutes, as I have defined a batch size of 20 emails processed every 10 minutes: that is 20 × 6 = 120 emails per hour, because I’m on a shared server which only allows 150 emails per hour and I decided to play it safe with 120.
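For reference, a crontab entry for that 10-minute schedule would look something like this (the domain and credentials are placeholders, as above):

```crontab
# m h dom mon dow  command -- process the PHPList queue every 10 minutes
*/10 * * * * curl 'http://www.domain.com/phplist/admin/index.php?page=processqueue&login=username&password=password'
```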

And to stop the cron job from emailing me every time it runs, I had to redirect the output like this:

curl 'http://www.domain.com/phplist/admin/index.php?page=processqueue&login=username&password=password' > /dev/null 2>&1

The 2>&1 is required because it tells Linux to discard the error messages as well (so no mail of any sort is sent to you).

  • Number 0 is standard input
  • Number 1 is standard output
  • Number 2 is standard error
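A quick way to see those file descriptors at work (plain shell, nothing PHPList-specific): the subshell below writes one line to stdout and one to stderr, and each redirection silences a different stream.

```shell
# Writes to both streams: fd 1 (stdout) and fd 2 (stderr)
( echo "normal output"; echo "error output" >&2 )

# '> /dev/null' discards stdout only; "error output" still gets through
( echo "normal output"; echo "error output" >&2 ) > /dev/null

# '2>&1' then sends stderr to wherever stdout points (here /dev/null),
# so nothing is printed at all -- and cron has nothing left to email you
( echo "normal output"; echo "error output" >&2 ) > /dev/null 2>&1
```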

Anyway, so far so good. Let’s see if the newsletter brings in more traffic!

How to map another file extension to the php parser through htaccess

I was in the process of cleaning up my web hosting accounts when I came across one plan which had 2 sites on it. One of them was a static website (pure HTML) and the other was written in Classic ASP. I decided to move both sites to another hosting plan because the separate plan was just a waste of money: the sites did not have much traffic and there was no justification for keeping the hosting.

Since Windows hosting plans are more expensive, I wanted to move the sites to a Linux server. I knew I wouldn’t have any problems with the HTML website, but I wondered how much trouble it would be to move the Classic ASP one. The dynamic website was not actually using much ASP code; only the file names ended with an .asp extension, so a PHP parser could very well serve the pages. After much googling, I found that adding a line to the .htaccess file would map any file extension to PHP, and that would do the trick for me.

Mapping asp to php parser

The line below tells the server to treat files ending with the .asp extension as if they were PHP files.

AddType application/x-httpd-php .asp

However, the above line did not work for me, and I had a lot of problems trying to map the Classic ASP code before I found the solution. It has something to do with the version of PHP you’re using (mine was 5.12.53, I think). I added the following lines to my .htaccess file and it worked:

AddType application/x-httpd-php4 .asp
AddType application/x-httpd-php5 .asp

Both lines are required. There’s something else you can try if you still can’t get it to work: replace AddType with AddHandler (which one applies depends on whether PHP runs as an Apache module or as a CGI script). And sometimes you might need to drop the “application/” part and have only this:

AddHandler x-httpd-php5 .asp

If you wanted to map another extension, you would do this:

AddType application/x-httpd-php5 .html .htm .me .newextension

Files ending with .html, .htm, .me and .newextension will then be run through the PHP parser. This allows you to serve dynamic content through .html files, to hide (disguise) the programming language you’re using on your website, or to make it easy to move from one language to another: if you don’t use a file extension reserved for a particular programming language, you can switch to another language or server without changing your URLs.
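As a quick sanity check after adding the mapping, a hypothetical test file (test.asp here; the name is just for illustration) should print the server date rather than the literal code when requested through the browser:

```php
<!-- test.asp: if the mapping works, the PHP below executes
     and the page shows today's date instead of the raw code -->
<html><body>Today is <?php echo date('Y-m-d'); ?></body></html>
```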

While mapping another extension, I also did two other things:

Define a default document (so that index.asp would be recognised as a default webpage for the folder/directory)

DirectoryIndex index.html index.htm index.php index.asp

Disable directory browsing (so that folders/directories without a default webpage are not listed)

Options -Indexes

Well so far so good, now just waiting for DNS propagation to complete before I can cancel the hosting plan.


Fixing the problem with Static Compression Module frequentHitThreshold

While trying to speed up my website, I struggled with getting the StaticCompressionModule to work on my local machine. By default, the static compression module is enabled so you would think that all your static files like javascript and css will be gzipped before being sent to your browser. However this is not necessarily the case as I found out with Chrome Developer Tools and the Firefox YSlow plugin.

Although the module is enabled, static files are only compressed if they meet certain criteria determined by frequency of access, controlled by two parameters (frequentHitThreshold and frequentHitTimePeriod). The first is set to 2 and the second to 10 seconds by default. This means that if, within a period of 10 seconds, there are 2 requests for a static file, it will be sent gzipped. This is not what I wanted, so I needed to correct the behaviour.

It is possible to change the values of these parameters, but when I put them in the web.config file, I got the following error:

This configuration section cannot be used at this path. This happens when the section is locked at a parent level. Locking is either by default (overrideModeDefault="Deny"), or set explicitly by a location tag with overrideMode="Deny" or the legacy allowOverride="false".

I googled the problem and thought I’d give the command line a try:

%windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/serverRuntime -frequentHitThreshold:1

Substitute your windir accordingly.

That worked great, but I wanted to know why the web.config gave an error, and it turned out that the config needed to be within a location tag, as follows:


<location path="Default Website">
  <system.webServer>
    <serverRuntime enabled="true" frequentHitThreshold="1" frequentHitTimePeriod="00:00:10" />
  </system.webServer>
</location>

Once the value of frequentHitThreshold was changed to 1, I was able to see the content compressed before being sent out from the server to the browser. Sweet!