Sunrise was at 06:16, low tide was at 04:43, and a first quarter moon rose at 14:53 and set at 23:54. Skies were very clear, and I started fishing at 05:45. I fished just south of the pier (a good fisherman could easily have cast to the bridge pier itself) on the east side with an incoming tide.
Ever had a situation where you want to send a file to a friend at work, and sending it over email makes you feel all dirty? One way to solve the problem is to copy the file into /var/www/foo or something and send a link... we've all done it. But there's a better way :)
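For comparison, one quick-and-dirty approach (not necessarily the one described here) is Python's built-in HTTP server, which serves the current directory over HTTP with a single command; the directory and port below are just examples:

```shell
# Serve ~/files-to-share on port 8000; your friend can then
# fetch the file from http://your-hostname:8000/
cd ~/files-to-share
python3 -m http.server 8000
```

This avoids both the email attachment and the /var/www shuffle, though it only runs while you leave the command running.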
Recently I was in the process of moving my site to a much better hosting situation (more on that later). During the move I decided to upgrade from PostgreSQL-8.0 to PostgreSQL-8.3 as I was pretty far behind and I prefer to stay current. This sort of upgrade isn't a big deal, and I've done it many times. So I did my usual process:
- Install the desired version of PostgreSQL (in this case 8.3)
- scp my last backup (taken a few minutes after I lock down the site) to the new host
- Run a script that essentially creates a new db, and restores the backup to it
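The restore script in the last step might look something like this (the database name and dump file here are placeholders, not the actual script from the post):

```shell
#!/bin/sh
# Create a fresh database and restore the backup into it.
createdb -E UTF8 mysite

# For a custom-format dump:
pg_restore -d mysite mysite.dump
# ...or, for a plain-text SQL dump:
# psql mysite < mysite.sql
```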
That's when things went horribly wrong. After a few searches I quickly learned that moving from 8.0 to 8.3 is a bit tricky when your database uses tsearch2. In the end the upgrade turned out to be really easy; read on if you want to know how I did it.
Git is by far my favorite version control system. I use it all the time, and one of its benefits is how easy it is to share code. I usually just send someone the link to gitweb and they can look at my code there. Other times people give me their public key, and they can clone my stuff over ssh. But just recently I wanted a way to easily allow anonymous read-only access. I was surprised to find that this isn't exactly straightforward on Ubuntu, but here's how I got it working...
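For the record, the standard mechanism for anonymous read-only access is git-daemon, which serves repositories over the git:// protocol; a minimal invocation looks like the following (the /srv/git path and repository name are assumptions for illustration):

```shell
# Mark a repository as exportable; git-daemon refuses to serve
# repositories that lack this magic file.
touch /srv/git/myproject.git/git-daemon-export-ok

# Serve everything under /srv/git, read-only, on the default
# git:// port (9418).
git daemon --base-path=/srv/git --reuseaddr /srv/git
```

Anyone can then clone with `git clone git://your-hostname/myproject.git` and no account or key is needed; the protocol is read-only by default.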
When it comes to performance, one of the most important considerations is caching of content. There are all sorts of approaches to caching: some protect the database from duplicate queries, while others protect your application from having to run an expensive algorithm over and over. Today I am going to talk about the most aggressive form of content caching on the web - full page caching.
Recently a friend of mine pointed out that I had an error on one of my pages. It took me almost 45 minutes to figure out what was happening. I wasn't able to reproduce the defect in my development environment. The version of Python installed was exactly the same. I tried executing the problematic piece of code on the production server and it did not reproduce the problem. All of my unit tests passed... I was at a loss as to the source of the problem.
It just so happened that a few months ago I had turned on a particular configuration option in Apache that influenced the way Python works. The reason I wasn't able to reproduce the problem in dev was that dev does not use the production configuration. The reason I wasn't able to reproduce it in a Python interpreter was that a bare interpreter doesn't care how mod_python works. Once I figured out what was wrong, the fix was very simple; in fact, all I had to change were two letters.
The moral of the story is: Testing is good, unit testing is great, but don't forget to test your configuration :)