Posts Tagged ‘automation’
Link roundups are legitimate blog posts, right? It doesn’t matter, because I’m going to post one anyhow.
We’ll start with programming-related topics:
- Michael Nachbaur writes about automating iPhone app builds with Hudson. Of course this idea is particularly appealing to me since I love teh Hudson (even when it gives me fits).
- The guys at Softwareschneiderei GmbH have an interesting series about speeding up build boxes by upgrading hardware. Spoiler alert: CPU is a big deal.
- Want to pair program and concerned about productivity? There’s a bonus video from the 2009 Brazilian Music Institute.
- Paul Graham lists 19 start-up surprises. Apparently being a start-up can be fun if you like roller coasters.
Next off to the bizarre sciency stuff:
- Ever wonder what the results of an fMRI would be on a dead salmon? Me neither, but the results are more interesting than you’d think.
What is more off-topic than dead salmon? These links:
- You should be playing Machinarium. It’s a flash-based (read: works on Windows, Mac, and Linux) point-and-click puzzle/adventure game. It’s worth the $20.
- Get in the Halloween spirit with geeky pumpkin carvings.
- Steve Russo recalls some of the top Walt Disney World failures. The Cakestle is truly tragic.
We ran into an interesting and less than informative error when configuring Maven with our Hudson installation. Maven worked great, as expected, but the Sonar plugin stopped working and was causing builds to fail.
The error message, a bare java.lang.NoSuchMethodError, wasn’t terribly helpful.
A little Googling turned up just two hits.
One result was helpful: it said the Sonar plugin is compatible with Hudson 1.306+. We’re currently running 1.303. We’re not exactly far behind, but apparently far enough behind.
Backing up Hudson
There is a backup plugin for Hudson, and the plugin would be ideal, but in case just installing the plugin screws something up, it’s best to do a manual backup first.
The easiest way to manually back up Hudson is to copy your entire Hudson working directory. However, space is limited for us, so a more selective backup was necessary. This script backs up the most important configuration files (it wouldn’t make for a pretty recovery, but it’d work):
#!/bin/sh
cp $HUDSON_HOME/*.jar $NEWHUDSON_HOME/
cp $HUDSON_HOME/*.xml $NEWHUDSON_HOME/
cp -r $HUDSON_HOME/plugins $NEWHUDSON_HOME
for job in $HUDSON_HOME/jobs/*; do
    echo "Processing $job"
    # $job is a full path, so copy under the job's name only
    name=$(basename "$job")
    mkdir -p "$NEWHUDSON_HOME/jobs/$name"
    cp "$job/config.xml" "$NEWHUDSON_HOME/jobs/$name/config.xml"
done
All aboard the fail boat
The upgrade appeared to go well, but after manually starting the Windows service, I got an error. Amusingly, hudson.err.log showed some slight inconsistencies:
Jul 27, 2009 2:38:03 PM hudson.model.UpdateCenter$DownloadJob run
INFO: Installation successful: hudson.war
Invalid or corrupt jarfile C:\hudson\hudson.war
Hudson the Butler can’t make up his mind; it’s claiming success before imploding on itself.
Skimming around, very annoyed that my butler had blatantly lied to me, I noticed hudson.war was sitting at only 2 MB. Yeah, that can’t be right.
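A quick way to confirm a suspicion like this is to check whether the file even starts with the ZIP magic bytes (“PK”) that every valid jar/war begins with. Here’s a sketch; the file paths are made up for demonstration (a real check would point at your actual hudson.war):

```shell
# Any valid jar/war is a ZIP archive, so it starts with the bytes "PK".
# Create a stand-in "corrupt" file to demonstrate the check.
printf 'HTTP error page, not a war' > /tmp/suspect.war

if head -c 2 /tmp/suspect.war | grep -q 'PK'; then
    echo "looks like a valid archive"
else
    echo "invalid or corrupt archive"
fi
# prints "invalid or corrupt archive"
```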
Luckily the fix was easy: manually downloading the newest hudson.war and replacing the corrupt copy got Hudson running again.
It turns out I didn’t really need to back up Hudson. Though naturally, if I hadn’t backed up Hudson, I would have needed my backup!
Upgrading to Hudson 1.317 solved the mysterious java.lang.NoSuchMethodError. I would not have thought configuring a Maven installation rather than using Hudson’s default would cause issues. Go figure.
Every time you break the build, God kills a kitten. Please think of the kittens.
I’ve noticed for a while our unit tests run very fast through my IDE but take forever on our build box. At first I attributed this to our severely overloaded build box, but I was wrong.
In our particular case the tests take 5 minutes 3 seconds to run through Ant while forking for each test class. Very not cool.
If you’re using fork to run your JUnit tests in Ant, there are two attributes to be concerned with: fork and forkmode.
The forkmode attribute is the big one here. Possible values are “perTest” (default), “perBatch”, and “once”.
It turns out “perTest” is the default, meaning Ant forks a new JVM for each test class. While that might be what you want, it can make your tests significantly slower. Using the “once” option instead means forking happens just once; all your tests will be run together in a single JVM.
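For reference, the relevant bit of the Ant build file looks something like this. The fork and forkmode attributes are standard on Ant’s junit task; the classpath refid and directory paths below are placeholders for your own layout:

```xml
<junit fork="yes" forkmode="once" printsummary="on">
    <classpath refid="test.classpath"/>
    <batchtest todir="build/test-reports">
        <fileset dir="build/test-classes" includes="**/*Test.class"/>
    </batchtest>
</junit>
```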
After switching the forkmode to “once”, our test running time plummeted to 33 seconds for a net gain of 4.5 minutes. As Borat would say, “very niiiice”. (In this case 33 seconds is still longer than it should be, but that’s for another post).
There are some very good technical reasons to run all your tests in the same JVM. As Chris at nominet.org.uk pointed out, it might reveal some issues with your production code or test code:
Having each test run in its own JVM also covers up problems: you can create whatever sort of mess you like, and it’ll all be swept away before the next test runs. While setting the forkmode to “once”, I found a database connection leak in some test code. This sort of problem is much more visible when all the test code runs in a single JVM, and it reminded me of Martin Fowler’s advice on testing resource pools.
Of course Martin Fowler comes through again with good advice.
We all know that backing up critical data is very important. Unfortunately, it’s not always very clear how to go about doing that.
Backing up all databases
Backing up all databases is as easy as this:
mysqldump -u username --password=password --all-databases -c > mysqldump.sql
Did you expect it to be much harder? Sorry to disappoint.
The --all-databases option backs up all databases (genius naming!). The --complete-insert option, or -c, uses complete INSERT statements in the backup script.
With the above command, mysqldump outputs all the SQL needed to regenerate all databases from scratch with all the data from the time of the dump. Users and other MySQL tables are also included in the dump.
Backing up a specific database or table
Sometimes it’s useful to only backup a small portion of the MySQL instance, either an entire database or a specific set of tables.
Not surprisingly, you can backup a specific database with mysqldump:
mysqldump -u username --password=password YourDatabase > yourdatabase.sql
It’s just as easy to backup one or more tables in a database:
mysqldump -u username --password=password YourDatabase table1 table2 > yourdatabase.sql
Compressing the backup
If you have a lot of data, compressing the dump with gzip will save a lot of disk space:
mysqldump -u username --password=password --all-databases -c | gzip -9 > mysqldump.sql.gz
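The pipeline is easy to verify end to end; here’s a sketch with a line of fake SQL standing in for real mysqldump output. To restore a compressed dump against a real server, you’d pipe the gunzip -c output into the mysql client instead of stdout:

```shell
# Compress a stand-in "dump" with gzip -9, then read it back.
printf 'CREATE TABLE t (id INT);\n' | gzip -9 > /tmp/dump.sql.gz

# Decompress to stdout; in a real restore this would be piped into mysql.
gunzip -c /tmp/dump.sql.gz
# prints "CREATE TABLE t (id INT);"
```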
If you’re dealing with a database of any significant importance, it’s critical to automate the backup process. In the Unix world there is cron. I won’t go into specifics on how to use cron since it is easily worthy of its own posting.
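Still, to give a flavor of it, a single crontab line covers the basic case. Everything here (the user, password, paths, and schedule) is a placeholder; note that percent signs have special meaning in crontab and must be escaped:

```
# Hypothetical crontab entry: dump everything nightly at 02:30,
# writing a dated, compressed file. \%F escapes the % for cron.
30 2 * * * mysqldump -u backup --password=secret --all-databases -c | gzip -9 > /backups/mysql-$(date +\%F).sql.gz
```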
If you ever need your backup, it’ll suck. With backups, it’ll suck less. With automated backups, it’ll suck much much less.
mysqldump isn’t the right tool for all your MySQL backup needs, but its ease of use gives you no excuse. More complicated backup techniques, including incremental backups, can be found in the MySQL reference guide.