
Climate model on a mac project: #14 Knowing when you quit…

When not using a scheduler like Torque/PBS, it can be complicated to find out whether the model has quit. If the run is successful, you have a reasonable idea of when it SHOULD quit, but it might crash long before that time, and by the time you notice, you can lose hours, if not days, of computing time.

One solution to this problem is a handy application that comes with Mac OS X: Automator. Automator is a simple way of getting the computer to do repetitive tasks. In this case, the tasks are 1) run the model, 2) open Apple Mail, and 3) send an email. Automator makes these steps easy. The prerequisite, however, is that you have a mail account already set up in Apple Mail.

For step 1, you need the “Run Shell Script” action. Simply write the shell script to cd to the working directory and execute the run script.
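The script itself only needs a couple of lines. The path and script name below are placeholders; substitute your own:

cd /Users/me/model/run
./run_model.sh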

For step 2, you need the “New Mail Message” action. Set the “To:” address, the subject, and message body to something meaningful. Also be sure you've selected the proper sending account at the bottom.

For step 3, all you need to do is actually have Mail send the message. To do this, you simply use the “Send Outgoing Messages” action.

To get this all to run, press play.

That's it!
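As an aside, if you'd rather skip Automator entirely, a rough command-line equivalent is to tack the notification onto the end of the run itself. This is only a sketch: it assumes outgoing mail actually works from your shell (it may not without extra setup), and the script name and address are placeholders:

./run_model.sh
echo "Model run finished with exit status $?" | mail -s "Model run finished" me@example.com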

Climate model on a mac project: #13 Post production!

I've managed to complete a 100 year simulation for the Carboniferous. Over this span, the machine averaged 13.78 model years per day, with a standard deviation of 0.13 model years per day, using all 8 cores. I can't fully explain the variation in speed. It is likely a combination of the computing activity of the software, hard drive interaction, the machine needing to do other work during the run, and heat. There is, however, no correlation between run speed and duration, so a direct connection between heat (using duration as a proxy, i.e. the longer the run, the hotter the machine) and run speed doesn't seem to exist. On the other hand, the longest duration run, over 6000 days, did produce the slowest run speed, 13.5 model years per day, though there is only one sample with such a long duration.

Now comes the post processing. I use a wide variety of “in-house” software to process the results and turn them into something manageable, along with some free and open source software. One package is NCO (NetCDF Operators). Since I'm on a mac, there are at least three ways to get Unix/Linux apps on the machine: 1) Fink, a package manager, 2) Macports, another package manager, and 3) compiling, i.e. the hard way.

I've had no luck installing NCO using Fink or Macports. However, I used Macports to install some of the dependencies for NCO, such as ANTLR. So, compiling I go!
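For reference, pulling in a dependency with Macports is a one-liner (the exact port name may vary):

sudo port install antlr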

One thing I've found to be extremely handy when plowing through a difficult compile is to first build a script in which you can set all your variables, switches, etc., so you keep a record of what you used.

Here's what my script ended up looking like:

#!/bin/sh

# Start from a clean slate
make distclean

# Compiler and library locations (specific to my machine)
export CC=cc
export ANTLR_ROOT=/opt/local
export CPPFLAGS=-I/usr/include/malloc
export NETCDF_INC=/opt/netcdf/gcc/4.0.1/include
export NETCDF_LIB=/opt/netcdf/gcc/4.0.1/lib

# Configure and build
./configure --disable-regex --disable-antlr --disable-shared
make

The malloc in the CPPFLAGS is there because many files could not be built without knowing exactly where malloc.h was located on the mac.

In any case, I only needed about one or two of the actual NCO commands.
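To give a flavor of the kind of commands I mean, here are two common ones, ncrcat and ncra, with hypothetical file names: ncrcat glues files together along the record (time) dimension, and ncra averages over it.

ncrcat year01.nc year02.nc year03.nc all_years.nc
ncra all_years.nc time_mean.nc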

Now, on to NCL (NCAR Command Language). NCL comes prebuilt for the mac: just install and go. However, it can be fickle. Today, it's just hanging on some files. We'll see what I can do to shake things loose.