Examples vs. Tutorials

I’ve been writing software since my first computer in the 1980s (a Commodore 64). Most of that time, I’ve written software for myself, but I released my first app to the public in the 1990s. Over all that time, I’ve learned and forgotten quite a bit about computers, computer languages, and how to write code. One thing I’ve never been able to adequately express is how I learn all this stuff. Knowing how you learn something is important because it should inform how you tackle something new.

Hit with a code signing fail?

One of the frustrating things about developing for macOS and iOS is code signing. Code signing, to put it extremely simply, is a way to make sure that the apps we submit to Apple and the apps you buy from the App Store are legitimate. But sometimes this process can go wrong, or at least seem to go wrong.

Returning to An Old Project: Part 2

iOS Size Classes, Navigation Bar Titles, UI Testing

I was working on a particularly difficult part of my user interface when I ran into an interesting and confusing problem. UI Testing is a new feature in Xcode, and despite its newness, it has already saved me development time. However, I ran into this problem when I started using it with an app I’ve been trying to update for a while.

Why you need the hardware to develop for mobile...

Probably the second biggest roadblock to developing mobile apps is the cost of hardware (the first, of course, is the time). Devices are expensive and it’s tough to keep up. My current stable of test devices is actually quite limited. Indeed, some of my devices are no longer supported by their manufacturers, or by me.

Working on migrating AE to OpenGL ES 2.0

Now that I have a break from other duties, I’ve been working on improving AE for iOS. One of the early design decisions I made in AE was to go with OpenGL ES 1.1. At the time, my only iOS device did not support OpenGL ES 2.0, so for practicality’s sake, 1.1 made sense. Furthermore, I needed some basic functionality that was stripped from OpenGL ES 2.0. ES 2.0 was clearly unattractive… at the time…

Unexpected problem with the App Store - App icons

Migrating to Xcode 5 has largely been very helpful. I’ve used many of the new features and I’m quite happy with how things are working out.

Redesigning my OpenGL engine for Ancient Earth

In case you were wondering about any updates to Ancient Earth, I’m working on them. Right now, I’m spending a good deal of time trying to modernize my OpenGL engine to take advantage of Apple’s updates in iOS 5 (yes, that means I’m dropping support for iOS 4). It turns out that Ancient Earth really pushes the limits of iOS, and for us to be able to add more data and map types, we need not only a more efficient rendering system but also a more flexible one.

Ancient Earth Released for iOS!

Last month, I released my iOS app “Ancient Earth: Breakup of Pangea”. Working with C.R. Scotese, we’ve brought his maps to the iOS platform! In this app, you can explore his continental plate reconstructions for the last 200 million years of Earth history.

Reading SQL with PySqlite

About a year or so ago, I wrote a special script to run the FOAM climate model. The primary goal of this script, besides running the model, was to store a wide variety of information about the run, including settings, system information (like CPU temperature), and the timing and duration of the run. The script stores some of this information before the model starts and the rest after it ends. It’s a great log of my model-run and system-performance history.
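
As a minimal sketch of reading such a log back with Python’s sqlite3 module (the file, table, and column names here are hypothetical, not my script’s actual schema):

    import sqlite3

    # Open the run-log database (hypothetical file name).
    conn = sqlite3.connect("model_runs.db")

    # Iterate over the logged runs; column names are illustrative.
    for started, duration, cpu_temp in conn.execute(
            "SELECT started, duration, cpu_temp FROM runs ORDER BY started"):
        print(started, duration, cpu_temp)

    conn.close()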

Finishing up DLD 2.0

It’s been a long time coming, but I’m finishing up DLD 2.0 for iOS. I hope to submit next week.

Migrating the Devonian Lithological Database to a Fully Relational System: The Story So Far

The Devonian Lithological Database (DLD for short) is a database I published as part of my PhD work at the University of Arizona. As databases go, it was quite primitive, but it got the job done. Over the past year or so, I’ve been migrating the database to a more modern SQL format using SQLite. SQLite is a public-domain database designed to work without a server. It is easy to use (for a SQL database) and the data file is generally cross-platform.
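
To give a flavor of what “fully relational” means here, a sketch of the general shape of such a schema using Python’s sqlite3 (the file, table, and column names are hypothetical, not DLD’s actual schema):

    import sqlite3

    conn = sqlite3.connect("dld.sqlite")  # hypothetical file name
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS localities (
            id        INTEGER PRIMARY KEY,
            name      TEXT,
            latitude  REAL,
            longitude REAL
        );
        -- Each lithology record points back at its locality.
        CREATE TABLE IF NOT EXISTS lithologies (
            id          INTEGER PRIMARY KEY,
            locality_id INTEGER REFERENCES localities(id),
            lithology   TEXT
        );
    """)
    conn.commit()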

A Case for Spatially-Enabled Reference Databases

In my latest video, I try to make the case that spatially-enabled reference databases are very powerful tools. To that end, I show how such a reference database can be used when developing geological databases like the Devonian Lithological Database.
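
The core idea: once each reference carries coordinates, you can query your literature geographically. A minimal sketch with Python’s sqlite3, assuming a hypothetical papers table with latitude and longitude columns:

    import sqlite3

    conn = sqlite3.connect("references.sqlite")  # hypothetical file name

    # Find every paper whose study area falls in a bounding box.
    rows = conn.execute(
        """SELECT title, latitude, longitude FROM papers
           WHERE latitude  BETWEEN ? AND ?
             AND longitude BETWEEN ? AND ?""",
        (30.0, 45.0, -110.0, -90.0))  # illustrative box
    for title, lat, lon in rows:
        print(title, lat, lon)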

Why should scientific papers be "spatially enabled"?

Now that I’m starting to build the databases needed for my new lithological database, I’m coming back to how I created my Devonian database. The papers I generally worked with contained reports from the field, including lithology, measurements, location, etc. That can be a LOT of information. Collecting it all from each paper is time consuming, to say the least. However, there was another problem…

Developing a new lithological database: Can I do it better this time?

It’s now over 10 years since I published the Devonian Lithological Database as part of my PhD thesis. Clearly, it’s not perfect, or even what I’d consider “finished”, but I’m proud of the work anyway. The data I collected have been used by oil companies and incorporated into newer and bigger databases. I hope people will still find it useful for years to come.

Reading OM3 files with Photoshop

Visualizing initial condition files can sometimes be problematic. OM3 files, for example, are straight binary files of 16-bit data. However, these files can be quickly visualized with Adobe Photoshop. View this tutorial to find out how:
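
For those who prefer a script to Photoshop, the same raw-import idea can be sketched in Python with numpy; the file name, byte order, and grid dimensions below are assumptions, not the actual OM3 specification:

    import numpy as np

    # Read the whole file as raw 16-bit integers.
    # Big-endian (">i2") is an assumption; try "<i2" if the result looks wrong.
    data = np.fromfile("ocean_ic.om3", dtype=">i2")

    # Reshape to the model grid (illustrative dimensions).
    grid = data.reshape(128, 256)
    print(grid.min(), grid.max())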

FOAM Output Variables

Since I get many questions on what’s contained in FOAM output, here’s a list of all the variables contained in the standard atmosphere, coupler, and ocean output files.
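
If you want to extract that list from a file yourself, and assuming the output files are netCDF (the file name below is hypothetical), the netCDF4 Python package can do it:

    from netCDF4 import Dataset

    # Open one of the output files and list its variables.
    ds = Dataset("atmos_history.nc")
    for name, var in ds.variables.items():
        print(name, var.dimensions)
    ds.close()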

Using SQLite and Python to Store Model Metadata

As I continue to run a range of climate models, I’ve learned through painful lessons that I need to record as much information about each model run as possible. When I first started this process, I simply kept the files used to make the run (the geography and configuration files for the model) and the model output. At first, this seemed sufficient because, in the end, these were the data that were most important. As it turns out, however, having a history of everything you did during the model run, such as adjustments to the settings or geography, is also important, both as a record of the run and for sorting out problems later.
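
A minimal sketch of the idea, bracketing the run with writes to a SQLite file (the file name, table, and columns are hypothetical, not my actual schema):

    import sqlite3
    import time
    import platform

    conn = sqlite3.connect("model_runs.db")  # hypothetical file name
    conn.execute("""CREATE TABLE IF NOT EXISTS runs (
        id       INTEGER PRIMARY KEY,
        started  REAL,
        finished REAL,
        hostname TEXT,
        notes    TEXT)""")

    # Record what we know before the model starts.
    run_id = conn.execute(
        "INSERT INTO runs (started, hostname, notes) VALUES (?, ?, ?)",
        (time.time(), platform.node(), "settings tweaked for new geography")
    ).lastrowid
    conn.commit()

    # ... launch the model here ...

    # Close out the record after the model ends.
    conn.execute("UPDATE runs SET finished = ? WHERE id = ?",
                 (time.time(), run_id))
    conn.commit()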

Trying to go paperless

Almost since the inception of personal computing, the paperless office has been an unrealized promise. I’ve wanted to move in that direction for many years, but it’s still not easy to do. Today, my office and basement are filled to capacity with documents, scientific papers, and books. I’m out of space, and managing everything is killing my productivity. I throw away and give away what I can, but that makes only a small dent in the problem. Some stuff, sadly, I’ll always have to keep as paper. The remaining stuff, however, needs to move to the computer.

Climate Model on a Mac: Snow Leopard

Snow Leopard, the latest OS offering from Apple, promised to be both 64-bit and faster. The question is whether Apple delivered on those promises and whether the improvements matter for modeling.

Snow Leopard Services to Geotag iPhoto Images

In one of the updates to iPhoto 8, Apple added the capability to use AppleScript to geotag photos, that is, to embed latitude, longitude, altitude, and other geographic information. Adam Burt noticed this update and suggested a simple AppleScript to geotag a photo with a specific latitude and longitude (see here).

More Snow Leopard Geotagging With Services: Google Earth

As mentioned before, I feel geotagging is an important part of image metadata. In a previous post, I showed a simple AppleScript-based Snow Leopard service to set images to a commonly used location.

Adding Climate Model Content To Site

As part of the ongoing efforts to make simulations run by PaleoTerra more useful, we are updating the site to list all climate models run by PaleoTerra and to start making the climate model reports, images, and animations available online to clients.

Climate Model on a Mac #15: Watch those dynamic libraries!

I recently upgraded my PGI compiler to version 8, and I had tons of trouble getting the climate model to compile and run. In this case, I decided to switch from MPICH 1.2.7 to OpenMPI, in the hope it would be better for the system and easier to set up.

My Drobo: A few months in...

Well, I’ve had my Drobo for a while now. I really have no complaints about the hardware; it works as expected and works well. I do have issues, however.

Revamping my data protection plan

Revamping might not be the right word, since I don’t have a written plan, but I’m at least re-evaluating what I do to protect my data.

Refactoring your code!

My current programming project is an Objective-C application for MacOS X that generates climate model results in the form of a web site and a PDF. It’s actually a complete rewrite of an existing application.

The Drobo so far

With the Drobo up and running, I’m feeling a little safer. Right now, I have about 2 TB of disk space, which gives me just under 1 TB of available disk space. Of that space, I’ve used about half.

Another test of my backup strategy

It happened again! I had a hard drive crash. This time, the drive was a Seagate 1 TB drive about 60% full.

Whither MacOS 9 Classic: Time to update my data

The following is a post I started a while ago, and I’m not sure I ever finished it. So, here’s a quick re-write…

Climate model on a mac project: #14 Knowing when you quit...

When you’re not using a scheduler like Torque/PBS, it can be complicated to find out whether the model has quit. If the run was successful, you can have a reasonable idea of when it SHOULD quit, but it might crash long before that time. As a result, you can lose hours, if not days, of computing time.
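
One low-tech safeguard is a watchdog script that notices when the model’s log file stops growing. A minimal sketch in Python, with an illustrative log name and staleness threshold (not what I actually use):

    import os
    import time

    LOG = "model_output.log"   # hypothetical log the model appends to
    STALE = 30 * 60            # alert if nothing written for 30 minutes

    while True:
        age = time.time() - os.path.getmtime(LOG)
        if age > STALE:
            print("Model appears to have quit: log stale for %d seconds" % age)
            break
        time.sleep(60)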

Climate model on a mac project: #13 Post production!

I’ve managed to complete a 100-year simulation for the Carboniferous. Over this span, the computer’s performance averaged 13.78 model years per day, with a standard deviation of 0.13 model years per day, using all 8 cores of the machine. I can’t pin down the cause of the standard deviation; it is likely a combination of the computing activity of the software, hard drive interaction, the machine needing to do other work during the run, and heat. However, there is no correlation between run speed and duration, so a direct connection between heat (using duration as a proxy, i.e., the longer the run, the hotter the machine) and run speed doesn’t seem to exist. On the other hand, the longest duration run, over 6000 days, produced the slowest run speed, 13.5 model years per day. However, there is only one sample with such a long duration.
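
For anyone who wants to run the same check on their own logs, the correlation test is a few lines of numpy; the numbers below are made up for illustration, not my actual measurements:

    import numpy as np

    # Hypothetical per-run measurements: wall-clock duration and speed.
    durations = np.array([2900.0, 3100.0, 3400.0, 6100.0])
    speeds = np.array([13.9, 13.8, 13.8, 13.5])  # model years per day

    r = np.corrcoef(durations, speeds)[0, 1]
    print("correlation between duration and speed: %.2f" % r)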

Climate model on a mac project: #12 In Production!

At last, the new machine is doing production work, and it’s been running for almost 24 hours so far. It’s running a bit slower than anticipated, about 13.5 model years per day. I’m not sure what’s causing the slower speed, but it’s still acceptable considering my original estimate was only 7 model years per day.

Climate model on a mac project: #11 sysctl.conf and mpich

A key component for running the climate model I’m using is MPICH. MPICH is an implementation of the MPI (Message Passing Interface) library for supercomputing maintained at Argonne National Laboratory (disclosure: I am a former employee of the lab). The climate model uses MPICH to break down the world into smaller parts and distribute those parts to each CPU. However, these parts must also talk to each other because climate information must move from processor to processor so that each processor knows what’s going on around it.
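
A common reason these two meet: MPICH’s shared-memory communication can bump into the kernel’s System V shared-memory limits, which on MacOS X are raised in /etc/sysctl.conf. A sketch with purely illustrative values, not a recommendation:

    # /etc/sysctl.conf -- illustrative values, adjust for your RAM
    kern.sysv.shmmax=268435456   # max size of one shared-memory segment (bytes)
    kern.sysv.shmmin=1
    kern.sysv.shmmni=128         # max number of segments system-wide
    kern.sysv.shmseg=32          # max segments per process
    kern.sysv.shmall=65536       # total shared memory, in 4 KB pages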

Climate model on a mac project: #10 Rebuilding

Now that the model clearly works using the test license from Portland, it’s time to rebuild the system. So far, I’ve reinstalled MacOS onto the system and installed the developer tools.

Climate model on a mac project: #9 Getting it running in MacOS X

One misgiving about running a climate model on the Mac is that I don’t trust Linux to manage the Mac’s fans and keep the machine cool. Apparently, there is even more to it: the Mac will also throttle memory bandwidth if the RAM gets too hot (see technote TN2156). Whether a Linux install not targeted at a Mac would handle these issues, I don’t know.

Climate model on a mac Project #8: Memory Upgrade...

The minimum RAM requirement for the climate model was 2 gig, which is what came with the computer. However, I decided to get 8 more gig for the system. At first, I left the original 2 gig in the machine (arranged according to the docs). However, this caused an enormous performance hit: the system was 12% slower running the model. Dropping the original 2 gig brought the speed back up, more or less, to my original performance figures. I’m still not sure whether this was a memory-arrangement problem or a mixed-card-size problem…

Climate model on a mac Project #7: Performance continued

After attempting a slightly longer R15 run of the model, performance stayed about the same: approximately 72 model years per day. I was hoping for more, but that’s quite respectable.

Climate Model on a Mac Project #6: Success!

After a long struggle trying to work with various compilers, today, I decided to try the Portland C/Fortran compilers. I’ve had success with them before. Fortunately for me, it worked! I actually got the model running at a low resolution! Tomorrow, I’ll try a high resolution.

Climate Model on a Mac Project #5: Installing additional software

The climate model I’m planning to run is compatible with the g95 compiler, not gfortran. So, the first thing was to download and install the g95 binaries.

Climate Model on a Mac Project #4: Installing ROCKS

ROCKS is an NSF (National Science Foundation) supported project that develops and maintains cluster environments, providing cluster installs for a wide variety of platforms. It includes the usual tools, like PBS, Ganglia, and many others.

Climate Model on a Mac Project #3: First snag

Yesterday, after bringing the machine to my office, I ran into the first snag. It wouldn’t boot! It would get to a “blue screen” and die. So, I headed back to the Apple Store, and they tested the machine. Annoyingly, it worked fine for them. It turns out that I was using a cheap third-party flat screen with a DVI-to-VGA converter. Sadly, this wasn’t good enough for this machine.

Climate Model on a Mac Project #2: Unboxing

I’ve posted unboxing photos at my personal website.

Climate Model on a Mac Project #1

Computer clusters can be expensive to buy and expensive to maintain. My current Linux cluster has 18 CPUs, takes up about 13U of rack space in a colocation facility, and uses 17-18 amps of power.

Disaster Recovery

My big fear is losing all my data in some sort of system crash. Well, I just had a system crash last week (a bad hard drive). As it turned out, it was only marginally painful. My backup system was designed for managing my data, and fortunately, my data were on the drive that survived. The dead drive housed the system OS.

SATA vs. Firewire vs. USB 2.0

My backup habits have been in need of a serious evaluation. One area is the mix of storage media I use for longer-term backup. Up until now, I’ve been using DVDs, but given the hundreds of gigabytes of data I can create in a month or two, and questions regarding the long-term storage of writable DVD media, I’ve begun delving into the possibility of using hard drives as part of an overall strategy. But the world of external drives has become exceedingly complicated: FireWire (IEEE 1394), USB 2.0, and, more recently, SATA/eSATA.

Thomas L. Moore
