Digitizing and Geo-referencing Aeronautical Charts


One of the things I miss most about flying in the States after coming back to Canada is the huge number of aviation-related phone and tablet apps available there, thanks to the Federal Aviation Administration releasing almost all of its aeronautical products (navigational charts, airport directory, IFR approach plates, etc.) into the public domain in digital form. There are hundreds of apps that use all that data in just about every imaginable way.

The most famous one is probably SkyVector – a Google Maps-like site that offers a very easy-to-use flight planning interface based on FAA charts.

However, I believe the single most useful application is the in-flight moving map. If you have a phone or tablet with GPS, you can have a fancy handheld map with a little airplane symbol telling you where you are, overlaid on the appropriate navigation chart, in relation to all the airports and navaids around you.

That is great because it significantly increases the pilot’s situational awareness, especially when flying to unfamiliar places or over featureless terrain. Of course, it can’t be relied upon as a primary means of navigation, but extra safety never hurts, and knowing precisely where one is definitely increases safety. Imagine having a high-altitude engine failure at night. Wouldn’t it be nice to be able to fly directly towards an airport you can’t see yet because it’s dark, rather than picking a random direction to descend in and hoping the terrain is relatively flat at the end of the descent? I suppose you could try to figure out where you are by tuning a bunch of radios, plotting your position on the chart, drawing a line to the nearest airport, and measuring a heading to fly… by which time you are probably on the ground already, in one or more pieces.

Unfortunately, none of that is available in Canada. Here, aeronautical publications have been delegated to a private company called NavCanada, which has no interest in releasing digital publications, probably so it can sell as many paper charts as possible (which, by the way, are much more expensive than their American counterparts). All the paper charts are also copyright-protected and cannot be redistributed. It’s sad to see that even aviation safety is not a good enough reason for them to give up some of that commercial interest.

The official statement is that they are “looking into” how to distribute charts digitally. They have been saying that for at least 3 years now. Exactly how difficult is it to release a few images that they already have? Disgusting.

I’m tired of waiting, so I made my own. I scanned my copy of the Vancouver VFR Terminal Area Chart and geo-referenced it for use with a moving-map app called Naviator.

Naviator is a paid app ($15 one time, or $35/year for access to FAA charts). I highly recommend it, though – it’s the best of the few map apps I’ve tried. It also gives graphical TFRs, NEXRAD (precipitation), terrain, and satellite image overlays.

Unfortunately, I can’t distribute the chart I made online because of copyright, but I can write about how to make it. I mostly followed the instructions on the Naviator forum, but there were a few unclear places that required a bit of experimenting to figure out.

1. Get the chart scanned. Cutting it up, scanning it, and stitching the files back together is one option, but it’s a lot of work, and you lose a map (expensive). I just got mine scanned at a copy store. If you are in the Vancouver area, UBC Copiesmart can do large-format scanning; I think I paid around $10 for it. If you have a high-end camera, taking a picture (with good lighting) and doing perspective/lens correction may be an option, too.

2. Install Quantum GIS. Open it, but don’t load the map into the main app. Instead, go to Raster->Georeferencer->Georeferencer, then load the map (Open Raster). For the coordinate reference system, choose “WGS 84 / UTM zone 10N” (or the appropriate one for your map – UTM is Universal Transverse Mercator, and zone 10 covers the west coast). Then it’s time to add points!

3. Now the tricky part – to add a reference point, zoom in on one of the intersections of the map grid on the chart, click it, and type in the coordinates for that point. Be careful with the coordinate representation. The VTA grid is labelled in degrees and minutes, not decimal degrees. Degrees and minutes should be separated by a space instead of a decimal point. Also note that North America is west of the Prime Meridian, so by convention, longitudes should be negative.

For example, the intersection to the top left of YVR is at 123° 15′ W and 49° 15′ N. That should be entered as either x = “-123 15” and y = “49 15”, or x = “-123.25” and y = “49.25”. They are equivalent, since 15 minutes is a quarter of a degree.
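If you’d rather pre-compute the decimal form for all your grid points, here is a throwaway helper (not part of QGIS – just a sketch of the arithmetic): decimal degrees = degrees + minutes/60, with the whole value negated for west longitudes.

```shell
# dm_to_decimal DEGREES MINUTES
# Converts degrees+minutes to decimal degrees; pass negative degrees
# for west longitudes (the sign is applied to the whole value).
dm_to_decimal() {
    awk -v d="$1" -v m="$2" 'BEGIN {
        sign = (d < 0) ? -1 : 1
        printf "%.6f\n", sign * ((d < 0 ? -d : d) + m / 60)
    }'
}

dm_to_decimal -123 15   # 123° 15' W  ->  -123.250000
dm_to_decimal 49 15     # 49° 15' N   ->  49.250000
```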

4. Repeat 3) for 6-8 intersections on the map. Put more points around the areas you are most interested in, since the georeferencing will be most accurate near those points. Make sure the points cover at least 3 distinct latitudes and 3 distinct longitudes.

5. Click “start georeferencing” (green triangle on top).

Transformation type… I’m not really sure. I get good results with Polynomial 2, but it requires quite a few reference points (at least 6, since a second-order polynomial has 6 coefficients).

Resampling method = cubic.

Compression = NONE.

Output raster = where to store the output file (the program will output a TIFF with geo-referencing information built in).

Target SRS = “WGS 84 / UTM zone 10N”

Click OK. After a few minutes it will be done, and if you look at the GCP table, it will tell you what the error (residual) is for each reference point. For me, all of them were < 2 pixels. If one point has a much higher error than the others, it was probably entered wrong.

Also, open up the produced geotiff file with any image viewer, and do a sanity check that all longitude and latitude lines are vertical and horizontal respectively. If they aren’t, you probably didn’t use at least 3 longitudes and 3 latitudes when defining the points.

Optional: To combine multiple maps, install OSGeo4W, edit the source images so that parts that should be transparent are set to black, re-georeference, and use gdalwarp to combine them. Then use gdal2tiles to tile it (MapTiler is a graphical frontend to gdal2tiles).

gdalwarp -r lanczos -srcnodata “000000” sideB_geo.tif sideA_geo.tif vta_background_geo.tif vta_geo.tif combined.tif

Layers will be added in the order given, then the resulting tif can be tiled normally.

6. Install and open up MapTiler.

Tile profile: “Google Maps compatible (Spherical Mercator)”

Source data files – add the generated tiff file.

SRS: “WGS84” (not sure why UTM doesn’t work here)

Minimum zoom: “4”

Maximum zoom: “12”

File format: “Hybrid JPEG+PNG – only for Google Earth”

Choose destination folder, no viewers, choose a name, Render, wait a very long time.

7. Download and open up Naviator Chart Packager. Load up the directory created by MapTiler. It should generate a .chart file.

8. Copy the file to your device’s /Naviator/Charts/

9. Happy flying and don’t crash!

Since Android is not certified for primary navigation use, be sure to also have paper charts available, and always know where you are and be able to navigate without it. As we all know, phones and tablets fail or run out of battery pretty often. GPS can also fail (it happened to me once). On rare occasions (this also happened to me), GPS can even give you wrong positions. So always be prepared! GPS apps really can’t replace high-reliability aviation GPSes, and are definitely not good (both legally and practically) for IFR.

Human Sexuality – Evolutionary Psychology

In humans, Darwin made men and women adopt very different mating strategies –

The female reproductive strategy is one that favours quality over quantity, whereas the male reproductive strategy focuses on quantity over quality. That’s surprising, because for humans to reproduce, men and women have to work together. How can they work together when their strategies are so different, and how did it happen in the first place?

From an evolutionary standpoint, being a successful individual means being able to pass on one’s genes through mating and having successful offspring – making a gazillion babies and having them all die within a week is not very effective.

And for men and women, that requires very different strategies. For men, sperm cells are cheap and plentiful, so the optimal strategy is to impregnate as many women as possible, put in no commitment, and hope some of the offspring will survive. If there is an infinite supply of women willing to have sex (for example, if you are the Khan), it will work. Men are perfectly capable of impregnating at least a woman a day, and by chance, some of the children WILL survive to reproduce. Genghis Khan, for example, had at least hundreds of children.

On the other hand, for women, that would be a very crappy reproductive strategy, because of 9-month pregnancies. A woman can’t just go have sex with every man she sees, because once she gets pregnant, she can’t mate again for 9 months, even if a better guy comes along (better meaning having genes that are more likely to make offspring successful). Women have to be choosy and only have sex with men they deem worthy. They want men who are willing to put in a lot of commitment to help raise the children – unfortunately, that directly conflicts with the male strategy.

In the real world, compromises are made between men and women so that we can actually make babies so humans don’t go extinct. However, the fundamental differences still shine through in our modern society –

  • Females are choosier than males. Males have to compete for access to ANY female, while females only have to compete for “high quality” males, and only sometimes. This is true for all species with an uneven division of sacrifices between the sexes – e.g. male seahorses are responsible for carrying baby seahorses, so they get to be choosy and have females compete for them.
  • Females are more concerned about their partners becoming emotionally attached to another female, while males are more concerned about their partners having sex with another male – paternity confidence. Males need to make sure they are not wasting energy raising someone else’s child, and females don’t usually need to worry about that. On the other hand, females need to worry about their partner getting pulled into another relationship and ceasing to commit to this one.
  • Males like variety. Porn, prostitution, extra-marital affairs, sexual fantasies, etc. Yes, there are male porn actors, but even then, their target audience is usually homosexual men rather than heterosexual women.
  • Males are very willing to have sex, even with strangers. Most females aren’t.

Just thought that’s very interesting.

BTW, this is all from Dr. Paul Wehr’s PSYC 208. If you are staying at UBC for at least another year, you should take it!!

New Backup Strategy – Amazon Glacier

I recently learned about Amazon Glacier – a storage service optimized for reliable very long term storage and infrequent data retrieval.

Storage is very cheap at $0.01/GB/month, but it is unlike conventional online storage services in a few ways –

  • Uploads are free
  • Downloads are free for 5% of your data per month (eg. if you store 100GB, you can download 5GB per month). Additional downloading is pretty expensive ($0.12/GB).
  • Downloads have a ~4-hour latency.
  • Designed “to provide average annual durability of 99.999999999% for an archive”

These properties make Amazon Glacier an attractive option for offsite backup!

I’ve always wanted to do offsite backup of my data, but never really found a good way/place.

I have plenty of backup locally, but, for example, if my house burns down or Richmond sinks, I would still lose my data.

So I wrote a simple shell script to do automated incremental backup from my Linux file server to Amazon Glacier.


It’s one of my first shell scripts, so please do send advice my way.

To use the script, create a data directory somewhere (data inside this directory will be backed up), and also an empty backup directory somewhere – this is where backup archives will be stored, in addition to uploading to Glacier. It’s good to have a local copy of everything because if you need to restore from backup due to something like accidentally deleted files (not a harddrive failure), you can just restore from local archives instead of paying to retrieve them from Glacier.

The script, on every run, will either make a full backup (tar.gz file) of the data directory if no previous backups are found, or build an archive of only the files changed since the last backup, using tar’s incremental archive feature. In either case, it uploads the resulting archive to Amazon Glacier. It creates a log of the files changed in each run, the size of the incremental archive, any errors, etc. Every few runs (7 by default), it emails the log file to the address specified.
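The core of that logic fits in a few lines. Here is a minimal sketch (the paths and vault name are placeholders, not my actual configuration, and logging/emailing are left out):

```shell
#!/bin/sh
# Minimal sketch of the incremental-backup-to-Glacier idea.
DATA_DIR="/tmp/glacier-demo/data"      # directory to back up (placeholder)
BACKUP_DIR="/tmp/glacier-demo/backup"  # local archive storage (placeholder)
VAULT="my-vault"                       # Glacier vault name (placeholder)
mkdir -p "$DATA_DIR" "$BACKUP_DIR"

SNAR="$BACKUP_DIR/archive.snar"
ARCHIVE="$BACKUP_DIR/backup-$(date +%Y%m%d-%H%M%S).tar.gz"

# tar's listed-incremental mode: if archive.snar doesn't exist yet, this
# produces a full backup; otherwise, only files changed since the last run.
tar --listed-incremental="$SNAR" -czf "$ARCHIVE" -C "$DATA_DIR" .

# Upload the archive with glacier-cli (skipped here if it isn't installed).
if command -v glacier >/dev/null 2>&1; then
    glacier archive upload "$VAULT" "$ARCHIVE"
fi
```

Each run also leaves the archive in the local backup directory, which is what makes cheap local restores possible.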

Most of the configuration options in the script should be self-explanatory.

This script requires glacier-cli to upload to Glacier. glacier-cli requires the boto library (Python library for accessing Amazon services), as well as python-iso8601 and python-sqlalchemy.

The script can be run by cron if you want scheduled backups. Frequency is up to you, of course. I run mine once a day. It sounds like a lot, but it’s actually not much, since most data remains unchanged most of the time, and tar uses file modification times to look for modified files, which is very fast.
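A daily run would be a crontab entry along these lines (the script path is just an example):

```shell
# m h dom mon dow  command  -- run the backup every day at 3:00 am
0 3 * * * /home/me/bin/glacier-backup.sh >> /home/me/backup-cron.log 2>&1
```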

To force a complete backup, simply delete the “archive.snar” file in backup directory.

That’s it! If s*** happens, just get all the archives since the last complete backup from either Glacier or the local copy, and untar them in order, from the complete backup to the latest. It’s the same as extracting regular tarballs, but with an “--incremental” switch added. I didn’t write a script to automate restoring from backup because that shouldn’t happen too often. I have verified by hand that all the incremental stuff works (to Glacier and back).
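A self-contained round trip looks like this (throwaway temp paths, just to show the backup/restore order and the “--incremental” switch):

```shell
WORK=$(mktemp -d)
mkdir "$WORK/data" "$WORK/arch"
echo one > "$WORK/data/a.txt"

# Full backup: archive.snar doesn't exist yet.
tar --listed-incremental="$WORK/arch/archive.snar" \
    -czf "$WORK/arch/full.tar.gz" -C "$WORK/data" .

# Change the data, then take an incremental backup.
echo two > "$WORK/data/b.txt"
tar --listed-incremental="$WORK/arch/archive.snar" \
    -czf "$WORK/arch/inc1.tar.gz" -C "$WORK/data" .

# Restore: extract the full backup first, then incrementals in order.
mkdir "$WORK/restored"
tar -xzf "$WORK/arch/full.tar.gz" --incremental -C "$WORK/restored"
tar -xzf "$WORK/arch/inc1.tar.gz" --incremental -C "$WORK/restored"
```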

It’s very easy to add encryption, but I chose not to because it introduces an additional point of failure (losing the decryption key or forgetting the password), and a pretty significant one at that – I believe it would be the most likely failure mode of a backup scheme like this. I value data durability over privacy, and I’m not convinced Amazon will ever be interested in my data, so I store it un-encrypted.

In my setup, I actually have both the data directory and the backup directory on a RAID-5 array on my file server. The backup directory is not shared, so backups cannot be messed up by a user from a client PC. That means the only time I’d need to restore from Glacier is if I have simultaneous drive failures, my house burns down, or my server gets physically stolen. Most user-error-type scenarios can be covered by on-machine backups.

Why RAID when I already have backups? Because single hard drive failures are extremely common – I’ve had about 3 already in my ~10 years of using computers. I have ~100GB of data, which would cost about $120 to retrieve from Glacier, and I would also potentially lose up to 1 day of work, which would be annoying. Also, hard drives are cheap.

PS. I have also written an article on setting up a DIY Linux Network-Attached Storage (aka. file server).