Just recently I finally got around to trying the fairly new raster support in PostGIS 2 and above; this allows you to store raster maps in a geographically enabled Postgres database alongside the more commonplace vector maps.
Although it could be used for storing background maps, raster support is especially useful for managing elevation data like the OS’s Terrain50 data.
So this (rather long) piece looks at loading raster data into a database and then accessing it. It assumes you already have a working installation of Postgres with PostGIS 2.x – if you don’t, pop over to EnterpriseDB for the installers.
Loading the data
Loading raster data is currently done with a command-line tool that gets installed with Postgres when you add the PostGIS 2.x extension. So to load the OS terrain data (on Windows) you’ll need to create a quick batch file that does something like this:
for /R "D:\terrain50" %%I in (*.zip) do ( "D:\Program Files\7-Zip\7z.exe" x -y -o"D:\terrain50\unzip" "%%~fI" )
raster2pgsql -s 27700 -I -C -M D:\terrain50\unzip\*.asc -F -t 200x200 os.terrain50 | psql -U postgres -d gisdb
Terrain50 data arrives in a large number of zipped archives located in a series of subdirectories for each OS tile (SU, HP, etc), so the first line uses 7-zip to unpack all the individual elevation raster images into a single directory, and second loads the data.
raster2pgsql creates an SQL script that inserts the ASCII Grid formatted elevation data files into a table, creating the table first if it doesn’t exist. The output is piped to the psql command, which executes the script.
It’s worth noting a few of the options:
-s 27700 – sets the co-ordinate system to the OS SRID
-I – creates a spatial index on the data to significantly improve the table’s performance
-t 200x200 – is the tile size in pixels. Terrain50 data is supplied in 10km by 10km squares where each pixel represents 50 metres
os.terrain50 – is the target table
(running the command with no options will give you a full list, including how to specify the server, etc)
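Once the load finishes, a quick sanity check is to query PostGIS’s raster_columns catalogue view – the -C option adds the constraints that populate it. A minimal sketch, assuming the table name from the load command above:

```sql
-- Confirm the raster table was registered with the expected
-- SRID (27700) and 200x200 tile size
SELECT r_table_name, r_raster_column, srid, blocksize_x, blocksize_y
FROM raster_columns
WHERE r_table_name = 'terrain50';
```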
When it’s finished loading you should have a new table with three columns and a little over 2800 rows:
rid – a unique raster id number
rast – the geographical raster data in a single unreadable column
filename – the name of the source raster file (e.g. SP50.asc)
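With the table in place you can already pull a single elevation out of it. As a sketch (the easting/northing here is an arbitrary example point, not one from this piece), ST_Value reads the pixel under a point built in the same SRID:

```sql
-- Elevation in metres at an example OS grid point;
-- ST_Intersects restricts the scan to the tile containing it
SELECT ST_Value(rast, ST_SetSRID(ST_MakePoint(445000, 205000), 27700)) AS elevation_m
FROM os.terrain50
WHERE ST_Intersects(rast, ST_SetSRID(ST_MakePoint(445000, 205000), 27700));
```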
Using the data
So you now have a table containing the elevation data for the whole of Great Britain. If you switch to QGIS and look in the DB Manager you’ll now see the table listed, and you could right-click on it and add it to your map canvas – if you really wanted to load the whole of Great Britain in one go.
So far everything feels familiar if you’re used to playing with vector data in a Postgres database. So here’s the first difference I’ve found – if you did add the table to your canvas and right-click on it there’s no “filter” option to reduce the data down to, say, a single tile. Equally, you can’t simply add an SQL query in the DB Manager to limit the scope.
Instead you need to create a database view of the data you want, and that can be loaded. But first…
Tiles? What tiles?
And here is the next trick I stumbled upon. The OS data is delivered in 10 km tiles – but which tile do you need for a specific area?
Charles Roper has the answer – he’s created a series of GIS shape files for each of the usual tile sizes. You can download them here.
If you load these into your database, then you can do quick look-ups to see which tiles you need, say, for a given postcode using the open data Codepoint:
SELECT p.postcode, osgb.tile_name
FROM os.osgb_grid_5km AS osgb,
os.codepoint AS p
WHERE ST_Within(p.the_geom, osgb.the_geom)
AND p.postcode = 'OX285FE';
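The same look-up works against a whole boundary rather than a single postcode. For instance – assuming the BoundaryLine district table used later in this piece – listing every 10 km tile a district touches:

```sql
-- All OS 10km tiles intersecting the West Oxfordshire boundary
SELECT osgb.tile_name
FROM os.osgb_grid_10km AS osgb,
     os.district_borough_unitary AS d
WHERE ST_Intersects(d.the_geom, osgb.the_geom)
  AND d.name LIKE 'West Oxfordshire %';
```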
Viewing the rasters
Back to viewing raster data. Let’s say you want to load the terrain information for all of West Oxfordshire; you’ve loaded Charles Roper’s data and you have the free OS BoundaryLine data.
First you’ll need to create a database view which contains just the data you want with something like this:
CREATE OR REPLACE VIEW os.terrain50_west_oxon AS (
SELECT 1 AS rid, ST_Clip(ST_Union(t.rast), d.the_geom) AS rast
FROM os.osgb_grid_10km AS osgb,
os.district_borough_unitary AS d,
os.terrain50 AS t
WHERE ST_Intersects(d.the_geom, osgb.the_geom)
AND d.name LIKE 'West Oxfordshire %'
AND t.filename = osgb.tile_name || '.asc'
GROUP BY d.the_geom
);
(This could all be entered from the DB Manager SQL Window within QGIS or externally using pgAdmin, etc.)
Let’s unpack this a bit. The query uses three tables – the terrain rasters, Charles Roper’s 10km look-up table, and the District Council boundary data:
d.name LIKE 'West Oxfordshire %' – limits the district table to just West Oxfordshire
ST_Intersects(d.the_geom, osgb.the_geom) – picks only those OS tiles that intersect with West Oxfordshire
t.filename = osgb.tile_name || '.asc' – links the list of tiles to the terrain file names
ST_Clip(ST_Union(t.rast), d.the_geom) – merges all the geographical raster data into a single block (ST_Union) and clips it to the West Oxfordshire boundary (ST_Clip)
Now, when you return to the DB Manager in QGIS you’ll notice there is an additional raster layer, and adding that to the canvas will load just the clipped data.
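The view is also queryable directly. As a rough sketch – assuming the view’s raster column is called rast – ST_SummaryStats gives the elevation range across the clipped district:

```sql
-- min/max/mean elevation (metres) across the clipped raster
SELECT (ST_SummaryStats(rast)).min  AS min_elev,
       (ST_SummaryStats(rast)).max  AS max_elev,
       (ST_SummaryStats(rast)).mean AS mean_elev
FROM os.terrain50_west_oxon;
```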
At first sight this might all seem like a lot of work just to load some raster data, but once the data is in the database the process is actually pretty efficient, and creating a view is not really any more cumbersome than writing the equivalent query to load the data directly.
On the up-side, I’ve found the performance of indexed raster tables to be very, very good – once the script is ready the view is created in around 20 ms, and loading it into QGIS is just as quick as any vector data.
Also, being able to load the tiles on the fly with a simple look-up is so much easier than having to search for the right tiles, load them individually, and then often create a virtual raster to merge the tiles into a single elevation model to work on.
This is all new to me, and the array of raster functions in PostGIS looks pretty daunting at the moment, but just being able to load raster maps on the fly without having to waste time looking up the right tiles is worth the effort – who knows where it will take me as I start to learn its true potential!