Calculating number of overlaps between polygons

A common application within GIS is to create buffer / catchment areas. An interesting follow-up analysis is then to count how many overlaps occur within each catchment, indicating whether a site is under- or over-provisioned. This can be achieved using PostGIS and the following fantastic tutorial.

This allows you to produce maps like the following:


Below gives a snapshot of the process:

create table boundaries_allot as
select ST_Union(ST_ExteriorRing(geom)) as st_union
from (select (ST_DumpRings((ST_Dump(geom)).geom)).geom as geom from allotment_buffer) q;

This has been altered from the tutorial to handle multipolygons: ST_Dump breaks multipolygons down into single polygons, and ST_DumpRings extracts the interior and exterior rings of complex polygons.

CREATE SEQUENCE polyseq_allots;
CREATE TABLE polys_allots AS
SELECT nextval('polyseq_allots') AS id, (ST_Dump(ST_Polygonize(st_union))).geom AS geom
FROM boundaries_allot;

This polygonizes the merged linework into individual polygons, and uses ST_Dump to split any resulting multipolygons into single polygons.

ALTER TABLE polys_allots ADD COLUMN count integer DEFAULT 0;

UPDATE polys_allots SET count = p.count
FROM (
  SELECT count(*) AS count, p.id AS id
  FROM polys_allots p
  JOIN allotment_buffer c
  ON ST_Contains(c.geom, ST_PointOnSurface(p.geom))
  GROUP BY p.id
) AS p
WHERE polys_allots.id = p.id;

Once we have the individual polygons, we take a point guaranteed to lie on each new small polygon (ST_PointOnSurface) and count how many of the original circles contain that point.
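The counting step can be sketched outside the database too. Below is a minimal pure-Python illustration (the shapes and function names are mine, with axis-aligned rectangles standing in for the buffer circles): for each small polygon a representative point is tested against every original buffer.

```python
# Sketch of the counting step: for each small polygon produced by the
# polygonize step, take a representative point and count how many of the
# original buffer shapes contain it. Rectangles stand in for circles here.

def contains(rect, point):
    """True if (x, y) falls inside the rectangle (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    x, y = point
    return xmin <= x <= xmax and ymin <= y <= ymax

def overlap_count(buffers, point):
    """Number of original buffer shapes containing the representative point."""
    return sum(contains(b, point) for b in buffers)

buffers = [(0, 0, 4, 4), (2, 0, 6, 4)]   # two overlapping catchments
print(overlap_count(buffers, (3, 2)))    # point in the overlap -> 2
print(overlap_count(buffers, (1, 2)))    # point in only the first -> 1
```

The same idea scales to real geometries by swapping the rectangle test for a proper point-in-polygon test, which is exactly what ST_Contains does server-side.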

Creating ArcMap tools using python scripts: ASCII to Raster batch processing

Recently, I had to process 400 tiles of LiDAR data and find a method that would allow GIS beginners to achieve the same in the future. For this I decided to create a custom tool in ArcMap which uses a Python script to batch-convert the ASCII files into rasters and then merge these into a final DEM. Users can then start processing this layer for their individual needs.


# Import system modules
import arcpy, os, sys

# Set local variables -- these are linked to the values the user enters
# when running the tool, in the same order as the tool parameters
inASCII = str(sys.argv[1])           # folder containing the .asc tiles
outRaster = str(sys.argv[2])         # folder for the converted .tif tiles
mergeGeodatabase = str(sys.argv[3])  # location for the merged DEM
outputFilename = str(sys.argv[4])    # name of the merged output raster
rasterType = "FLOAT"
tiffList = []
britishNationalGridprj = "PROJCS['OSGB 1936 / British National Grid',GEOGCS['OSGB 1936',DATUM['D_OSGB_1936',SPHEROID['Airy_1830',6377563.396,299.3249646]],PRIMEM['Greenwich',0],UNIT['Degree',0.017453292519943295]],PROJECTION['Transverse_Mercator'],PARAMETER['latitude_of_origin',49],PARAMETER['central_meridian',-2],PARAMETER['scale_factor',0.9996012717],PARAMETER['false_easting',400000],PARAMETER['false_northing',-100000],UNIT['Meter',1]]"

# Set path names and convert Windows backslashes to forward slashes
inASCII_convert = inASCII.replace('\\', '/')
outRaster_convert = outRaster.replace('\\', '/')
mergeGeodatabase_convert = mergeGeodatabase.replace('\\', '/')

# Convert each individual ASCII file to a raster
for files in os.listdir(inASCII_convert):
	if files.endswith(".asc"):
		arcpy.ASCIIToRaster_conversion(inASCII_convert + "/" + files, outRaster_convert + "/" + files.replace(".asc", ".tif"), rasterType)

# Build the semicolon-separated input string for the raster mosaic
for outputs in os.listdir(outRaster_convert):
	if outputs.endswith(".tif"):
		tiffList.append(outRaster_convert + "/" + outputs)
outputString = ';'.join(tiffList)

# Merge all the tiffs together
arcpy.MosaicToNewRaster_management(outputString, mergeGeodatabase_convert, outputFilename, britishNationalGridprj, "16_BIT_UNSIGNED", "", "1", "", "")

Creating a tool in ArcMap:

Using ArcCatalog, find the location where you want the toolbox and right-click to select ‘New Toolbox’. From here go to Add -> Script and follow the prompts. When you reach the properties page, we need to define the four pieces of information the user enters, corresponding to the four inputs specified in the script using sys.argv. Note these need to be in the same order. Choose a name for each input, select whether it is an input or output in the environment property, and then select the type.


The first two relate to the folders where the files are stored, the third to the folder where the output is stored, and the fourth is the output file name.

Now you can run the tool just like any other in the toolbox.


Processing OS Codepoint with polygons

Following on from an earlier post on updating OS Codepoint data, this post will now look at how to process OS Codepoint with polygons. This dataset provides a polygon for each postcode, including vertical streets (represented by squares), which occur where a building has more than one postcode.

1) From the data provided by OS, all the shapefiles (one for each postcode sector) need to be merged into one. This can be achieved using the QGIS ‘Merge to one’ tool under the Vector menu, or by writing a simple Python script in ArcMap utilising the Merge tool.

2) We are going to create two extra fields in the attribute table, ‘PCOutcode’ and ‘PCSector’, which will provide us with additional symbology options within GIS. For ‘PCOutcode’, use =RTrim(Left(Postcode,4)) to capture the first part of each postcode; for ‘PCSector’, use =RTrim(Left(Postcode,5)), where Postcode is the postcode field and RTrim removes any trailing spaces.
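The Left/RTrim logic can be checked outside the GIS. Here is a small Python sketch of the same calculation (the function name and example postcodes are mine):

```python
def left_rtrim(value, n):
    """Equivalent of RTrim(Left(value, n)): take the first n characters,
    then strip any trailing spaces."""
    return value[:n].rstrip()

# Codepoint stores postcodes in a fixed-width style, so short outcodes
# carry padding spaces that RTrim removes.
print(left_rtrim("N1  7AA", 4))   # outcode -> "N1"
print(left_rtrim("SW1A1AA", 4))   # outcode -> "SW1A"
print(left_rtrim("SW1A1AA", 5))   # sector  -> "SW1A1"
```

Running the field calculator with the expressions above produces the same strings for each row of the attribute table.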

Continue reading

Processing OS Codepoint Data

Working in a professional GIS environment, you will undoubtedly be dealing with Ordnance Survey data – whether open or private. One of the most useful of these is OS Codepoint, which provides a precise geographical location for 1.6 million postcodes within Great Britain.

This is the basis for geocoding, one of the most commonly completed tasks in GIS. This post describes the method for creating an address locator ready for geocoding.

1. Firstly, load the data from the CD and delete the bt.csv file, which represents Northern Ireland (or use it to create a separate address locator).

2. Then merge the CSVs together using the command line (make sure you are in the directory where the files are stored).

copy *.csv insert_new_file.csv
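If you prefer a cross-platform route, the same concatenation can be sketched in Python (function and file names are mine; this assumes, as with the `copy` command, that the CSVs carry no header row, so they can be joined directly):

```python
import glob

def merge_csvs(pattern, out_path):
    """Concatenate every file matching the glob pattern into one file --
    a cross-platform equivalent of `copy *.csv insert_new_file.csv`.
    Files are joined in sorted order for a predictable result."""
    with open(out_path, "w") as out:
        for path in sorted(glob.glob(pattern)):
            with open(path) as f:
                out.write(f.read())

# usage: merge_csvs("*.csv", "merged_codepoint.csv")
```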

Continue reading

Forest fire risk map for Greece

During my postgraduate degree I had to design a risk map for the spread of forest fires throughout Greece. This was created using different topographical layers and various indices which affect the spread of forest fires, assigning a risk score to each layer based on its values. These scores were then summed to create an overall risk map. Each layer is presented below (CTI = Compound topographic index; HLI = Heat load index).
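The reclassify-and-sum overlay described above can be sketched in a few lines of Python. The layers, class values and scores below are purely illustrative, not the ones used for the Greece map:

```python
# Sketch of a risk overlay: each input layer is reclassified to a risk
# score via a lookup table, then the scores are summed cell by cell.

def reclassify(grid, lookup):
    """Map each cell value to its risk score."""
    return [[lookup[v] for v in row] for row in grid]

def sum_layers(*grids):
    """Cell-by-cell sum of equally sized grids."""
    return [[sum(vals) for vals in zip(*rows)] for rows in zip(*grids)]

slope = [[1, 2], [3, 1]]   # hypothetical slope classes
veg = [[2, 2], [1, 3]]     # hypothetical vegetation classes

slope_risk = reclassify(slope, {1: 1, 2: 2, 3: 3})
veg_risk = reclassify(veg, {1: 1, 2: 2, 3: 3})
risk = sum_layers(slope_risk, veg_risk)
print(risk)  # [[3, 4], [4, 4]]
```

In a real workflow the grids would be rasters and the lookup tables would come from the risk table built for each index, but the arithmetic is the same.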

Continue reading

QGIS plugin: Mask

QGIS has many fantastic plugins, but the one I use most in my day-to-day work when creating maps is the Mask tool. This allows me to either completely, or partially, mask the base map outside my area of interest, so the reader can concentrate fully on the features mapped within it.

It’s so simple: just select the polygons outside of which you want the mask, then enter the symbology parameters. I personally make the mask completely white to remove the base map, but you could instead adjust the transparency levels to really bring the base map alive within the polygon.

Completely white base map outside of London boroughs

Completely white base map outside of London boroughs using a shapeburst effect

Continue reading

Open spaces shapefile for London

An earlier post looked at how to extract shapefile data from Open Street Map.

This is very useful for collecting data that is not easily available in the public realm, although caution must be taken when deciding whether the data is complete. One case where I needed a shapefile for a project, and was fairly certain the data would be complete, is that of open spaces in London.

This was achieved by searching the features database and locating any key:value pairs of interest. These were the following:

  1. Landuse: cemetery, clearing, conservation, forest, garden, grass, meadow, nature reserve, park, recreation ground, recreational, sport, village green and wood.
  2. Leisure: sports centre, golf course, common, pitch and playground.
  3. Natural: grass, grassland and woodland.

This was probably a few too many features, so some were removed after a comparison with OSM to see what I had and had not captured.

Also, from these selections I decided to create two layers: open spaces (everything that does not require payment/membership to enter) and green spaces (everything). Furthermore, I separated these into open access and private access by searching for the tags access=yes and access=private in the ‘other’ field of the shapefile.
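The access-tag test can be sketched as a simple string check on each feature's ‘other’ attribute. This is a minimal sketch only: it assumes the field holds semicolon-separated key=value pairs, and the function name is mine:

```python
def access_class(other_field):
    """Classify a feature by its access tag, read from the free-text
    'other' attribute of the exported shapefile (assumed here to hold
    semicolon-separated "key=value" pairs)."""
    tags = dict(
        pair.split("=", 1)
        for pair in other_field.split(";")
        if "=" in pair
    )
    access = tags.get("access")
    if access == "yes":
        return "open access"
    if access == "private":
        return "private access"
    return "unknown"

print(access_class("access=yes;name=Hyde Park"))  # open access
print(access_class("access=private"))             # private access
```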

Continue reading

Extracting layers from open street map using osmosis and overpass API

Thanks to the good folk developing Open Street Map (OSM), we have a fantastic mapping resource that is completely free. Coverage is quickly expanding, as well as detail, and it's these advancements that provide an opportunity to create GIS layers that we would otherwise struggle to find in the public domain.

My first need of this was when I required a shapefile of all the open/green spaces in London, and after many hours trying to locate this resource without paying a high fee or waiting several weeks, I decided to look into OSM.

OSM contributors provide a tag for every object they map, and this provides the basis for extracting objects of interest.

Firstly, we need some data! Probably the easiest place to collect this is from Geofabrik. Here you can download maps at various geographical scales in both .shp and .osm format. If you just want to extract general attributes (e.g. landuse, leisure, natural), downloading the shapefile provides these straightaway. If you want to go deeper, you will need the .osm file. Once the data was collected, I used two different methods to extract it.


Osmosis is a command-line tool that allows extraction based on the key:value tags provided by contributors, e.g. landuse would be the key, with agriculture being a specific value under that key. Osmosis can be easily installed on Windows with instructions here.

There were two main commands I used for extraction. Firstly, to extract nodes (i.e. any object represented by a point, like a bus stop):

osmosis --rbf layer_name.osm.pbf --node-key-value keyValueList="key.value" --wx outputfile.osm

Secondly, the following was used for extracting ways (i.e. polygons):

osmosis --read-xml layer_name.osm --way-key-value keyValueList="key.value" --used-node --wx outputfile.osm

Note: Make sure you are in the directory where your map is stored.

Note 2: When extracting ways (i.e. polygons) you need to use a .osm file as the input. The tool osmconvert can easily convert from pbf to osm.
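For small .osm files, the same key:value filtering idea can be sketched in pure Python using the standard library XML parser. This is a toy illustration, not a replacement for osmosis (the function name and the handmade XML fragment are mine):

```python
import xml.etree.ElementTree as ET

def ways_with_tag(osm_xml, key, value):
    """Return the ids of <way> elements carrying the given key=value tag --
    a toy version of the osmosis --way-key-value filter."""
    root = ET.fromstring(osm_xml)
    ids = []
    for way in root.iter("way"):
        for tag in way.iter("tag"):
            if tag.get("k") == key and tag.get("v") == value:
                ids.append(way.get("id"))
                break
    return ids

sample = """<osm>
  <way id="1"><tag k="landuse" v="forest"/></way>
  <way id="2"><tag k="leisure" v="pitch"/></way>
</osm>"""
print(ways_with_tag(sample, "landuse", "forest"))  # ['1']
```

Real extracts are far too large to parse this way in memory, which is why osmosis (or the Overpass API) is the practical choice.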
Continue reading

Time-series analysis of albedo on the Agassiz Ice Cap, Canada

The final glacier analysed was the Agassiz Ice Cap (AIC) shown in Fig. 1.


Figure 1: Study site map for Agassiz Ice Cap. Coordinates refer to UTM (Universal Transverse Mercator)
zone 18N. Contour spacing is marked at 500 m and AWS refers to the automatic weather station from which
data was collected. The inset shows the location of AIC within the Canadian Archipelago, with coordinates referring to the NSIDC sea ice polar stereographic north system.

The results were broken down into three sections. Firstly, daily albedo measurements provide an overview of the behaviour of albedo throughout the melt season and allow short term fluctuations from the norm to be quantified (Fig. 2).


Figure 2: Seasonal evolution of daily albedo on AIC.

Secondly, yearly averages were calculated, including maxima and minima (Fig. 3).


Figure 3: Yearly evolution of albedo on AIC.

Continue reading

Time series analysis of albedo on Humboldt Glacier, Greenland

Following on from the previous post, the second glacier analysed was the Humboldt Glacier (HG) shown in Fig. 1; the largest in Greenland. This was used to provide method validation against Box et al. (2012), which encompassed the whole of the Greenland Ice Sheet (GrIS), as well as a more detailed and localised analysis of individual glacier albedo behaviour. A small yearly discrepancy of 0.0006 was seen, which provides confidence in the methods to replicate both short- and long-term albedo variability.

Figure 1: Study site map for Humboldt Glacier. Coordinates refer to UTM (Universal Transverse Mercator)
zone 20N. Contour spacing is marked at 200 m and AWS refers to the automatic weather station from which
data was collected. The inset shows the location of Humboldt Glacier within Greenland and the
Canadian Archipelago, with coordinates referring to the NSIDC sea ice polar stereographic north system.

The results were broken down into three sections. Firstly, daily albedo measurements provide an overview of the behaviour of albedo throughout the melt season and allow short term fluctuations from the norm to be quantified (Fig. 2).

Figure 2: Seasonal evolution of daily albedo on HG.

Secondly, yearly averages were calculated, including maxima and minima (Fig. 3).

Figure 3: Yearly evolution of albedo on HG.

Finally, regional maps were created highlighting the spatial relationship of albedo fluctuations throughout the time series (Fig. 4).

Figure 4: Regional evolution of albedo on HG.

Key points from these figures are as follows:

  • Average albedo across the glacier (αyearly) is decreasing at a rate of 0.0042 ± 0.0011 year⁻¹, with minimum and maximum trends following suit.
  • Analysing average monthly albedo shows that June and July were largely responsible for yearly declines, with July albedo decreasing at a rate twice that of αyearly (0.0086 ± 0.0023 year⁻¹). Almost no trend existed for August and September.
  • Regionally, 87% of the glacier has seen a decline in albedo.
  • Two seasons saw significant deviations (σ) from daily average albedo (αdaily), with mid-June to early-August 2008 measuring 1 σ below, and early-June to mid-August 2012 measuring 2 σ below at peak melt.
  • Daily melt season anomalies, defined as > 1 σ below αdaily, were five times more common in the period 2008–2013 than 2001–2007.
  • Albedo standard deviations are at their largest during times of higher melt.
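Yearly trend figures of this kind come from a linear least-squares fit of albedo against year. A minimal slope calculation in pure Python, using made-up values rather than the observed HG series:

```python
def ols_slope(years, values):
    """Ordinary least-squares slope of values against years (trend per year)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

# Illustrative albedo series only -- not the observed data.
years = [2001, 2002, 2003, 2004]
albedo = [0.80, 0.79, 0.79, 0.78]
print(round(ols_slope(years, albedo), 4))  # -0.006
```

A negative slope corresponds to a declining albedo trend; the uncertainty quoted alongside each rate would come from the standard error of this fitted slope.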

References: Box, J. E., Fettweis, X., Stroeve, J., Tedesco, M., Hall, D., Steffen, K., 2012. Greenland ice sheet albedo feedback: thermodynamics and atmospheric drivers. The Cryosphere 6 (4), 821–839.