Complementing National SDG Platforms

The Sustainable Development Goals (SDGs) are shaping global development: from how national policies and legislation are developed and how aid recipients report their outcomes, to how all countries will report on (and set baselines for) the state of their infrastructure, economy, and practically every element of human development. Having established the 17 Goals, 169 targets, and 232 indicators, the inter-governmental discourse is now turning to how to monitor attainment of the SDGs. One facet of this is technical: from the 22nd to the 24th of January, the United Nations Statistics Division convened the National Platforms for SDG Reporting forum.

The forum brought together countries, international organisations, and platform service providers to discuss differing approaches to monitoring the SDGs, and the technical implementations and innovations that could inform recommendations and guidelines for such platforms. The outputs and presentations are all online too.

Given the nature of the forum, the data powering the platforms is drawn from National Statistical Offices and other data custodian agencies, as a component of their existing data collection programmes. For the inter-governmental process, this is essential – countries produce their own authoritative data and this will always be used as a primary source. But what about secondary, complementary sources? This is compounded by methodological questions on how to analyse specific indicators.

For example, SDG Indicator 9.1.1 concerns the “Share of the rural population who live within 2 km of an all-season road”. Breaking this down:

  • How to differentiate between rural and urban populations – where do peri-urban areas lie?
  • How to calculate 2 km from a road – is this a straight-line distance, and what about geographic features such as rivers and valleys?
  • How to define an “all-season road”.

The question of urbanisation versus rural communities is a very common discussion, so I’ll leave it to one side. The last two points, however, do not have easy answers – and the lack of documented methodological guidance is severely hampering progress on the SDGs.

In many countries, road inventory is scant (see here): there isn’t full GIS coverage of the road features, let alone an understanding of their quality. So, in effect, can we use what’s already out there – free and open data – to work out the SDGs?

Right now*, “80% of the world’s user-generated road map is more than 80% complete”. Global population data – WorldPop and the High Resolution Settlement Layer – can offer insights into where people are globally.

So, in theory, sources like OpenStreetMap and WorldPop can already offer a level of insight that can assist with the methodological development of the goals. They could provide the data foundation for methodological development of the indicators, running in parallel with strengthening activities within countries – i.e. by the time the most aid-dependent countries are capable of producing authoritative data for the indicators, the technical platforms and methods will already exist to transform these data into actionable insights and drive data-driven policy development.

The SDGs should leave no-one behind, and while attainment of the SDGs should be country-owned and country-led, what actions can the global commons take in parallel to support this agenda? An open test bed for SDG methodological analysis, modelled on OSM Analytics, perhaps? Authoritative data, and the choice of whether to use complementary/user-generated/crowdsourced (etc.) data, will rest with the national statistics/mapping/data agencies, but there is an opportunity to develop the methods alongside this strengthening process – that way, when the agencies are strengthened, they’ll have methods ready to use for their indicators, targets, and goals.

*This uses OpenStreetMap; OSM has a variety of sources, from crowdsourced data to imports. But… something’s there…!


Moving On, So Long, Farewell and Hello!

The past three months have brought about a massive change, both personally and professionally. I’ve moved to New York, to take up a post within the United Nations’ Committee of Experts on Global Geospatial Information Management – or UN-GGIM for short. Acronyms aside, it’s been a hard but fun change, with much challenging work ahead!

In moving from operational work (World Bank in Tanzania and with the N/LAB at the University of Nottingham) to a secretariat role, I’m hoping to bring a little field experience to Headquarters, especially in terms of supporting the development of policy and direction for geospatial data.

With geospatial information at the heart of global policy, from the 2020 round of Census to the 2030 Sustainable Development Agenda, the next few years will be very exciting!

Looking back, I’m quite sad to leave my previous colleagues behind, especially given the opportunity to build and lead the Tanzania Urban and Geospatial team from its inception – or, in the words of my then boss Edward Anderson in 2011, to “Build a mapping empire!”…

The 2017 World Bank President’s Innovation Award winners (from left: Deogratias Minja, Edward Anderson, Justina Kajenge, myself, and Freddie Mbuya)

From the 2011 pilot mapping the Tandale neighbourhood (~65,000 people), Ramani Huria grew from an initial 10 neighbourhoods to scaling across Tanzania, supporting global projects like Missing Maps and national ones like Crowd2Map, while delivering its main mission of building flood resilience in Dar es Salaam. The Zanzibar Mapping Initiative has mapped Unguja Island with drones at an unprecedented 3.5 cm–7 cm resolution and laid a foundation for improved geospatial collaboration in Zanzibar. These innovative projects (and others) are now being mainstreamed into frontline operations through the Tanzania Urban Resilience Program – taking an innovation-led approach to capacity development, integrating innovation and research into projects rather than leaving them as a footnote. Academically, surviving a PhD and helping bring the N/LAB Centre for International Analytics into being at the University of Nottingham are fond memories – bar the last few weeks of the PhD!

Ultimately, the most rewarding part has been the people: Ward Officers in Tanzania; blue-sky scheming with Edward, Josh, and Samhir on the direction of the World Bank’s innovation work; being academically challenged by James, Gavin, and Andrew; learning with the commons alongside Willow, Dirk, Giuseppe, Heather, Mikel, Florian, Sara, Tyler, Ivan, and Nico; and being in the thick of one of the fastest-growing cities in Africa with Msilikale, Deo, Beata, Devotha, Daud, Rashid, Roza, Darragh, and many others, too many to count. Thank you to you all – it was an honour.


Though some things never worked out (i.e. jet-ski lidar for inshore bathymetry!), friendships were made, and there was a little bit of love there too! Looking back at those six years, three months on, it really was a wonderful journey – here’s to the next one. So, with that in mind, I am ex-field ops: I have shuffled off mapping, run down the scale bar, and joined the international civil service in a secretariat – in time, leaning towards more Bernard than Sir Humphrey!

Towards the Next Generation Road Survey

Over the past few weeks, I’ve managed to escape the office and get back to the field. With an impending change, it’s been a very refreshing time to get back into the mix – especially out onto the roads of Zanzibar.

Alongside scaling out Ramani Huria and working with (awesome!) colleagues on the signing of a Memorandum of Understanding between Ardhi University, the World Bank, and DfID – supporting curriculum development (with ITC Twente) and the sustainability of community mapping in Tanzania for the next five years – I’ve been working on a side project looking at how machine learning can be used to assess road quality.

To do this, the N/LAB team at the University of Nottingham, Spatial Info (the spin-out of my team that helped build Ramani Huria and the Tanzania Open Data Initiative), and I are working with the Zanzibari Department of Roads, under the Ministry of Infrastructure, Communications and Transport, to survey all roads on Unguja Island, Zanzibar.

The Department of Roads & Uni Nottingham Team

So far, we’ve worked on getting a surveying vehicle back on the road, handled the initial back and forth with government stakeholders, and begun pulling together the various road data sources (such as those from the Government and OpenStreetMap) to work out where to drive and the sequencing of the survey. All of this will support a data collection protocol that merges traditional surveying techniques with novel ones such as RoadLab Pro.

All of these data streams will then be used as a training dataset to see how machine learning can inform on road quality. But first, we’re getting the traditional survey underway. It’s going to be a long road ahead – as long as all the roads in Zanzibar!

Watch this space, the project’s Medium page, and the N/LAB’s blog on using machine learning for automated feature detection from imagery. Get in contact below in the comments as well.

Written in the Al-Minaar Hotel, Stone Town, Zanzibar (-6.16349,39.18690)

OSM: Going Back in Time

I’ve been playing around with the full planet file to look at going back in time in OSM. Mainly, this is to look at how Ramani Huria’s data has evolved over time, as part of extracting more value from it. The process wasn’t as straightforward as I had hoped, but I eventually got there – this isn’t to say that this is the only or best way; it’s the one that worked for me!

To do this, you’ll need a pretty hefty machine – I used a Lenovo x230 (Intel i5 quad-core 2.6 GHz, 16 GB of RAM) with over 500 GB of free space, to deal with the large size of the files you’ll be downloading. This is all running on Ubuntu 16.04.

Firstly, download the OSM Full History file. I used the uGet download manager to handle the 10-hour download of a 60 GB+ file over a 10 Mb UK broadband connection. Leaving it overnight, I had the full file downloaded and ready for use. Now to set up the machine environment.

The stack is a combination of OSMIUM and OSMconvert. On paper, the OSMIUM tool should be the only one needed. However, for reasons I’ll come to, it didn’t work, so I found a workaround.

OSMconvert is easily installed:

sudo apt-get install osmctools

This installs OSMconvert alongside other useful OSM manipulation tools. Installing OSMIUM is slightly more complicated and needs to be done by compiling from source.

Firstly, install libosmium – I found that not installing the header files meant compilation of OSMIUM proper would fail. Then use the OSMIUM docs to install OSMIUM. While Ubuntu includes a package for OSMIUM, it’s an older version which doesn’t allow splitting data by a timeframe. Now things should be set up and ready for pulling data out.

Dar es Salaam, the city of interest, has the bounding box (38.9813,-7.2,39.65,-6.45) – you’d replace these with the South West and North East point coordinates of your place of interest – and use OSMconvert, in the form:

$ osmconvert history_filename -b=bounding_box -o=output_filename

osmconvert history-170206.osm.pbf -b=38.9813,-7.2,39.65,-6.45 -o=clipped_dar_history-170206.pbf

This clips the full history file to that bounding box. It will take a bit of time. Now we can use OSMIUM to pull out the data from a date of our choice in the form:

$ osmium time-filter clipped_history_filename timestamp -o output_filename

osmium time-filter clipped_dar_history-170206.pbf 2011-09-06T00:00:00Z -o clipped_dar_history-170206-06092011.pbf 

This gives a nicely formatted .pbf file that can be used in QGIS (drag and drop), PostGIS, or anything else. As the contrast below illuminates!
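To build a series of historical snapshots rather than a single date, the same command can be looped. A sketch, assuming osmium (with the time-filter command) is on the PATH and the clipped Dar es Salaam history file from above exists; the output filenames are illustrative:

```shell
# Extract one snapshot per year from the clipped history file.
# Each run replays the history up to the given timestamp.
for year in 2012 2013 2014 2015 2016 2017; do
  osmium time-filter clipped_dar_history-170206.pbf "${year}-01-01T00:00:00Z" \
    -o "dar-snapshot-${year}.pbf"
done
```

Each resulting .pbf can then be dropped into QGIS as its own layer, making it easy to animate or compare how the map grew year on year.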

Tandale, Dar es Salaam, Tanzania – 1st August 2011
Tandale, Dar es Salaam, Tanzania – 13th February 2017

Enjoy travelling back in time!

All map data © OpenStreetMap contributors.

FOSS4G 2018: Dar es Salaam

This will be a different FOSS4G. As the first in a developing country, our vision is for an accessible and inclusive FOSS4G, a FOSS4G for All.

In 2013 I was fortunate to be on the Local Organizing Committee (LOC) of OSGeo’s Free and Open Source Software for Geospatial annual conference – aka FOSS4G. Each year, the location of FOSS4G rotates between Europe, North America, and the “Rest of the World”.

As Dar es Salaam, Tanzania is in the “Rest of the World”, when the call for proposals was released, a small group of co-conspirators and I figured we’d submit a letter of intent for Dar to be the host city. Ultimately, Dar es Salaam was selected by the OSGeo Board as the host city for 2018, with the conference co-chaired by myself and Msilikale Msilanga.

In aligning FOSS4G 2018 within the Sustainable Development Agenda, we aim to complement the existing OSGeo stable while empowering the Open Data, Participatory Mapping and Internet of Things work currently underway within Tanzania, the region and worldwide. This will expand the scope of the conference from traditional geography and location tech, to applications, use and best practices of our tools.

Location and geography are at the heart of the Sustainable Development Goals: FOSS4G 2018 in Dar es Salaam would invigorate our existing projects, bringing them to new users and developers while supporting and nurturing the existing OSGeo community.

The clock has started for the LOC to organise the conference, and the press release from OSGeo is out! It’s a few months until FOSS4G 2017 in Boston, where Msilikale, our LOC, and I will pick up the keys to the FOSS4G car. In the meantime, the Dar LOC will assemble and start laying the groundwork for 2018. It’s going to be a long and rewarding road ahead!

Building Heights in Dar es Salaam

When I first went to Dar es Salaam in 2011, there were a few skyscrapers adorning the city’s skyline; now they’re everywhere! Sitting on a rooftop bar in the centre of the city, it’s a mass of cranes and pristine new buildings.

Alongside this rapid growth, Ramani Huria has been collecting a lot of data, much of which doesn’t get rendered by the default OSM styles… so I’ve dug into the data and created a map of building floors across the city.

This interactive map allows you to explore where the tallest buildings in the city are, and, in displaying the data this way, also makes the densest, unplanned, and informal areas of the city very clear.

There is still some way to go, though – Dar es Salaam has around 750,000 buildings, of which roughly 220,000 (~30%) have been surveyed by the Ramani Huria team and given an appropriate attribute. Ramani Huria has focused its efforts on the urban centres of Dar es Salaam, where most of the multi-storey buildings are to be found. But there is still a lot more to be covered towards Bagamoyo and Morogoro.

Hat tip to Harry Wood, whose advice and guidance pointed me in the right direction. A more technical blog post, with details of other challenges around tagging correctness, will follow – but that’s for another post. Now to look at Floor Space Indices…!

GDAL Nodata and Drone Processing Pipeline

Due to a nodata tag issue within the GeoTiff standard, Pix4D doesn’t generate the nodata value in a GDAL-compliant manner. As such, when importing GeoTiffs into software using GDAL (like QGIS or Mapbox), the nodata value renders as black, causing issues where layers overlap. Additionally, for an unknown reason, Mapbox does not handle multiple GeoTiffs in the same manner as it displays .mbtiles, pixellating uploaded GeoTiffs. This short blog presents a processing pipeline to deal with both of these issues.

The solution for translating the nodata values has been documented here; I’ve written a simple bash script to do the conversion in a UNIX environment, here. Basically, run the script in the folder where the files are found and it will iterate through each file, translating the nodata tag value to be GDAL compliant.
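As a sketch of what such a script does (filenames and the nodata value of 0 are illustrative; assumes GDAL’s gdal_translate is installed):

```shell
# For every GeoTiff in the current folder, rewrite it with an explicit,
# GDAL-compliant nodata value (0 here; adjust to match your imagery).
# Outputs are prefixed with "nodata_" so the originals are untouched.
for f in *.tif; do
  gdal_translate -a_nodata 0 "$f" "nodata_${f}"
done
```

With the nodata value set, GDAL-based software will treat those pixels as transparent instead of black, so overlapping orthomosaics composite cleanly.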

Once you’ve got the nodata/transparency values resolved, it’s quite easy to add your GeoTiffs to Mapbox’s TileMill. Export your resulting map as .mbtiles and upload this to Mapbox as a tileset. In the Mapbox Studio Classic interface, add this to a map and voilà – drone imagery online.