OSM: Going Back in Time

I’ve been playing around with the full planet file to look at going back in time in OSM. Mainly, this is to look at how Ramani Huria’s data has evolved over time, as part of extracting more value from it. The process wasn’t as straightforward as I had hoped, but I eventually got there – and this isn’t to say that this is the only or best way. It’s the one that worked for me!

To do this, you’ll need a pretty hefty machine – I used a Lenovo x230 with an Intel i5 quad-core at 2.6 GHz, 16 GB of RAM and over 500 GB of free disk space – to cope with the large files you’ll be downloading. This is all running on Ubuntu 16.04.

Firstly, download the OSM full history file. I used the uGet download manager to deal with the roughly 10-hour download of a 60 GB+ file over a 10 Mbit UK broadband connection. Leaving it overnight, I had the full file downloaded and ready for use. Now to set up the machine environment.
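If you’d rather fetch it from the command line, a resumable download works just as well – a minimal sketch, assuming the dump lives under the full-history area of planet.openstreetmap.org (the exact path and filename change over time, so check the site first):

# resumable download; path/filename are indicative – confirm on planet.openstreetmap.org
wget -c https://planet.openstreetmap.org/pbf/full-history/history-latest.osm.pbf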

The stack is a combination of OSMIUM and OSMconvert. On paper, OSMIUM should be the only tool needed; however, for reasons that I’ll come to, it didn’t work on its own, so I found a workaround.

OSMconvert is easily installed:

sudo apt-get install osmctools

This installs OSMconvert and other useful OSM manipulation tools. Installing OSMIUM is slightly more complicated and needs to be done by compiling from source.

Firstly, install LibOSMIUM – I found that not installing its header files meant that compilation of OSMIUM proper would fail. Then use the OSMIUM docs to install OSMIUM itself. While there is a package for OSMIUM in Ubuntu, it’s an older version which doesn’t allow splitting the data by a timeframe. Now things should be set up and ready for pulling data out.
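For reference, my build went roughly like this – a sketch, assuming a recent CMake/GCC toolchain and the libosmium and osmium-tool repositories on GitHub; check the OSMIUM docs for the exact dependency list for your release:

# indicative dependency list – check the osmium-tool docs for your version
sudo apt-get install build-essential cmake libboost-dev libboost-program-options-dev libexpat1-dev zlib1g-dev libbz2-dev

# libosmium headers first, then osmium-tool built against the sibling checkout
git clone https://github.com/osmcode/libosmium.git
git clone https://github.com/osmcode/osmium-tool.git
cd osmium-tool && mkdir build && cd build
cmake .. && make
sudo make install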

Dar es Salaam, the city of interest, has the bounding box (38.9813,-7.2,39.65,-6.45) – you’d replace these with the south-west and north-east corner coordinates of your place of interest – and use OSMconvert in the form:

$ osmconvert history_filename -b=bounding_box -o=output_filename

osmconvert history-170206.osm.pbf -b=38.9813,-7.2,39.65,-6.45 -o=clipped_dar_history-170206.pbf

This clips the full history file to that bounding box. It will take a bit of time. Now we can use OSMIUM to pull out the data from a date of our choice in the form:

$ osmium time-filter clipped_history_filename timestamp -o output_filename

osmium time-filter clipped_dar_history-170206.pbf 2011-09-06T00:00:00Z -o clipped_dar_history-170206-06092011.pbf 

This gives a nicely formatted .pbf file that can be used in QGIS (drag and drop), PostGIS or anything else – as the contrast below illustrates!

Tandale, Dar es Salaam, Tanzania – 1st August 2011
Tandale, Dar es Salaam, Tanzania – 13th February 2017
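If you want to take a snapshot beyond QGIS, one route is to load it into PostGIS via osm2pgsql – a minimal sketch, assuming osm2pgsql is installed and a PostGIS-enabled database (here called dar_2011, a name picked purely for illustration) already exists:

# import the 2011 snapshot into the dar_2011 PostGIS database
osm2pgsql --create -d dar_2011 clipped_dar_history-170206-06092011.pbf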

Enjoy travelling back in time!

All map data © OpenStreetMap contributors.

Starting Ramani Huria – Mapping The Flood Prone Areas In Dar es Salaam

Four years ago, in August 2011, I was fortunate to manage the community mapping of Tandale. It was an experience that irrevocably changed my professional direction and interests. Over a month I trained and worked alongside brilliant students and community members, all focused on producing an open map of Tandale, something that had never been accomplished previously. When it was done, the reception across civil society and government was positive, and intentions of scaling the pilot to the whole city were mooted, but for one reason or another it never quite made it. Then in December, floods hit the city. In dense informal urban environments such as Tandale these floods are fatal and dramatically change the landscape, as well as causing mass damage to survivors’ livelihoods and assets. Mitigating these floods is hard – where do you start in the fastest growing city in Africa? The population, as of the 2012 census, stands at 5 million, with projections showing it could grow to 10 million by 2030.

This rapid and unplanned urbanisation is in part the cause of the flooding: the infrastructure needed to cope with high rainfall, such as drains and culverts, was not built alongside residential dwellings. This is especially acute in the unplanned, informal urban settlements where the majority of Dar es Salaam’s residents reside. The theory here is quite simple: if you can identify where it floods, you can install or upgrade infrastructure to ameliorate the situation for residents. Unpacking this, the crux of the issue falls to two main points: governance and data.

Ramani Huria – Swahili for “Open Mapping” – is an operationalisation of this theory of change. In March 2015, a coalition from across Tanzanian society – the City Council of Dar es Salaam, the Tanzanian Commission for Science and Technology (COSTECH – under the Ministry of Science, Communication and Technology), the University of Dar es Salaam, Ardhi University and Buni Innovation Hub, supported by the Red Cross and the World Bank – supported the inception of Ramani Huria, with the goal of mapping flood-prone areas in Dar es Salaam, making this data openly available and supporting the use of this data in government, where decisions can be made to mitigate flooding.

Mapping Phases

It is a far cry from 2011, when just mapping the ward of Tandale was a large task. Ramani Huria consists of a pilot phase and four subsequent phases. To pilot, the wards of Ndugumbi, Tandale and Mchikichini, with a combined population of over 100,000 residents, were mapped in series. This process paired 15 students with community members, leading to maps of all features within each community. This information, focusing on drainage and waterways, is critically needed to help understand and locate flood-prone areas; this is a high priority in Dar es Salaam due to the damage that annual floods wreak upon the city and its residents. In this piloting phase, conducted from March to the end of June, these three wards were mapped, in part to generate the data that will feed flood inundation models and exposure layers, but also to pilot the data model and gel the team, prior to Phase One.

Scale Up Workshop – https://www.facebook.com/ramanihuria

Phase One on paper is quite simple: take 150 students from the University of Dar es Salaam’s Department of Geography and Ardhi University’s School of Urban and Regional Planning on industrial training, hold an inception workshop, deploy this contingent across six wards and work with community members to replicate the pilots, but running in parallel. At the time of writing, mapping is ongoing in six communities: Msasani, Keko, Makumbusho, Mabibo, Makurumla and Mburahati. According to the 2012 NBS census, these wards have a combined population of over 280,000 residents. Phase One was kicked off on the 6th of July and will run until the 14th of August.

Field Survey – https://www.facebook.com/ramanihuria

Phases Two and Three will integrate community volunteers from the Red Cross, who are committed to creating community-level resilience plans. These plans will use the data produced by the mapping to create resident evacuation routes and to aid Ward Executive Officers with planning decisions, among many other uses. Additionally, with embedded long-term volunteers monitoring change in their wards, this will hopefully result in detailed, up-to-date maps of rapidly changing urban areas.

InaSAFE Training – https://www.facebook.com/ramanihuria

Phase Four unfortunately sees the students depart from the project, due to their graduation. With a remaining contingent of around 30 mappers, mapping will continue until February 2016. These phases cover the data component; alongside them run dedicated training events aimed at building the capacity to use and deploy this data in real-world situations. On the 20th of July the first such workshop took place, with representatives from the Prime Minister’s Office for Disaster Management Department being trained in spatial analysis in QGIS and risk modelling using the QGIS plugin InaSAFE. A series of these workshops will take place, placing the data into the hands of those responsible for the city.

While this is ongoing in Dar es Salaam, you can get involved wherever you are in the world, through the Missing Maps project. Missing Maps is a collaboration between the Red Cross, Doctors Without Borders and the Humanitarian OpenStreetMap Team, aimed at digitising “the most vulnerable places in the developing world”, primarily by crowdsourcing the digitisation of aerial imagery. At the moment, there are three tasks for Dar es Salaam:

By helping digitise the buildings and roads from the recent drone and aerial imagery, the process of mapping is made faster, allowing the community mappers to focus on the detail of flood data. Additionally, the data from Ramani Huria is all placed into OpenStreetMap, its code is on GitHub and content is available from Flickr and Facebook, all under an open licence. Please get involved!

 

Written on a plane somewhere between Tanzania and the United Kingdom

GISRUK 2013

On the 3rd to the 5th of April I attended GISRUK (Geospatial Information Research in the United Kingdom) to give a paper on Community Mapping as a Socio-Technical Work Domain. In keeping with Christoph Kinkeldey‘s love of 1990s pop stars, Vanilla Ice made a second slide appearance, leveraging the fact that it’s a very technical academic title. In short, I’m using Cognitive Work Analysis (CWA) to create a structural framework to assess quality (currently defined by ISO 19113: Geographic Information – Quality Principles – well worth a read…) where there is no comparative dataset.

CWA is used to assess the design space in which a system exists, not the system itself. By taking a holistic view and not enforcing constraints on the system, you can understand what components and physical objects you would need to achieve the values of the system, and vice versa. In future iterations I’m going to get past first base and look at decision trees and strategic trees to work out how to establish the quality of volunteered geographic data without a comparative dataset – building quality analysis in from day one, as opposed to it being an afterthought.

Written and submitted from Home (52.962339,-1.173566)

 

H4D2 April 12th – 14th

The HXL-Team

Last year I attended H4D2 (Humanitarian for Disaster 2.0), organised by (and at) Aston University and Geeks Without Bounds. One of the outputs that I worked on was the HXL Extractor: basically, take data out of GeoSPARQL, a geospatial semantic database, and fire it into a GIS program. One of the team members had already been experimenting with semantic databases and triplestores (this was most definitely a good thing, allowing us to move quickly), so our ‘mission’ was to create a middle layer to connect to a triplestore, then use the WFS-T standard to fire the extracted data into a GIS program of your choice. Interestingly, the ‘project lead’ was communicating with us from Geneva via Skype; this, and the prior work, underlines the need for clear and concise problem statements prior to the hack. Because some of the team had been able to think about what they had to do, we were able to work more effectively, even while learning technologies on the fly.

Going to the International Conference of Crisis Mappers (ICCM) hackathon in Washington a few months later, HXL was still going strong and I got to meet the instigator of the project, CJ Hendrix, face to face. He’d amassed a team which went on to rightly take first prize at ICCM, and it’s now being used by UNOCHA, with papers forthcoming. The project is growing, as evidenced by the amount of work going on in the team repository. Understandably, our small team in Birmingham just did a little bit, but every little bit helps.

Now H4D2 is coming around again on April 12th – 14th. It will be followed by SMERST (Social Media and Semantic Technologies in Emergency Response), a more academically focused conference, on April 15th – 16th. Most importantly, you don’t need to code to contribute; all are welcome, from designers, videographers, bloggers and journalists to you! Registration for H4D2 is open and it is again at Aston University in Birmingham. Register here: http://h4d2.eu/registration. It’s going to rock.

Written and submitted from the Serena, Dar es Salaam (6.810617, 39.288284)

WherecampEU Rome 2013 Musings

WhereCampEU this year, rather earlier than normal, was in the Eternal City of Rome, Italy. After the threat of Snowmageddon in the UK, a jaunt to Italy was a welcome respite. An action-packed unconference timetable started with a presentation on Taarifa by myself. This was a follow-on presentation from W3G, but focusing on the characteristics of developing technology: needing to know the users and how they’ll use the ‘solution’. Developing solutions to first-world problems and then applying them in the developing world isn’t useful and is dangerous; however, it is the method du jour in some organisations.

A presentation on how the World Food Programme uses OpenDataKit for collecting information in South Sudan followed. It would have been interesting to have heard more about the rationale and why they were using what they were using. The use case was to take a picture and see what is about – the intelligence that they sought to gather. However, the presenter didn’t stay around, so if anyone in the geo-sphere knows, please get in touch!

CartoDB was given a live demonstration. We’re quickly moving past the desktop for GIS and spatial analysis and into the cloud. I’d like to know how these cloud-based GIS services compare with ESRI’s and MapBox’s offerings. It’s a brave new world!

Michael Gould‘s 37 things you didn’t know about ESRI was a passionate talk about ESRI from its inception to the present day. A leviathan in the GIS space, its culture is seemingly anything but corporate America. In the examples mentioned, a social conscience dominates decisions; from the positioning of boulders on the ESRI campus to the acquisition of new companies.

A Taarifa breakout design session occurred with a special guest appearance from a snow-bound London. But more on this in a later blog post.

The day ended with an OSM Q&A by myself and Shaun McDonald, which turned into a wide-ranging discussion about the OSM project and the challenges within it. Getting new contributors to keep contributing was one point of discussion, as was the need for improved internationalisation and language support.

An evening of pizza, dolce and grappa followed. The night ended in a spectacular deli/bistro/bar known only to locals and lost WhereCampers. Bottles of Chianti and Prosecco were enjoyed and toasts made.

Standing out the following day was Laurence Penny‘s updated 1-D Maps. It’s never the same twice, constantly reinventing itself from the acquisitions and collection held by Laurence. It went from Doom and the Mille Miglia to Roman-era road routing, with a detour around the metros and undergrounds. It was two hours long. Words fail to describe the brilliance that emanates from the presentation. I really look forward to seeing it in an updated form.

A certain Henk Hoff of the OSM Foundation brought proceedings to a close with a wide-ranging discussion on the foundation and how it functions and operates. The day and the conference ended over pizza, Chianti and sambuca. Just the way things should end!

Written and submitted on the Rome to Milan Eurostar (having just gone through Bologna!)

A Manifesto for the OSM Academic Working Group

A fellow member of the OSM Foundation replied to a conversation on the mailing list: “As a guerrilla academic…“. The context was a suggestion for increased academic cooperation within OSM. To this end I proposed a new working group for the OSMF: an Academic Working Group. This would have the aim of improving the academic quality and communication of those using OSM in their research, and facilitating collaboration.

Below is the start of the manifesto. It’s not complete, but it’s a start.

Background

Academic institutions use OSM data, be it as part of their published research or for testing hypotheses. Some of the publications are listed on the wiki: http://wiki.openstreetmap.org/wiki/Research. However, within OSM and the OSMF this research is undertaken on the researchers’ own initiative. Researchers are looking at OSM through recommendation (supervision) or self-interest within their own academic structures. Given the growth of OSM and the research into it, it seems likely that academic interest will widen and grow.

Goals

Support academic research in OSM, encouraging best practices and acting as a forum for researchers. The aim is to support researchers starting out with OSM, but also to unify a community of existing researchers; collaborations and knowledge sharing will hopefully follow. Another goal is the identification of areas of research for the community as a whole, with usability and business models as potential starting themes.

Tasks

  • Uniting existing researchers, either at existing institutions or those following independent academic study.
  • Providing documentation (à la LearnOSM) but focused on researchers.
  • Providing a forum for researchers to discuss their research and bridge into the community.
  • Supporting and providing problems to the academic corpus.
  • Communicating potential collaborations, needs and wants.
  • More TBD.

Working Group vs. Community

I think this hits a gap that currently exists in the community, and I don’t see potential areas for conflict. That being said, do we have enough members within the OSM(F?) to create and steer the working group?

AWG vs. other WGs

There is a small amount of overlap in interest between this proposed AWG and other Working Groups. I can see potential overlap with the Communications and Strategic Working Groups: Communications, as the AWG would aim to build up the OSM academic community; Strategic, as they may wish to commission, or at least support, studies into critical areas of OSM.

Next steps 

Again, I’ll throw this to the OSMF. Where should we go from here?

Written and submitted from the London St. Pancras to Nottingham Train.

Dictator/Benevolent: Janus, Dichotomy?

Taarifa is one of the best things I’ve been involved in. In various forms it’s had shout-outs from the New York Times to Random Hacks of Kindness. One of my tasks is to help deploy it in Uganda soon. I recently sent this email to the Taarifa development mailing list. I feel that the role of a founder in a project always needs to be considered. I’m wondering what other people think?

Taarifans!

Taarifa is a platform that is FixMyStreet for slums. I’m unsure whether this post is a massive shout-out to my ego or what. However, I want to start a discussion on the role of the founder in community projects, be they open source or not. I do so mindful of a line from ‘Batman: The Dark Knight’ ringing loud and clear: “You either die a hero or you live long enough to see yourself become the villain.”

At this juncture I should point out my love for another open source project, OpenStreetMap. I owe a lot to many members of the OSM community, either directly or indirectly. The OSM project and its community have shaped who I am over the years, from an exchange student dragging his local girlfriend out mapping because she had local knowledge, to going down some deep rabbit warrens elsewhere in the world. I love OSM, I love what it stands for and I love its community, to the point where at one stage my health was severely compromised. However, things within haven’t been plain sailing, with a simple comment of “We are the Board! Shape the project!” – effectively a call to arms for the project’s betterment – taken as a power grab by the board (n.b. I’m not singling people out for this one; the thread is included for reference, good luck if you reach the end!).

In my eyes OSM and its foundation, the OSMF, are making the world better. Viewing it like accounts, they’re contributing more to the black column than the red. In my eyes Taarifa is doing the same, and should continue to do the same until something better comes along, or the project is dead. A lot of people in the Skype channel and over email have thanked me for organising Taarifa, going to talk to people and such like. The truth is I’m just a loud, talky person. At times when things are starting, maybe that is what’s needed; in future, probably not. Taarifa is potentially going to become a foundation; at the very least it needs to do something about its identity and outward communications. It’s to be discussed at the coming hackathon and I think we should welcome it! We need to discuss what we want our structure to be: is it anarchy, benevolent dictator, committee? I don’t know, but together we should decide. Future plans regarding funding, grants and deployments all come under this; at its core, where do we see this going?

My input now, I think, is to create the culture, or at least influence it. I want people to love Taarifa as I do. I think the community and what we’ve done and accomplished is phenomenal. As such I’d like to shout a call to arms to Taarifans and other developers looking at Taarifa: JFDI if you believe strongly enough in it. Make things better, by consensus. If that isn’t working, fork the project and show why your solution is better. Then pull. Also, difference is good. I believe it has been ingrained since the Taarifa community’s inception that defending and debating our positions makes OUR project better; remember the whiteboard sessions! As it’s a humanitarian project, this enables better usage and happier users, which does ‘GOOD’. We’re getting new members who weren’t at the hackathon joining – hello there! Every person I speak to sees how Taarifa can make a big difference; people in Uganda are hopeful. In some small way, the world is watching!

So what about the position of the founder and the quote at the beginning? Is founder even the best description? In some ways, you on this list now are founders. I want to be involved in Taarifa for as long as I can, but not at the forefront. People change, they lose their hunger, they gain different skill sets. And this is a good thing! One of the most contentious things I have is a business card where I’m purported to be a “Geospatial Innovation Consultant”. Geospatial Consultant, fine; using ‘Innovation’, however, is esoteric and buzzword bingo. At some stage in my life I innovated; I took a risk, and though it cost me very dearly, it apparently paid off. Now I don’t really innovate, I research, I just ‘do’. Not necessarily innovation – that baton has been taken up by someone younger, better looking and with ‘nicer’ hair than I. My role should be to help them – whoever they are – to innovate bigger and better than before. I guess I’m seen at the forefront of Taarifa at the moment, but as an open note, if you think you can do it better, do it. The project is bigger than me, you and the community. At the moment very deep decisions are being made, or will have to be made, and they’re made with the information we have now, not 20:20 hindsight. The best team at the time should be guiding and shaping those decisions, not yesterday’s team. At the hackathon, I remember drinking some cola, looking at each of the developers hacking and thinking “I’m the dumbest guy in the room”. Everywhere on our table people were creating frameworks or making coordinate reference systems. Really smart things, and all of you should be damn proud.

The time will come when I will need to step aside from being shouty. This is a natural process, not requiring politicking or a ‘nasty’ process. So I rally: “You are the founders, shape the project, own it”. Personally, I’ve only ever been able to see as far as I have because I stood on the shoulders of giants. Your shoulders. Thank you, my friends.

Will see YOU at the hackathon!

Mark