Over the past few weeks, I’ve managed to escape the office and get back to the field. With an impending change, it’s been a very refreshing time to get back into the mix – especially out onto the roads of Zanzibar.
To do this, the N/LAB team at the University of Nottingham, Spatial Info (the spin-out of my team that helped build Ramani Huria and the Tanzania Open Data Initiative), and I are working with the Zanzibari Department of Roads, under the Ministry of Infrastructure, Communications and Transport, to survey all roads on Unguja Island, Zanzibar.
So far, we’ve worked on getting a surveying vehicle back on the road, handled the initial back and forth with government stakeholders, and begun pulling together the various road data sources (such as those from the Government and OpenStreetMap) to work out where to drive and the sequencing of the survey. All of this will support a data collection protocol that merges traditional surveying techniques with novel ones such as RoadLab Pro.
All of these data streams will then be used as a training dataset to see how machine learning can inform on road quality. But first, we’re getting the traditional survey underway. It’s going to be a long road ahead – as long as all the roads in Zanzibar!
I’ve been playing around with the full planet file to look at going back in time in OSM. Mainly, this is to look at how Ramani Huria’s data has evolved over time, and it’s all part of extracting more value from Ramani Huria’s data. This process wasn’t as straightforward as I had hoped, but I eventually got there. Also, this isn’t to say that this is the only or best way – it’s the one that worked for me!
To do this, you’ll need a pretty hefty machine – I used a Lenovo X230 (Intel i5 quad-core at 2.6GHz, 16GB of RAM) with over 500GB of free disk space, to cope with the large files you’ll be downloading. This is all running on Ubuntu 16.04.
Firstly, download the OSM full-history file. I used the uGet download manager to handle the roughly 10-hour download of a 60GB+ file over a 10Mb UK broadband connection. Leaving it overnight, I had the full file downloaded and ready for use. Now to set up the machine environment.
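If you’d rather use the command line than uGet, something like the following should also work – a sketch only: the exact filename under https://planet.openstreetmap.org/planet/full-history/ changes with each release, so check the directory listing first.

# resumable download (-c), so an interrupted overnight run can be picked up again
wget -c https://planet.openstreetmap.org/planet/full-history/history-latest.osm.pbf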
The stack is a combination of OSMIUM and OSMconvert. On paper, OSMIUM should be the only tool needed; however, for reasons that I’ll come to, it didn’t work for me, so I found a workaround.
OSMconvert is easily installed:
sudo apt-get install osmctools
This installs OSMconvert and other useful OSM manipulation tools. Installing OSMIUM is slightly more complicated and needs to be done by compiling from source.
Firstly, install LibOSMIUM – I found that skipping the header files meant that compilation of OSMIUM proper would fail. Then use the OSMIUM docs to install OSMIUM itself. While Ubuntu does include a package for OSMIUM, it’s an older version which doesn’t allow splitting data by timeframe. Now things should be set up and ready for pulling data out.
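For reference, here’s a minimal sketch of the from-source build – the side-by-side checkout layout and the dates and filenames are my assumptions, so defer to the OSMIUM docs where they differ.

# prerequisites: cmake, a C++ compiler, and the boost/expat/zlib/bzip2 dev packages
git clone https://github.com/osmcode/libosmium.git
git clone https://github.com/osmcode/osmium-tool.git
cd osmium-tool && mkdir build && cd build
cmake ..            # picks up libosmium from the adjacent checkout
make
sudo make install   # puts the osmium binary on your PATH

Once built, the newer OSMIUM can cut a snapshot of the map as it stood at a given moment, e.g.:

osmium time-filter history-latest.osm.pbf 2016-01-01T00:00:00Z -o snapshot-2016.osm.pbf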
Dar es Salaam, being the city of interest, has the bounding box (38.9813,-7.2,39.65,-6.45) – you’d replace these with the south-west and north-east corner coordinates of your place of interest – and use OSMconvert in a form something like:
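# file names are illustrative – swap in whatever your history download is called;
# -b takes min lon, min lat, max lon, max lat
osmconvert history-latest.osm.pbf -b=38.9813,-7.2,39.65,-6.45 -o=dar-es-salaam.osh.pbf

The .osh suffix on the output is just a convention flagging that the extract still carries its full edit history.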
Four years ago, in August 2011, I was fortunate to manage the community mapping of Tandale. It was an experience that irrevocably changed my professional direction and interests. Over a month I trained and worked alongside brilliant students and community members, all focused on producing an open map of Tandale, something that had never been accomplished before. When it was done, the reception across civil society and government was positive, and scaling the pilot to the whole city was mooted, but for one reason or another it never quite happened. Then, in December, floods hit the city. In dense informal urban environments such as Tandale, these floods are fatal; they dramatically change the landscape and cause mass damage to survivors’ livelihoods and assets. Mitigating these floods is hard – where do you start in the fastest-growing city in Africa? The population, as of the 2012 census, stands at 5 million, with projections showing it could grow to 10 million by 2030.
This rapid and unplanned urbanisation is in part the cause of flooding: the infrastructure with which to cope with high rainfall, such as drains and culverts, was not built alongside residential dwellings. This is especially acute in the unplanned, informal urban settlements where the majority of Dar es Salaam’s residents reside. The theory here is quite simple: if you can identify where it floods, you can install or upgrade infrastructure to ameliorate the situation for residents. Unpacking this, the crux of the issue falls to two main points: governance and data.
Ramani Huria – Swahili for “Open Mapping” – is an operationalisation of this theory of change. In March 2015, a coalition from across Tanzanian society – the Dar es Salaam City Council, the Tanzanian Commission for Science and Technology (COSTECH, under the Ministry of Science, Communication and Technology), the University of Dar es Salaam, Ardhi University and the Buni Innovation Hub, supported by the Red Cross and the World Bank – backed the inception of Ramani Huria, with the goal of mapping flood-prone areas in Dar es Salaam, making that data openly available, and supporting its use within government, where decisions can be made to mitigate flooding.
It is a far cry from 2011, when just mapping the ward of Tandale was a large task. Ramani Huria consists of a pilot phase and four subsequent phases. In the pilot, the wards of Ndugumbi, Tandale and Mchikichini, with a combined population of over 100,000 residents, were mapped in series. This process matched 15 students with community members, leading to maps of all features within each community. This information, focusing on drainage and waterways, is critically needed to help understand and locate flood-prone areas – a high priority in Dar es Salaam, given the damage that annual floods wreak upon the city and its residents. The pilot, conducted from March to the end of June, mapped these three wards, partly to generate the data that will feed flood inundation models and exposure layers, but also to pilot the data model and gel the team prior to Phase One.
Phase One on paper is quite simple: take 150 students from the University of Dar es Salaam’s Department of Geography and Ardhi University’s School of Urban and Regional Planning on industrial training, hold an inception workshop, deploy this contingent across six wards, and work with community members to replicate the pilots – but running in parallel. At the time of writing, mapping is ongoing in six communities: Msasani, Keko, Makumbusho, Mabibo, Makurumla and Mburahati. According to the 2012 NBS census, these wards have a combined population of over 280,000 residents. Phase One kicked off on the 6th of July and will run until the 14th of August.
Phases Two and Three will integrate community volunteers from the Red Cross, who are committed to creating community-level resilience plans. These plans will use the data produced by the mapping to create resident evacuation routes and to aid Ward Executive Officers with planning decisions, among many other uses. Additionally, with embedded long-term volunteers monitoring change in their wards, this will hopefully result in detailed, up-to-date maps of these rapidly changing urban areas.
Phase Four unfortunately sees the students depart from the project, due to their graduation. With a remaining contingent of around 30 mappers, mapping will continue until February 2016. These phases cover the data component; alongside them run dedicated training events aimed at building the capacity to use and deploy this data in real-world situations. On the 20th of July the first such workshop took place, with representatives from the Prime Minister’s Office Disaster Management Department being trained in spatial analysis in QGIS and in risk modelling using the QGIS plugin InaSAFE. A series of these workshops will follow, placing the data into the hands of those responsible for the city.
While this is ongoing in Dar es Salaam, you can get involved wherever you are in the world through the Missing Maps project. Missing Maps is a collaboration between the Red Cross, Doctors Without Borders and the Humanitarian OpenStreetMap Team, aimed at digitising “the most vulnerable places in the developing world”, primarily by crowdsourcing the digitisation of aerial imagery. At the moment, there are three tasks for Dar es Salaam:
By helping digitise the buildings and roads from the recent drone and aerial imagery, you make the process of mapping faster, allowing the community mappers to focus on the detail of flood data. Additionally, the data from Ramani Huria is all placed into OpenStreetMap, its code is on GitHub, and content is available from Flickr and Facebook, all under an open licence. Please get involved!
Written on a plane somewhere between Tanzania and the United Kingdom
On the 3rd to the 5th of April I attended GISRUK (GIS Research UK) to give a paper on Community Mapping as a Socio-Technical Work Domain. In keeping with Christoph Kinkeldey’s love of 1990s pop stars, Vanilla Ice made a second slide appearance, leavening a very technical academic title. In short, I’m using Cognitive Work Analysis (CWA) to create a structural framework for assessing quality (currently defined by ISO 19113: Geographic Information – Quality Principles – well worth a read…) where there is no comparative dataset.
CWA is used to assess the design space in which a system exists, not the system itself. By taking a holistic view and not enforcing constraints on the system, you can understand what components and physical objects you would need to achieve the values of the system, and vice versa. In future iterations I’m going to get past first base and look at decision trees and strategy trees to work out how to establish the quality of volunteered geographic data without a comparative dataset – building quality analysis in from day one, as opposed to it being an afterthought.
Written and submitted from Home (52.962339,-1.173566)
Last year I attended the H4D2 (Humanitarian for Disaster 2.0) hackathon, organised by (and at) Aston University and Geeks Without Bounds. One of the outputs I worked on was the HXL Extractor: basically, take data out of a triplestore via GeoSPARQL, a geospatial query language for semantic databases, and fire it into a GIS program. One of the team members had already been experimenting with semantic databases and triplestores (this was most definitely a good thing, allowing us to move quickly), so our ‘mission’ was to create a middle layer that connects to a triplestore and then uses the WFS-T standard to fire the extracted data into a GIS program of your choice. Interestingly, the ‘project lead’ was communicating with us from Geneva via Skype; this, and the prior work, underlines the need for clear and concise problem statements before a hack. Because some of the team had been able to think about what they had to do, we were able to work more effectively, even while learning technologies on the fly.
Going to the International Conference of Crisis Mappers (ICCM) hackathon in Washington a few months later, HXL was still going strong and I got to meet the instigator of the project, CJ Hendrix, face to face. He’d amassed a team which went on to rightly take first prize at ICCM, and it’s now being used by UN OCHA, with papers forthcoming. The project is growing, as evidenced by the amount of work going on in the team repository. Understandably, our small team in Birmingham did just a little bit – but every little bit helps.
Now H4D2 is coming around again, on April 12th–14th. It will be followed by SMERST (Social Media and Semantic Technologies in Emergency Response), a more academically focused conference, on April 15th–16th. Most importantly, you don’t need to be able to code to contribute: designers, videographers, bloggers, journalists – all are welcome, including you! Registration for H4D2 is open, and it’s again at Aston University in Birmingham. Register here: http://h4d2.eu/registration. It’s going to rock.
WhereCampEU this year, rather earlier than normal, was in the Eternal City of Rome, Italy. After the threat of Snowmageddon in the UK, a jaunt to Italy was a welcome respite. An action-packed unconference timetable started with a presentation on Taarifa by myself. This was a follow-on from my W3G presentation, but focused on the characteristics of developing technology: the need to know your users and how they’ll use the ‘solution’. Developing solutions to first-world problems and then applying them in the developing world isn’t useful – it’s dangerous – yet it remains the method du jour in some organisations.
A presentation followed on how the World Food Programme uses OpenDataKit to collect information in South Sudan. It would have been interesting to hear more about the rationale for their choice of tools. The use case was to take a picture and record what’s about; it wasn’t clear what intelligence they sought to gather. The presenter didn’t stay around, though, so if anyone in the geo-sphere knows, please get in touch!
CartoDB was given a live demonstration. We’re quickly moving past the desktop for GIS and spatial analysis and into the cloud. I’d like to know how these cloud-based GIS services compare with ESRI’s and MapBox’s offerings. It’s a brave new world!
Michael Gould’s ‘37 things you didn’t know about ESRI’ was a passionate talk about ESRI from its inception to the present day. A leviathan in the GIS space, its culture is seemingly anything but corporate America. In the examples mentioned, a social conscience dominates decisions: from the placing of boulders on the ESRI campus to the acquisition of new companies.
A Taarifa breakout design session occurred with a special guest appearance from a snow-bound London. But more on this in a later blog post.
The day ended with an OSM Q&A by myself and Shaun McDonald, which turned into a wide-ranging discussion about the OSM project and the challenges within it. Getting new contributors to keep contributing was one point of discussion, as was the need for improved internationalisation and language support.
An evening of pizza, dolce and grappa followed. The night ended in a spectacular deli/bistro/bar known only to locals and lost WhereCampers. Bottles of Chianti and Prosecco were enjoyed and toasts were made.
Standing out the following day was Laurence Penny’s updated 1-D Maps. It’s never the same twice, constantly reinventing itself from the acquisitions and collections held by Laurence – going from Doom and the Mille Miglia to Roman-era road routing, with a detour around the metros and undergrounds. It was two hours long, and words fail to describe the brilliance that emanates from it. I really look forward to seeing it in an updated form.
A certain Henk Hoff of the OSM Foundation brought proceedings to a close with a wide-ranging discussion on the Foundation and how it functions and operates. The day and the conference ended over pizza, Chianti and sambuca. Just the way things should end!
Written and submitted on the Rome to Milan Eurostar (having just gone through Bologna!)
A fellow member of the OSM Foundation replied to a conversation on the mailing list: “As a guerrilla academic…“. The context was a suggestion for increased academic cooperation within OSM. To this end I proposed a new working group for the OSMF: an Academic Working Group, with the aim of improving the academic quality and communication of research using OSM, and of facilitating collaboration.
Below is the start of the manifesto. It’s not complete, but it’s a start.
Academic institutions use OSM data, be it as part of published research or for testing hypotheses. Some of the publications are listed on the wiki: http://wiki.openstreetmap.org/wiki/Research. However, within OSM and the OSMF this research is undertaken on the researchers’ own initiative: researchers come to OSM through recommendation (supervision) or self-interest within their own academic structures. Given the growth of OSM and of the research into it, it seems likely that academic interest will widen and grow.
The aim: to support academic research in OSM, encouraging best practices and acting as a forum for researchers. This would help researchers starting out with OSM, but also unify a community of existing researchers; collaborations and knowledge sharing will hopefully follow, along with the identification of research areas for the community as a whole (usability and business models as potential starting themes). Concretely, the group would:
Unite existing researchers, whether at institutions or pursuing independent academic study.
Provide documentation (à la LearnOSM) focused on researchers.
Provide a forum for researchers to discuss their research and bridge into the community.
Support the academic corpus, and provide it with problems from the community.
I think this hits a gap that currently exists in the community, and I don’t see potential areas for conflict. That being said, do we have enough members within the OSM(F?) to create and steer the working group?
AWG vs. other WGs
There is a small amount of overlap in interest between this proposed AWG and other working groups. I can see potential overlap with the Communication and Strategic Working Groups: Communication, because the AWG would aim to build up the OSM academic community; Strategic, because they may wish to commission, or at least support, studies into critical areas of OSM.
Again, I’ll throw this to the OSMF. Where should we go from here?
Written and submitted from the London St. Pancras to Nottingham Train.