The work and mumblings of Mark Iliffe

GDAL Nodata and Drone Processing Pipeline

Due to a nodata tag issue within the GeoTIFF standard, Pix4D doesn’t write the nodata value in a GDAL-compliant manner. As a result, when GeoTIFFs are imported into software that uses GDAL (like QGIS or Mapbox), the nodata areas render as black, causing problems where layers overlap. Additionally, for reasons unknown, Mapbox does not handle multiple GeoTIFFs in the same manner as it displays .mbtiles, pixellating uploaded GeoTIFFs. This short blog presents a processing pipeline to deal with both of these issues.

The solution for translating the nodata values is documented here: https://geozoneblog.wordpress.com/2013/02/26/troubles-tiff-no-data-values-floating-point. I’ve written a simple bash script that does this conversion in a UNIX environment: https://github.com/WorldBank-Transport/TanzaniaDrones/blob/master/transparency_batch.sh. Run the script in the folder containing your files and it will iterate through each file, translating the nodata tag value to be GDAL compliant.
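If you'd rather do the same thing from Python, the sketch below shows the general idea using the GDAL bindings; the nodata value of -10000 and the output file naming are assumptions for illustration, so check the linked script and your own Pix4D output for the actual value.

```python
# Minimal sketch: rewrite each GeoTIFF with an explicit, GDAL-compliant
# nodata value. NODATA = -10000 is an assumed value, not taken from the
# original script; verify it against your own Pix4D output.
import glob
from osgeo import gdal

NODATA = -10000

for src in glob.glob("*.tif"):
    dst = src.replace(".tif", "_nodata.tif")
    # gdal.Translate copies the raster and stamps the nodata value on every band
    gdal.Translate(dst, src, noData=NODATA)
    print(f"wrote {dst}")
```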

Once you’ve resolved the nodata/transparency values, it’s quite easy to add your GeoTIFFs to Mapbox’s TileMill. Export your resulting map as .mbtiles and upload it to Mapbox as a tileset. In the Mapbox Studio Classic interface, add this to a map and voilà, drone imagery online.

Data Driven Governance

At the Africa Open Data Conference I was fortunate to have a chat with Juliana Letara, the Town Planner for Kinondoni Municipality, and Osiligi Lossai, the Ward Executive Officer of Tandale. We discussed the recent community mapping, how they are beginning to use the maps and the data, and some unexpected outcomes.

A longer blog is incoming on how data is being used to support governance in Dar es Salaam. Personally, I’m still digesting some of the implications of what they discussed, and the impact that Ramani Huria (also at http://ramanihuria.org) could have, and is already having, in Dar es Salaam.

Starting Ramani Huria – Mapping The Flood Prone Areas In Dar es Salaam

Four years ago, in August 2011, I was fortunate to manage the community mapping of Tandale. It was an experience that irrevocably changed my professional direction and interests. Over a month I trained and worked alongside brilliant students and community members, all focused on producing an open map of Tandale, something that had never been accomplished before. When it was done, the reception across civil society and government was positive, and intentions of scaling the pilot to the whole city were mooted, but for one reason or another it never quite made it. Then in December, floods hit the city. In dense informal urban environments such as Tandale these floods are fatal, dramatically changing the landscape and causing mass damage to survivors’ livelihoods and assets. Mitigating these floods is hard: where do you start in the fastest-growing city in Africa? The population, as of the 2012 census, stands at 5 million, with projections showing it could grow to 10 million by 2030.

This rapid and unplanned urbanisation is in part the cause of flooding: the infrastructure needed to cope with high rainfall, such as drains and culverts, was not built alongside residential dwellings. This is especially acute in the unplanned, informal urban settlements where the majority of Dar es Salaam’s residents reside. The theory here is quite simple: if you can identify where it floods, you can install or upgrade infrastructure to ameliorate the situation for residents. Unpacking this, the crux of the issue falls to two main points: governance and data.

Ramani Huria – Swahili for “Open Mapping” – is an operationalisation of this theory of change. In March 2015, a coalition from across Tanzanian society, composed of the City Council of Dar es Salaam, the Tanzanian Commission for Science and Technology (COSTECH, under the Ministry of Science, Communication and Technology), the University of Dar es Salaam, Ardhi University and Buni Innovation Hub, and supported by the Red Cross and the World Bank, backed the inception of Ramani Huria. Its goal is to map flood-prone areas in Dar es Salaam, make this data openly available, and support its use within government, where decisions can be made to mitigate flooding.

Mapping Phases

It is a far cry from 2011, when just mapping the ward of Tandale was a large task. Ramani Huria consists of a pilot phase and four subsequent phases. In the pilot, the wards of Ndugumbi, Tandale and Mchikichini, with a combined population of over 100,000 residents, were mapped in series. The process paired 15 students with community members, leading to maps of all features within each community. This information, focusing on drainage and waterways, is critically needed to help understand and locate flood-prone areas; this is a high priority in Dar es Salaam because of the damage that annual floods wreak upon the city and its residents. In this piloting phase, conducted from March to the end of June, these three wards were mapped, partly to generate the data that will feed flood inundation models and exposure layers, but also to pilot the data model and gel the team prior to Phase One.

Phase One is, on paper, quite simple: take 150 students from the University of Dar es Salaam’s Department of Geography and Ardhi University’s School of Urban and Regional Planning on industrial training, hold an inception workshop, deploy this contingent across six wards, and work with community members to replicate the pilots, but running in parallel. At the time of writing, mapping is ongoing in six communities: Msasani, Keko, Makumbusho, Mabibo, Makurumla and Mburahati. According to the 2012 NBS census, these wards have a combined population of over 280,000 residents. Phase One kicked off on the 6th of July and will run until the 14th of August.

Phases Two and Three will integrate community volunteers from the Red Cross, who are committed to creating community-level resilience plans. These plans will use the data produced by the mapping to create resident evacuation routes and aid Ward Executive Officers with planning decisions, among many other uses. Additionally, with embedded long-term volunteers monitoring change in their wards, this should result in detailed, up-to-date maps of rapidly changing urban areas.

Phase Four unfortunately sees the students depart from the project, due to their graduation. With a remaining contingent of around 30 mappers, mapping will continue until February 2016. These phases cover the data component; alongside them run dedicated training events aimed at building capacity to use and deploy this data in real-world situations. On the 20th of July the first such workshop took place, with representatives from the Prime Minister’s Office, Disaster Management Department, being trained in spatial analysis in QGIS and risk modelling using the QGIS plugin InaSAFE. A series of these workshops will take place, placing the data into the hands of those responsible for the city.

While this is ongoing in Dar es Salaam, you can get involved wherever you are in the world through the Missing Maps project. Missing Maps is a collaboration between the Red Cross, Doctors Without Borders and the Humanitarian OpenStreetMap Team, aimed at digitising “the most vulnerable places in the developing world”, primarily by crowdsourcing the digitisation of aerial imagery. At the moment, there are three tasks for Dar es Salaam.

Helping to digitise the buildings and roads from the recent drone and aerial imagery makes the mapping process faster, allowing the community mappers to focus on the detail of flood data. Additionally, the data from Ramani Huria is all placed into OpenStreetMap, its code is on GitHub, and content is available from Flickr and Facebook, all under an open licence. Please get involved!


Written on a plane somewhere between Tanzania and the United Kingdom

Putting Crowdsourcing In Action


Crowdsourcing is increasing in popularity as a form of distributed problem solving enabled by digital technologies. “The crowd” is invited to contribute towards projects, potentially in the form of knowledge or design skills. On the 3rd of June this year an interdisciplinary workshop investigating crowdsourcing and citizen science convened. It brought together experts and practitioners from many disciplines that apply a crowdsourcing approach, presenting outputs and how crowdsourcing aids projects such as GalaxyZoo (an interactive project for volunteer classification of galaxies), Artmaps (an application for crowdsourcing information on Tate digital artworks) and Taarifa (a platform and community supporting the crowdsourcing of public service issues in the developing and developed world).

My own presentation was on Taarifa, a project I started in 2011 to support community-based public service delivery. Since then I’ve worked in collaboration with the World Bank in Uganda to support the Education and Local Government Ministries with reporting across the country; what started as a pilot was quickly rolled out to cover 111 districts, and over a year of an at-scale pilot 14,000 reports were received and acted upon. This led to wider research into public participatory service delivery in developing countries (FOSS4G Taarifa paper). The uniqueness of Taarifa is that it has been developed and maintained wholly by volunteer contributors, creating free and open source software. The contributors to Taarifa are as diverse as the problems which Taarifa addresses, ranging from PhD candidates like myself to physicists, bankers and community organisers. Consequently, Taarifa doesn’t just look after a software platform; it acts as a forum to share knowledge, experimentation and innovation.

Taarifa was conceived at the London Water Hackathon as an innovation around water access and quality in Tanzania. Access to water in Tanzania currently covers less than 50% of the country’s population, and 38% of the water infrastructure, like taps, is graded as non-functional. The Ministry of Water currently has a WPMS (Water Point Mapping System), developed after a countrywide survey. However, the system has no functionality to update the status of a water point or view a history of service problems. This is combined with poor repair performance nationally, where water points are repaired at all; citizens are disenfranchised with current methods of reporting water faults, if they can report at all. The ecosystem around supporting the repair of water points is non-existent; consequently, millions of Tanzanians have no access to publicly delivered water.

It is important to stress that there isn’t ‘one’ solution to the problem of water access, nor is there ‘one’ platform or piece of software to ‘fix’ it. No single discipline can resolve the issue of water access; a multidisciplinary problem demands a multidisciplinary approach. Cartography, economics, engineering: no one discipline can wholly resolve the issue, nor is it an issue that can be researched and resolved through the lens of one discipline. The societal side of technology needs not just to be taken into account, but integrated into the core of the design with the people who face the issue and who will use the technology to resolve it. It is imperialistic and deterministic to assume that technology can simply ‘fix’ a complex issue like water access, especially as the technology is, in effect, imposed broadly by outsiders to the community in which it is intended to take root. Hence, an understanding of the community is needed: who the users are, how water access is dealt with currently, and the general state of affairs. From this we have created two streams of Taarifa: one that is currently implemented, and one that is currently being designed, incorporating lessons learned from the initial deployments.

The first iteration of Taarifa’s design story and user journey assumed that mobile connectivity wasn’t an issue, that there was an active organisation, be it government or an NGO, wanting to resolve water access issues, and that the water infrastructure was adequately mapped. This led to the following reporting process for a water issue. When a report is made, for example by a Community Water Officer or a concerned citizen, it goes into the Taarifa workflow, which identifies the specific water point from the database. The reporter is then notified, thanked for making the report, and given an estimate of how long the repair will take, based on the time previously taken to repair broken water points in that district. An engineer is informed what is wrong with the water source. Once an engineer has been selected, a verifier can confirm that the repair has been completed satisfactorily. Importantly, at each stage the initial reporter is informed about the progress of the repair. This was the version trialled in Uganda.
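To make that workflow concrete, here is a minimal sketch of the stages in Python; the stage names and data structure are my own illustrative labels, not Taarifa’s actual data model or API.

```python
# Illustrative sketch of the reporting workflow described above.
# Stage names and fields are assumptions, not Taarifa's real schema.
from dataclasses import dataclass, field
from enum import Enum, auto

class Stage(Enum):
    REPORTED = auto()      # citizen or Community Water Officer files a report
    ACKNOWLEDGED = auto()  # reporter thanked and given a repair-time estimate
    ASSIGNED = auto()      # engineer informed of the fault
    VERIFIED = auto()      # verifier confirms the repair is satisfactory

@dataclass
class WaterPointReport:
    waterpoint_id: str
    fault: str
    stage: Stage = Stage.REPORTED
    history: list = field(default_factory=list)

    def advance(self, stage: Stage) -> None:
        # Every transition notifies the original reporter of progress
        self.stage = stage
        self.history.append(stage)
        print(f"Reporter notified: {self.waterpoint_id} is now {stage.name}")

report = WaterPointReport("WP-1234", "handpump broken")
for stage in (Stage.ACKNOWLEDGED, Stage.ASSIGNED, Stage.VERIFIED):
    report.advance(stage)
```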

Subsequently, learning from how Taarifa was deployed and used, the design is now intended to incorporate offline capability and ‘marketplace’ functionality. The offline capability comes from experiences in Uganda showing that connectivity wasn’t universal (this was not a surprise; however, improvements to the paradigm should be incremental), and the marketplace from the limited capacity of local government and organisations. If a district has no capacity to repair a broken water point, the cost could be estimated by a number of engineers who receive information about the problem and bid using their phones. A micropayment is taken to support the system, providing a surplus that could be reinvested into creating new capacity. Micropayments are ubiquitous in the developing world, effectively replacing a formal banking infrastructure, and so are familiar to the communities who will use this method. Consequently this should be viewed as an extension of what already exists, not something completely new.

What does all this mean within the context of the Crowdsourcing in Action workshop? Broadly, it allows us, as academic researchers, to typify crowdsourcing and understand it better. Taarifa acts as a community that crowdsources code and, by extension, curates the reporting of community issues in developing countries. Artmaps develops applications for use on smartphones that allow people to relate artworks to the places, sites and environments they encounter in daily life. GalaxyZoo leverages the many eyes of the crowd to process space imagery. Thematically, all the projects presented used volunteers to provide information, process it and return it to the user and other interested actors.

After the initial presentations we formed groups of experts and practitioners to build a common model of what crowdsourcing means to our projects and work, coalescing periodically to feed back practice and information learned from other participants. In doing so, we learn from the successes and failures of others, understanding common themes for collaboration.

Identifying these common themes hopefully sets an agenda to focus on specific factors and communities under the crowdsourcing umbrella. Jeremy Morley and I are planning a future “How to Hackathon” event building on “Crowdsourcing in Action”. Hackathons allow volunteers (generally) to co-create ‘hacks’ to problems. In its truest sense you accelerate innovation by combining a random mix of people and skills, providing a set of previously unsolved problems, then observing what happens. As was identified in Crowdsourcing in Action, we can observe the state before crowdsourcing, we can help provide a process for participants, and we can observe and process the result. However, understanding of how participants use the tools to conduct crowdsourcing is scant. By focusing on hackathons, we hope to discover more about how the design and development of crowdsourcing works.

Prologue to the Sanitation Hackathon

Florian Rathgeber (right centre) and Fayaz Valli (left centre) at the World Bank, Washington DC.

Taarifa was announced in various media as a winner of the Sanitation Hackathon. To this end, two Taarifans are currently representing all Taarifans in Washington DC and San Francisco. More will come from this, I’m sure. However, all of the projects from the Sanitation Hackathon should be given the same pedestal and treatment.

The number of projects and the energy that the Sanitation Hackathon generated should not be lost; they should be sustained by constant support and coverage:

“The event featured nearly 1000 registered hackers at ten locations worldwide who developed some 62 new prototypes.” – Sanitation Hackathon Site

While this moment is still in the here and now, we should all move forward, collaborating instead of competing, and from this solve the technical challenges within the sanitation issues we face. Undoubtedly, it is a naïve and deterministic proposition to suggest that technology will solve the world’s problems. However, events like the Sanitation Hackathon have demonstrated that technologists from all walks of life can work together. Building the social side of these systems is a bigger problem than the technological one; prizes and recognition aren’t a replacement for this and should not be considered as such. The hard work starts now.

Written and submitted in the Hotel Kilimanjaro, Dar Es Salaam, Tanzania (-6.81669, 39.293198)

GISRUK 2013

From the 3rd to the 5th of April I attended GISRUK (GIS Research UK) to give a paper on Community Mapping as a Socio-Technical Work Domain. In keeping with Christoph Kinkeldey‘s love of 1990s pop stars, Vanilla Ice made a second slide appearance, offsetting what is a very technical academic title. In short, I’m using Cognitive Work Analysis (CWA) to create a structural framework for assessing quality (currently defined by ISO 19113: Geographic Information Quality Principles, well worth a read…) where there is no comparative dataset.

CWA is used to assess the design space in which a system exists, not the system itself. By taking a holistic view and not enforcing constraints on the system, you can understand what components and physical objects you would need to achieve the values of the system, and vice versa. In future iterations I’m going to get past first base and look at decision trees and strategic trees to work out how to establish the quality of volunteered geographic data without a comparative dataset, building quality analysis in from day one, as opposed to it being an afterthought.

Written and submitted from Home (52.962339,-1.173566)


H4D2 April 12th – 14th

The HXL-Team

Last year I attended H4D2 (Humanitarian for Disaster 2.0), organised by (and at) Aston University and Geeks Without Bounds. One of the outputs I worked on was the HXL Extractor: basically, take data out of a GeoSPARQL-enabled triplestore (a geospatial semantic database) and fire it into a GIS program. One of the team members had already been experimenting with semantic databases and triplestores (this was most definitely a good thing, allowing us to move quickly), so our ‘mission’ was to create a middle layer that connects to a triplestore and then uses the WFS-T standard to fire the extracted data into a GIS program of your choice. Interestingly, the ‘project lead’ was communicating with us from Geneva via Skype; this, and the prior work, underlines the need for clear and concise problem statements prior to the hack. Because some of the team had been able to think about what they had to do, we were able to work more effectively, even while learning technologies on the fly.
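For a flavour of the kind of query such a middle layer might run against a triplestore, here is a minimal Python sketch using SPARQLWrapper; the endpoint URL is hypothetical and the query assumes only the standard GeoSPARQL vocabulary, not the actual HXL Extractor code.

```python
# Minimal sketch: pull feature geometries (as WKT) from a hypothetical
# GeoSPARQL endpoint, ready to be pushed onward to a GIS via WFS-T.
# The endpoint URL is an assumption; geo:hasGeometry / geo:asWKT come
# from the standard GeoSPARQL vocabulary.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "http://example.org/sparql"  # hypothetical triplestore endpoint

QUERY = """
PREFIX geo: <http://www.opengis.net/ont/geosparql#>
SELECT ?feature ?wkt WHERE {
  ?feature geo:hasGeometry ?geom .
  ?geom geo:asWKT ?wkt .
}
LIMIT 100
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

for row in results["results"]["bindings"]:
    # Each row holds a feature URI and its geometry as WKT text
    print(row["feature"]["value"], row["wkt"]["value"])
```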

A few months later, at the International Conference of Crisis Mappers (ICCM) hackathon in Washington, HXL was still going strong and I got to meet the instigator of the project, CJ Hendrix, face to face. He’d amassed a team which went on to rightly take first prize at ICCM, and it’s now being used by UNOCHA, with papers forthcoming. The project is growing, as evidenced by the amount of work going on in the team repository. Understandably our small team in Birmingham just did a little bit, but every little bit helps.

Now H4D2 is coming around again on April 12th – 14th. It will be followed by SMERST (Social Media and Semantic Technologies in Emergency Response), a more academically focused conference, on April 15th – 16th. Most importantly, you don’t need to code to contribute: all are welcome, from designers, videographers, bloggers and journalists to you! Registration for H4D2 is open, and it is again at Aston University in Birmingham. Register here: http://h4d2.eu/registration. It’s going to rock.

Written and submitted from the Serena, Dar Es Salaam (-6.810617, 39.288284)
