
Some OpenData

  • 3rd January 2019 (updated 4th July 2020)
  • by Frederick Mbuya

OpenData has been a hot topic for the past few years, and it means many different things to different people. The commercial sector sees (or should see) the potential to leverage OpenData to create new products and services which they can sell to make money; e.g. Airbnb and Uber both rely on access to other companies' spatial data (maps). The public service sector should be able to use the data to create better public services, and government decision makers should be able to use it to make informed decisions.

Note the use of words like “use” and “leverage”: both imply the ability to access the data, and more specifically to access it in a Machine Readable format. As you can imagine, the data we are talking about comes from many different sources and exists in repositories/databases all over the internet, maintained and contributed to by many different actors. The different user groups who need the data will most likely work with their own specialized tools, be it ArcGIS, QGIS, STATA, R etc., so it is critical that their tools can talk directly to the data they need; hence the need for the data to be in a Machine Readable format.

One of the issues, especially in Tanzania (and probably for many other regions of the world), is who is going to seed that initial collection of data AND who is going to make sure that the people who could potentially use the data are aware of it and can actually access and use it. Increasingly, donor-funded projects are facilitating the collection of huge datasets. The Zanzibar Mapping Initiative is an excellent example: the whole island was mapped using small-scale drones, for the most part by Zanzibaris for Zanzibar, and the data has been made available as OpenData via the ZMI GeoNode and OpenAerialMap. This in itself is a great win, especially as the approach taken was to train Zanzibari students and surveyors, so if nothing else it has increased capacity. We here at Uhurulabs rely on our commercial contracts as they allow us to subsidize our public/innovation sector work, and for our commercial work we rely on professional human resources; see our pilots, two of whom are from Zanzibar and products of the ZMI project.

So the data collection has been seeded and done, and the data is available as OpenData; how has it been used so far? It has been used for things like building footprint digitization (we hope to provide a link to that soon). It has also been used by the Zanzibar Commission of Lands for land use and city planning; we also hope to provide links to information about that. Another very exciting use case has been the Open AI Tanzania Challenge, which invited data scientists to develop feature detection algorithms that can automatically identify buildings and building types using high-resolution aerial imagery. It is our hope and expectation that the classifiers that were developed will be released as OpenSource, and we have asked about this in the Challenge Forum.

Another very exciting initiative is Ramani Huria…

“Ramani Huria is a community-based mapping project that began in Dar es Salaam, Tanzania, training university students and local community members to create highly accurate maps of the most flood-prone areas of the city. As the maps have taken shape – their benefits have multiplied and their potential magnified, now serving as foundational tools for development within all socio-economic spheres beyond flood resilience. The project is supported by funding from the UK Department for International Development through the Tanzania Urban Resilience Programme“

see website

We were lucky enough to be involved in this project; most of our work involved collecting data using drones. It always seemed that to some degree the drones stole away some of the attention from the _real_ work that was going on, on the ground. Everyone was very impressed with the quality, speed and accuracy with which drones were able to capture high-quality aerial images, but the real rich, actionable data came from the boots on the ground. That data includes things such as whether a particular house has been flooded, and if so to what height, whether a structure is public or private, etc. What is really special about this data is that it is OpenData: its collection was funded by public money, done for the most part by specially trained university students, and it is now available for free for public servants to make informed decisions with, and for businesses to create new services from.

In the maps below you will see the standard OpenStreetMap basemap; all the data from Ramani Huria was contributed to OpenStreetMap. However, you will notice more detailed layers on top that give you more information and allow a deeper zoom level. You can then use the feature info tool (i) to get detailed information on the various assets, e.g. clicking on a building will tell you whether or not it is residential; a sketch of the request behind that tool follows the maps below.

Vingunguti

Tandale

See more at http://geonode.uhurulabs.org/
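
Incidentally, the feature info tool described above is, as far as we understand it, just issuing a standard OGC WMS GetFeatureInfo request behind the scenes. A rough sketch with curl, assuming the GeoNode exposes the usual GeoServer WMS endpoint; the layer name, bounding box and pixel coordinates here are purely illustrative:

curl -G "http://geonode.uhurulabs.org/geoserver/wms" \
--data-urlencode "SERVICE=WMS" \
--data-urlencode "VERSION=1.1.1" \
--data-urlencode "REQUEST=GetFeatureInfo" \
--data-urlencode "LAYERS=geonode:tandale_buildings" \
--data-urlencode "QUERY_LAYERS=geonode:tandale_buildings" \
--data-urlencode "STYLES=" \
--data-urlencode "SRS=EPSG:4326" \
--data-urlencode "BBOX=39.24,-6.80,39.26,-6.78" \
--data-urlencode "WIDTH=256" \
--data-urlencode "HEIGHT=256" \
--data-urlencode "X=128" \
--data-urlencode "Y=128" \
--data-urlencode "FORMAT=image/png" \
--data-urlencode "INFO_FORMAT=application/json"

Clicking a building in the map viewer fires essentially the same request for the pixel you clicked on.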

It is not, however, just donor-led projects that are releasing OpenData. Tanzania joined the Open Government Partnership (OGP) in 2011.

Tanzania declared its intention to join OGP during the launching meeting in September 2011, one of six in Africa that qualified to be involved in the OGP.

source: https://www.opengovpartnership.org/stories/tanzania-and-ogp-update

Source: https://www.opengovpartnership.org

Since then the government has gone on to launch websites/services such as:

  • http://www.wananchi.go.tz: A platform to provide feedback on government services
  • https://wpm.maji.go.tz: A water point mapping system.
  • http://www.egov.go.tz/howdoi: Which provides information on how to acquire various public services
  • http://www.data.go.tz: A government OpenData portal

Below is a quick map showing Tanzania’s international boundaries, districts and regions; it was put together using OpenData downloaded from the National Bureau of Statistics (NBS).

Tanzania

See more at http://geonode.uhurulabs.org/maps/396/view

So data is being generated, and it is even being made available in some machine-readable form or another; this is great. What now? If you have a look at points (2) and (3) on the OpenData Handbook website, in the section on “How to make your data open”, it states:

  • Make the data available – in bulk and in a useful format. You may also wish to consider alternative ways of making it available such as via an API.
  • Make it discoverable – post on the web and perhaps organize a central catalog to list your open datasets.

Note the highlighted words, “API” and “discoverable”; again, these words get thrown around quite a lot, but what do they mean in this context? 99% of the people who use an API will never know they are using it; it simply means that the data is stored in such a way that the software YOU use to get your job done can access it, without you needing to know the technical details of HOW it is being accessed. An example of this is when you go to Google Maps, search for a location and then tell the app to give you directions there: you don’t need to know the technical details of HOW that is happening, or how it knows that there is heavy traffic on a certain route. You simply want to enter a location and click a button, and it is the fact that many systems make their data available via an API that makes this possible.
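
To make “via an API” a little more concrete for the kind of data discussed in this post, here is a hedged sketch of pulling a layer off a GeoNode/GeoServer as GeoJSON using a standard OGC WFS request; the endpoint assumes a default GeoNode setup and the layer name is hypothetical:

curl -G "http://geonode.uhurulabs.org/geoserver/wfs" \
--data-urlencode "service=WFS" \
--data-urlencode "version=1.0.0" \
--data-urlencode "request=GetFeature" \
--data-urlencode "typeName=geonode:tz_districts" \
--data-urlencode "outputFormat=application/json" \
-o tz_districts.geojson

The resulting GeoJSON file can be opened directly in QGIS, R or any other tool that speaks the format; that, in practice, is all “machine readable and available via an API” means here.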

What does discoverable mean in this context? Simply that a user or system is able to find the data when it is needed. This can be as simple as a search box on a website, or more complex, e.g. a system that pushes you information based on your location and previously identified preferences.

Which brings us to the main point behind this post: yes, it is true that Tanzania has made great strides in its ability to collect valuable geospatial and other data. It has also made good progress in its ability to then make that collected data Open. Where I think it is now struggling is in the ability to store, manage and reliably make that data available in easy, useful, discoverable systems, using things such as APIs and well-crafted interfaces that can be accessed by all stakeholders. This, however, is a struggle that is not Tanzania’s alone: the whole world is seeing an explosion in the availability of data, coming from all kinds of sources: drones, new low-orbiting satellites, weather stations, government, the private sector, individual citizens, etc. At the same time there is an increasing awareness that good decision making comes from the utilization of good data. Last month we were invited to speak at the Smart Land Administration forum in Finland, where the topic of the day was Spatial Data Infrastructures (SDI):

A spatial data infrastructure (SDI) is a data infrastructure implementing a framework of geographic data, metadata, users and tools that are interactively connected in order to use spatial data in an efficient and flexible way. Another definition is “the technology, policies, standards, human resources, and related activities necessary to acquire, process, distribute, use, maintain, and preserve spatial data”.[1]

A further definition is given in Kuhn (2005):[2] “An SDI is a coordinated series of agreements on technology standards, institutional arrangements, and policies that enable the discovery and use of geospatial information by users and for purposes other than those it was created for.”

source: Wikipedia

Note that from the above definitions it is clear that the weight is on the institutional policies, standards and agreements which need to exist. Without good, strong institutional policies, standards and agreements, an SDI (or anything like an SDI) cannot exist, and as such the public will NEVER realize the true potential of the data revolution. Data will continue to be collected, as the market understands it has value, but rather than a tool to liberate the ordinary citizen it will be a tool to control and oppress.

So what can you do? First of all, be informed. You might not want to take action, but at least have your eyes open; ask yourself why, for example, after all I have said above, Tanzania last year decided to pull out of the Open Government Partnership.

source: https://www.opengovpartnership.org/sites/default/files/Tanzania_SC-response-Tanzania-withdrawal_Sept2017.pdf

Keep going to the government websites I mentioned above, use the data, and when you have questions about it ask the relevant body, which is very often NBS; the reality is that for the most part people in government are there and willing to help. It is only through the use of such data that people start to understand its value and are motivated to make sure it is kept up to date and available. If you find a website or service to be down, then inform the owner.

At Uhurulabs we are committed to serving and helping anyone who wants to use data and technology for the betterment of Tanzania and its people. As such, we are committed to maintaining a number of services, the first of which is now live: a GeoNode available for anyone who wants to host geospatial data, and for anyone who wants to access it.

https://geonode.uhurulabs.org

Our resources are limited, so we will try to scale the service according to demand; if you experience any problems please let us know by emailing geonode@uhurulabs.org. For the most part we will be trying to aggregate datasets that we find in other places, especially:

  • data that is not available via API
  • data that we believe is at risk of being lost
  • data we have collected ourselves.

The second service is one which came to mind while writing this article: a page that monitors the various websites and services that provide access to data on Tanzania.


Why I am so disappointed in the Black Panther…

  • 23rd March 2018 (updated 2nd January 2019)
  • by admin

I am probably going to get a lot of flak for this post, but I have been thinking this since I was about 30 minutes into the movie.

First of all I almost never go to the movie theater, prior to going to see Black Panther I had not been to the movies in over 3 years.

However both my wife and I thought that this could be a really good movie that we could both enjoy.

Onto the point, however: why I think the movie was a huge disappointment and such a missed opportunity. I will keep it brief and to the point…

  1. The premise that a nation of Africans would exist in Africa and choose not to get involved or intervene in any way as they see their fellow African brothers and sisters being used and abused by much of the rest of the world. Fine, this is not something that is highlighted in the movie, i.e. we do not see Wakanda during the period of colonization and slavery, but it is something that for sure came to my mind when thinking about the backstory. I must also say that this is something I could have looked past and “ignored”, creative license and all that, if it were not for the rest of the issues with the movie.
  2. The fact that the story the movie tries to sell is that the concept of “helping the rest of the world” came from a Wakandan who had been living in the States, who was then killed by his own brother, the king (hope I am getting that right). Then, years later, it is the son of the slain brother who returns to Wakanda (as the villain) and single-handedly manages to overthrow the current king, intent on using Wakandan technology to make Wakanda an active, dominant world power. Of course the villain never wins, and the rightful king regains power. Such a missed opportunity: why not have the cousins (i.e. the rightful king and the “villain”) end up uniting, and together have Wakanda take its rightful place in the world and leverage its technology in a positive way to help it?
  3. So we have morally questionable Wakandans who have allowed their brothers and sisters in neighbouring countries to suffer without intervening; I say “morally questionable” because, in my opinion, standing by and allowing your fellow humans to be abused and suffer when you have the means to help is “morally questionable”. And the villain who manages to take control, but then, as all good stories go, loses to the rightful king in the end. These events, however, were seemingly enough to wake the Wakandan people up to the need to help their brothers and sisters in the world who are suffering. So where do they go to help?…. inner city USA!


Working on the edge…

  • 8th March 2018 (updated 2nd January 2019)
  • by Frederick Mbuya

Problem…

UAV data has been collected over a large area of more than 1,500 sq km. The UAVs used were small, so the area was divided into 200+ zones, each of which was then processed into an individual GeoTIFF. The data was collected without absolute accuracy, so although the data within a given zone is relatively accurate, there are varying degrees of edge-matching issues when attempting to put all 200+ zones together.

Attempted solution…

A post-processing pipeline that takes the individual zones, applies adaptive filtering to each, and then attempts to stitch them together using edge matching.

Test Data…

A 27-zone section was selected as the test dataset.

Preliminary Result…

3km x 3km zone: pre vs post.

At a glance it looks good, so let’s look a little closer. We selected two zones, 117 and 118.

Zoomed to 1:1000 scale: pre vs post.

Zoomed to 1:5000 scale: zone 117 vs zone 118.

Zoomed to 1:1000 scale: zone 117 vs zone 118.

1km x 1km zone: pre vs post.

Zoomed to 1:5000 scale: pre vs post.


Sustainia 4th Global Opportunity Report

  • 1st February 2018 (updated 2nd January 2019)
  • by Frederick Mbuya

Sustainia just launched their 4th Global Opportunity Report together with DNV GL and the United Nations Global Compact. Over the years, the Global Opportunity Reports have proved that no challenge or risk is too big for business to tackle, and that there is market potential in every one of the 17 Sustainable Development Goals, from smart farming eliminating hunger to solar micro-grids providing clean energy.

All market opportunities accumulated in the Global Opportunity Reports are now available on our innovation hub Global Opportunity Explorer for everyone to explore. But which market has the biggest positive impact on people, planet and profit?

Get your copy of the report here.


Drone Journalism School

  • 4th September 2017 (updated 2nd January 2019)
  • by Frederick Mbuya

 

So for a couple of days last week I went back to school. A few of us from africanDRONE, namely Unequal Scenes, Microdrone, African Defence Review and of course Uhurulabs, were sponsored by ICFJ to attend a three-day intensive Drone Journalism School at the University of Oregon, in Portland.

The conference was organized by Google News Labs, the University of Nebraska School of Journalism and Poynter. It was a very interesting experience, as for the most part my work with drones has been for data collection, surveying, etc., while most in the room were focused on using drones for video and photography. One thing we all did have in common, however, was an overwhelming understanding that the rapid commercialization of consumer drones has changed what is possible, and by whom, in ways we could not have imagined five years ago.

A main focus of the conference was the training for the American FAA Regulations Part 107, which are the new rules for non-hobbyist small unmanned aircraft (UAS) operations in the USA. The conference had several great tutors who took everyone through all the aspects of the regulations in preparation for the attendees to take the exam.

Regulations are the topic of the day, and of great concern to anyone who wishes to operate UAS commercially. All over the world the various regulatory bodies are struggling to create rules and an environment that allow for the safe and fair usage of our skies. A colleague from africanDRONE has said that he thinks the American approach is “pragmatic and economically advantageous”; this of course is relative to the South African regulations, where it costs up to $10,000 to be fully registered and certified to operate UAS commercially. Contrast that with Tanzania, where currently there is no payment required for the registration of UAS; there are only procedures you are required to follow with the Tanzanian Civil Aviation Authority (TCAA) and the Ministry of Defence (MoD). I personally have done this a number of times and have found the process straightforward. The one thing I will mention is that until now you have been required to do this for every flight, which can become a burden. I am currently in the process of getting a more permissive permit that perhaps will only require me to log a flight plan whenever I want to fly; fingers crossed on that!

Lots of people know me as the Drone guy; what many don’t know is that I don’t think I am actually a very good drone pilot. I almost never fly for photography or video, and much of my work is automated, where all flights are planned and programmed in the office. I rarely get my hands on the latest consumer drones, and even when my friends offer to let me try theirs I say no, as I don’t want to crash a $6,000 Inspire! Luckily for me, one of the conference sponsors, DJI, had brought a number of their latest drones and took us all out for some test flights. The Inspire 2 is awesome, but would be a waste in my hands; happy though that I finally had a chance to fly it!


Citrix XEN to Linux KVM

  • 9th June 2017 (updated 2nd January 2019)
  • by Frederick Mbuya

Although I have been using KVM (Linux kernel-based virtualization) on my own infrastructure for some time, it is only recently (the past 4 years) that I have become comfortable enough with it to use it in production for my clients, especially when other admins would be involved in administering the systems. A choice I often made was to use Citrix XEN, as it…

  • Has a commercially supported version should my client decide to stop using me.
  • Has a fully free and open version which for the most part is not crippled; at least it has all the core functionality that my clients need.

One thing that always frustrated me about it was the fact that I needed to use a Windows machine to access its management interface; yes, there is a community-driven Linux version, but it never worked quite well.

Anyway, one of the clients where I had used Citrix XEN is now ready for an upgrade, and I found myself in a situation where I needed to move virtual machines from XEN to KVM. It took quite a bit of digging and experimenting, but this is what I came up with.

First of all you need to gather three pieces of info:

  1. HOST-UUID
    [root@xenhost ~]# xe host-list
    uuid ( RO)                : 181d1fa8-ef74-43ed-ac51-67c80717f6f0
              name-label ( RW): uhu-xen-01
        name-description ( RW): Uhurulabs XEN Server
    
    
  2. NETWORK-UUID
    [root@xenhost ~]# xe network-list
    uuid ( RO)                : 52ce1f9d-faeb-f1e8-7054-505ac2ace647
              name-label ( RW): Pool-wide network associated with eth1
        name-description ( RW): 
                  bridge ( RO): xenbr1
    
    
    uuid ( RO)                : 2014eeb6-6f09-d3bf-88e4-1a0978ee8df7
              name-label ( RW): Host internal management network
        name-description ( RW): Network on which guests will be assigned a private link-local IP address which can be used to talk XenAPI
                  bridge ( RO): xenapi
    
    
    uuid ( RO)                : eb57b4d8-7656-c607-b1cc-93cfd8766afe
              name-label ( RW): Pool-wide network associated with eth0
        name-description ( RW): 
                  bridge ( RO): xenbr0
  3. VDI-UUID
    [root@xenhost ~]# xe vdi-list
    uuid ( RO)                : fd6a4413-16ed-45ac-b206-c1587105ffb9
              name-label ( RW): windows_srv_2008_std.iso
        name-description ( RW): 
                 sr-uuid ( RO): 05f8b9fd-438c-42ed-9da0-56c24c9ad13e
            virtual-size ( RO): 3166896128
                sharable ( RO): false
               read-only ( RO): true
    
    
    uuid ( RO)                : 5c9595df-f6d9-4a17-8de9-e53a611308f0
              name-label ( RW): CentOS-6.0-x86_64-bin-DVD2.iso
        name-description ( RW): 
                 sr-uuid ( RO): 05f8b9fd-438c-42ed-9da0-56c24c9ad13e
            virtual-size ( RO): 1182699520
                sharable ( RO): false
               read-only ( RO): true
    
    
    uuid ( RO)                : 2713e717-8847-4d1c-abb8-4725a0ce1d88
              name-label ( RW): THIS IS THE MACHINE WE WANT TO EXPORT
        name-description ( RW): Created by template provisioner
                 sr-uuid ( RO): 4f02d7e3-67b6-3d14-9533-ebfb4fa323f8
            virtual-size ( RO): 53687091200
                sharable ( RO): false
               read-only ( RO): false

So now we know that

  1. HOST_UUID is 181d1fa8-ef74-43ed-ac51-67c80717f6f0
  2. VDI-UUID is 2713e717-8847-4d1c-abb8-4725a0ce1d88
  3. NETWORK-UUID is eb57b4d8-7656-c607-b1cc-93cfd8766afe

We now need to tell the XEN server to put the machine in “transfer” mode. NOTE that this will make the machine unavailable for the duration of the export process.

xe host-call-plugin host-uuid=181d1fa8-ef74-43ed-ac51-67c80717f6f0 \
fn=expose args:vdi_uuid=2713e717-8847-4d1c-abb8-4725a0ce1d88 \
args:network_uuid=eb57b4d8-7656-c607-b1cc-93cfd8766afe \
args:transfer_mode=http plugin=transfer

As we are going to do the transfer over HTTP we need to find out the correct URL to access

xe host-call-plugin host-uuid=181d1fa8-ef74-43ed-ac51-67c80717f6f0 \
plugin=transfer fn=get_record args:record_handle=af2eed01-b162-b37a-982c-4536dd2b5bc3

<?xml version="1.0"?>
<transfer_record username="47b5ebf70160c752" 
status="exposed" url_path="/vdi_uuid_2713e717-8847-4d1c-abb8-4725a0ce1d88" 
record_handle="af2eed01-b162-b37a-982c-4536dd2b5bc3" 
device_2713e717-8847-4d1c-abb8-4725a0ce1d88="xvdb" 
url_full_2713e717-8847-4d1c-abb8-4725a0ce1d88="http://47b5ebf70160c752:f2b25a5854c12b76@192.168.13.88:80/vdi_uuid_2713e717-8847-4d1c-abb8-4725a0ce1d88" 
ip="192.168.13.88" transfer_mode="http" url_path_2713e717-8847-4d1c-abb8-4725a0ce1d88="/vdi_uuid_2713e717-8847-4d1c-abb8-4725a0ce1d88" 
all_devices="xvdb" 
url_full="http://47b5ebf70160c752:f2b25a5854c12b76@192.168.13.88:80/vdi_uuid_2713e717-8847-4d1c-abb8-4725a0ce1d88" 
device="xvdb" use_ssl="false" password="f2b25a5854c12b76" 
port="80" vdi_uuid="2713e717-8847-4d1c-abb8-4725a0ce1d88">
</transfer_record>

Now that we have the URL (http://47b5ebf70160c752:f2b25a5854c12b76@192.168.13.88:80/vdi_uuid_2713e717-8847-4d1c-abb8-4725a0ce1d88), we can use curl to fetch the disk image:

curl http://47b5ebf70160c752:f2b25a5854c12b76@192.168.13.88:80/vdi_uuid_2713e717-8847-4d1c-abb8-4725a0ce1d88 -o machine_raw_disk.raw

You can now import machine_raw_disk.raw as a new image into KVM!
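
From here, a minimal sketch of the KVM side (assuming qemu-img and virt-install are available on the KVM host; the VM name, memory, bridge and paths below are placeholders to adjust):

# Convert the raw XEN disk to qcow2 (optional, but saves space and enables snapshots)
qemu-img convert -f raw -O qcow2 machine_raw_disk.raw /var/lib/libvirt/images/machine_disk.qcow2

# Define and boot a new KVM guest around the imported disk
# bus=virtio assumes the guest has virtio drivers; use bus=ide if it does not
virt-install \
--name imported-xen-vm \
--ram 4096 --vcpus 2 \
--disk path=/var/lib/libvirt/images/machine_disk.qcow2,format=qcow2,bus=virtio \
--import \
--network bridge=br0

Once the transfer is done you will also want to stop exposing the VDI over HTTP; if memory serves, the same transfer plugin has an unexpose function that takes the record_handle used above.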


How safe do you feel?

  • 16th May 2017 (updated 2nd January 2019)
  • by Frederick Mbuya

I have been noticing with more and more concern the increasing number of what look to me like VERY young police officers on the streets carrying what look to me like AK-47s (I don’t know much about guns!!!). When I have seen what feel to me like “gangs” of them on street corners, I have honestly asked myself whether I feel more or less comfortable. The fact is they make me nervous as hell, and part of me has been comforted by the assumption that most of the guns are probably non-functional or not loaded. I came across a video yesterday that confirmed that I should be very afraid, AND that crushed my hope that the guns are not loaded or not working.. indeed they are loaded, they do work, and unfortunately it looks like the police are somewhat lacking in their training.

 


Trying to merge lots of big GeoTIFFs

  • 4th April 2017 (updated 2nd January 2019)
  • by Frederick Mbuya

The idea was to merge 28 irregularly shaped GeoTIFFs, ranging in size from 1 GB up to 13 GB and totalling 113 GB, into a single GeoTIFF. Why? Because I then needed to cut the resulting GeoTIFF into multiple irregularly shaped individual GeoTIFFs. This took a couple of days, and I was quite keen to come home today as this morning it was at 80%; you can only imagine my disappointment when I checked and found…

0...10...20...30...40...50...60...70...80...90..Traceback (most recent call last):
  File "/usr/bin/gdal_merge.py", line 540, in <module>
    sys.exit(main())
  File "/usr/bin/gdal_merge.py", line 526, in main
    fi.copy_into( t_fh, band, band, nodata )
  File "/usr/bin/gdal_merge.py", line 270, in copy_into
    nodata_arg )
File "/usr/bin/gdal_merge.py", line 63, in raster_copy
    nodata )
  File "/usr/bin/gdal_merge.py", line 105, in raster_copy_with_nodata
    nodata_test = Numeric.equal(data_src,nodata)
MemoryError

This, as you can imagine, was very frustrating, and I just assumed that the output was junk, but figured what the hell, it’s created a 301G file, let’s see what it is. I started loading it into QGIS, and since I figured it would take a while, started this blog post. It seems though that the output might be useful, as QGIS eventually loaded the file and it actually looks like what I expected. So first things first, I have set QGIS to save the file under a new name… It’s going to take a while; it is currently at 12.2G and I actually expect it to end up bigger than the initial 301G.

… a day or so later …

So it turns out the merge worked. I will revisit the error later, but the file saved by QGIS had the exact same size and gives the same result from gdalinfo. AND I have just done my first clip, BUT it seems I made a mistake. The command I used was

gdalwarp -dstnodata 0 \
-q -cutline shape_file_to_clip_to.shp -tr 0.05806 0.05806 \
-of GTiff input_file.tif \
clipped_file.tif

Which resulted in a file with the same size and dimensions as the input; what I should have done, I think, is add -crop_to_cutline so the output is cropped to the extent of the cutline:

gdalwarp -dstnodata 0 -crop_to_cutline \
-q -cutline shape_file_to_clip_to.shp -tr 0.05806 0.05806 \
-of GTiff input_file.tif \
clipped_file.tif

Which is what I am running now, the previous command took about 3 hours!
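
As an aside, a possible way to sidestep the gdal_merge.py MemoryError on a future run would be to build a virtual mosaic first and only materialize it once; a sketch, assuming a reasonably recent GDAL (file names are placeholders):

# Build a virtual mosaic of all the zone GeoTIFFs; this only writes a small XML index file
gdalbuildvrt -srcnodata 0 -vrtnodata 0 merged.vrt zone_*.tif

# Materialize it as a single (Big)TIFF, or skip this step and point gdalwarp -cutline straight at merged.vrt
gdal_translate -of GTiff -co BIGTIFF=YES -co TILED=YES merged.vrt merged.tif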


GeoTIFF Transparency

  • 31st March 2017 (updated 2nd January 2019)
  • by Frederick Mbuya

I have spent way too much time messing with this, and certainly too many CPU cycles processing only to get the wrong output. The problem…

“Two irregular shaped geotiffs that you need to merge into one and not have any stupid gaps”

My first attempts resulted in things like…

Not the desired output, but the fix was easy really: first, fix the nodata value with

gdal_translate -a_nodata 0 \
-of GTiff \
file1.tif \
file1_nodata0.tif

gdal_translate -a_nodata 0 \
-of GTiff \
file2.tif \
file2_nodata0.tif

Then merge the two

gdal_merge.py -n 0 \
-a_nodata 0 \
file1_nodata0.tif file2_nodata0.tif \
-o merged_file

That results in a nicely merged GeoTIFF
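
As a quick sanity check (assuming the merged output is called merged_file as above), you can confirm the nodata value actually made it into the result:

gdalinfo merged_file | grep -i nodata

Each band should report NoData Value=0.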


Working with GeoTIFFs from Pix4D

  • 28th March 2017 (updated 2nd January 2019)
  • by Frederick Mbuya

We have been doing a lot of work with drones over the past couple of years. Much of it has been proof of concept for the use of small, under-1 kg drones to capture aerial imagery as an alternative to both manned aircraft and satellites, the premise being that deploying small drones is much easier and cheaper than the alternatives and that the acquired data would be of equal if not superior quality. For the most part we have used senseFly eBee drones and done our post-processing work in Pix4D Mapper, both being professional-grade survey hardware and software. There are many possible outputs, but one of the main ones is an orthophoto; according to Wikipedia:

An orthophoto, orthophotograph or orthoimage is an aerial photograph or image geometrically corrected (“orthorectified”) such that the scale is uniform: the photo has the same lack of distortion as a map. Unlike an uncorrected aerial photograph, an orthophotograph can be used to measure true distances, because it is an accurate representation of the Earth’s surface, having been adjusted for topographic relief,[1] lens distortion, and camera tilt.

The format of the orthophoto output by Pix4D is GeoTIFF, and using gdalinfo we can extract the following info:

Driver: GTiff/GeoTIFF
Files: oysterbay.tif
Size is 34565, 37051
Coordinate System is:
PROJCS["WGS 84 / UTM zone 37S",
    GEOGCS["WGS 84",
        DATUM["WGS_1984",
            SPHEROID["WGS 84",6378137,298.257223563,
                AUTHORITY["EPSG","7030"]],
            AUTHORITY["EPSG","6326"]],
        PRIMEM["Greenwich",0,
            AUTHORITY["EPSG","8901"]],
        UNIT["degree",0.0174532925199433,
            AUTHORITY["EPSG","9122"]],
        AUTHORITY["EPSG","4326"]],
    PROJECTION["Transverse_Mercator"],
    PARAMETER["latitude_of_origin",0],
    PARAMETER["central_meridian",39],
    PARAMETER["scale_factor",0.9996],
    PARAMETER["false_easting",500000],
    PARAMETER["false_northing",10000000],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]],
    AXIS["Easting",EAST],
    AXIS["Northing",NORTH],
    AUTHORITY["EPSG","32737"]]
Origin = (529846.625330000068061,9252232.345700001344085)
Pixel Size = (0.049060000000000,-0.049060000000000)
Metadata:
  AREA_OR_POINT=Area
  TIFFTAG_SOFTWARE=pix4dmapper
Image Structure Metadata:
  COMPRESSION=LZW
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  529846.625, 9252232.346) ( 39d16'12.33"E,  6d45'53.64"S)
Lower Left  (  529846.625, 9250414.624) ( 39d16'12.36"E,  6d46'52.83"S)
Upper Right (  531542.384, 9252232.346) ( 39d17' 7.57"E,  6d45'53.60"S)
Lower Right (  531542.384, 9250414.624) ( 39d17' 7.61"E,  6d46'52.80"S)
Center      (  530694.505, 9251323.485) ( 39d16'39.97"E,  6d46'23.22"S)
Band 1 Block=34565x1 Type=Byte, ColorInterp=Red
  NoData Value=-10000
Band 2 Block=34565x1 Type=Byte, ColorInterp=Green
  NoData Value=-10000
Band 3 Block=34565x1 Type=Byte, ColorInterp=Blue
  NoData Value=-10000
Band 4 Block=34565x1 Type=Byte, ColorInterp=Alpha
  NoData Value=-10000

This is for a sample 2-gigabyte GeoTIFF. Now this might not sound too big, but it covers an area of only about 1.5 sq km. As you can imagine, when you’re doing projects that span hundreds if not thousands of sq km you will quickly need huge amounts of storage space. And that is not the only problem: as far as I know, the most popular open-source platform for hosting such data is currently a combination of GeoNode and GeoServer, but after some experimentation I found that serving these images as-is was simply not an option; it was taking the GeoNode too long to render them, resulting in an awful user experience.

I looked into many options, and one of the best seemed to be MBTiles; however, in the end it was not suitable (a blog for another day…).

After much research online, it seemed like the best option was to first compress the GeoTIFF with GDAL using gdal_translate. So far the best options I have been able to find are:

“-b 1 -b 2 -b 3” specifies that we only want the first three bands; if you refer to the gdalinfo output above you will see that, for some reason, Pix4D includes an alpha band. We don’t need it (I think!), and it is critical to have only the three bands or the PHOTOMETRIC flag won’t work. The next option is

“-a_nodata 0”, which sets the nodata (transparency) value to 0, allowing the generated file to have transparency.

“COMPRESS=JPEG” is quite self-explanatory: this is the instruction to GDAL to use JPEG compression, which for imagery like this is far smaller than the LZW compression the original file uses, at the cost of being (slightly) lossy.

“PHOTOMETRIC=YCBCR” changes the color space used from RGB to YCbCr. From what I have been able to find out (thanks Google), the eye is more sensitive to changes in luminance (Y, brightness) than to changes in chroma (Cb, Cr, color), so it is possible to discard some chroma information while retaining image quality, allowing for better compression without any visible loss.

“TILED=YES” again is quite self-explanatory: it stores the image data in a tiled layout, which makes for a much better user experience when viewing the data in something like GeoNode or QGIS.

time gdal_translate \
-b 1 -b 2 -b 3 \
-a_nodata 0 \
-co COMPRESS=JPEG \
-co PHOTOMETRIC=YCBCR \
-co TILED=YES \
sample.tif \
sample_JPEG_YCBCR.tif

After this we have two files:

151M Mar 28 17:54 oysterbay_JPEG_YCBCR.tif
2.0G Mar  2 07:49 oysterbay.tif

As you can see, this has reduced our file from 2.0G to 151M, roughly a 13-fold reduction in size. It took my workstation 39 seconds to do the compression; you can see details of my workstation here.

The next step is to add overviews (zoom scales) to the image; this will result in a slight increase in file size but will make for a much better user experience when zooming in and out of the image.

time gdaladdo \
--config COMPRESS_OVERVIEW JPEG \
--config PHOTOMETRIC_OVERVIEW YCBCR \
--config INTERLEAVE_OVERVIEW PIXEL \
-r average oysterbay_JPEG_YCBCR.tif 2 4 8 16

This on my workstation took 28 seconds and added 50M to the file resulting in

220M Mar 28 18:29 oysterbay_JPEG_YCBCR.tif

And if we again check the file info using gdalinfo we get

Driver: GTiff/GeoTIFF
Files: oysterbay_JPEG_YCBCR.tif
Size is 34565, 37051
Coordinate System is:
PROJCS["WGS 84 / UTM zone 37S",
    GEOGCS["WGS 84",
        DATUM["WGS_1984",
            SPHEROID["WGS 84",6378137,298.257223563,
                AUTHORITY["EPSG","7030"]],
            AUTHORITY["EPSG","6326"]],
        PRIMEM["Greenwich",0,
            AUTHORITY["EPSG","8901"]],
        UNIT["degree",0.0174532925199433,
            AUTHORITY["EPSG","9122"]],
        AUTHORITY["EPSG","4326"]],
    PROJECTION["Transverse_Mercator"],
    PARAMETER["latitude_of_origin",0],
    PARAMETER["central_meridian",39],
    PARAMETER["scale_factor",0.9996],
    PARAMETER["false_easting",500000],
    PARAMETER["false_northing",10000000],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]],
    AXIS["Easting",EAST],
    AXIS["Northing",NORTH],
    AUTHORITY["EPSG","32737"]]
Origin = (529846.625330000068061,9252232.345700001344085)
Pixel Size = (0.049060000000000,-0.049060000000000)
Metadata:
  AREA_OR_POINT=Area
  TIFFTAG_SOFTWARE=pix4dmapper
Image Structure Metadata:
  COMPRESSION=YCbCr JPEG
  INTERLEAVE=PIXEL
  SOURCE_COLOR_SPACE=YCbCr
Corner Coordinates:
Upper Left  (  529846.625, 9252232.346) ( 39d16'12.33"E,  6d45'53.64"S)
Lower Left  (  529846.625, 9250414.624) ( 39d16'12.36"E,  6d46'52.83"S)
Upper Right (  531542.384, 9252232.346) ( 39d17' 7.57"E,  6d45'53.60"S)
Lower Right (  531542.384, 9250414.624) ( 39d17' 7.61"E,  6d46'52.80"S)
Center      (  530694.505, 9251323.485) ( 39d16'39.97"E,  6d46'23.22"S)
Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  NoData Value=0
  Overviews: 17283x18526, 8642x9263, 4321x4632, 2161x2316
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  NoData Value=0
  Overviews: 17283x18526, 8642x9263, 4321x4632, 2161x2316
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  NoData Value=0
  Overviews: 17283x18526, 8642x9263, 4321x4632, 2161x2316

Note the GeoTIFF is now compressed using YCbCr JPEG, and uses the YCbCr color space.
Below, on the left is a section of the original orthophoto and on the right the compressed version; can you see a difference?

Here is a section zoomed in even closer; 5 points to anyone who knows what you’re looking at!

