Development Seed


A Home for Open Housing Data

Mon, 11/17/2014 - 12:00

Categories:

Drupal

The housing crisis deeply impacted millions of Americans, and its effects are still being felt today. For the many problems facing communities, from wage disparity to affordable housing, there is no one-size-fits-all solution.

That's why we are very excited about the work that Woodstock Foundation is doing to support fair housing policies. Today they launch a nuanced look at housing and income disparity in Illinois in the form of a new map-based open data website.

The site brings together 74 datasets on the well-being of local communities. It is a good roadmap for anyone working in housing justice in Illinois. Community organizations can explore the average amount of mortgage debt people take on and the rate of foreclosure filings in the Chicago six-county region and elsewhere in Illinois to inform their decisions on where to focus their work. Some data sets go back to 2008.

Where the highest income census tracts are in the Chicago six-county region

Serving complex data through static JSON

Housing data is complex, and Woodstock has gathered some amazingly granular statistics about housing data in Illinois. Splitting this data into a format that we could serve over the web proved a difficult challenge. Woodstock is also a small nonprofit, and we wanted to ease as much as possible the burden of maintaining a complicated website.

So we wrote a Python library to break up their spreadsheets into JSON. Every time you switch to a new facet of housing data, or view a different year or category of that data, your browser incrementally downloads a new JSON file. Although those files in aggregate would take ages to load, individually they are manageable. Those scripts, along with the rest of the site, are open source.
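The splitting step can be sketched in a few lines of Python. This is a minimal stand-in for the real library: the column names ("facet", "year", "tract", "value") and the file naming scheme are hypothetical, but the idea is the same — group rows by facet and year, and write one small JSON file per group.

```python
import csv
import json
import os
from collections import defaultdict

def split_to_json(csv_path, out_dir):
    """Split one wide CSV into small per-facet, per-year JSON files.

    Column names here are hypothetical; a real spreadsheet would use
    its own headers. Each output file maps tract ID -> value, so the
    browser only fetches the slice of data it needs.
    """
    groups = defaultdict(dict)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            groups[(row["facet"], row["year"])][row["tract"]] = row["value"]

    os.makedirs(out_dir, exist_ok=True)
    for (facet, year), values in groups.items():
        path = os.path.join(out_dir, f"{facet}-{year}.json")
        with open(path, "w") as f:
            json.dump(values, f)
```

Each resulting file is a few kilobytes, so switching facets in the browser costs one small request instead of reloading the whole dataset.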

To further reduce load times, we use the TopoJSON spec to reduce the size of geographic boundary data. This allows us to separate geographic data from numerical data, so you only download those complex census tract boundaries once. The code that runs in your browser then reconnects those boundaries to the housing or mortgage data you select on the fly.

Using vector-based geographical boundaries has other benefits. It allowed us to use a mouse click on an overlaying geographic boundary and a point-in-polygon test to find, for example, which congressional representative is responsible for which census tract.
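The point-in-polygon test behind that click is the classic ray-casting algorithm. Here is a self-contained sketch (the site does this in the browser; this is the same logic in Python):

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test.

    `polygon` is a list of (x, y) vertices. A horizontal ray is cast
    from the point; each polygon edge it crosses flips the inside flag,
    so an odd number of crossings means the point is inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Only edges that straddle the point's y-coordinate can cross the ray
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Run the clicked map coordinate against each district boundary and the first polygon that returns True is the district (and its representative) for that point.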

Detecting congressional boundaries

Census tracts over multiple years

Census tracts change a lot between census years, and this can be a problem when mapping a multi-year data set that covers more than a single census geography file. Comparing data between years that use different tract definitions can be tricky: tract IDs, or FIPS codes, can either refer to a different neighborhood or disappear entirely.

The Census releases relationship files that show where these changes and additions occur. Using this, we created a tool that overlays 2000 and 2010 census tracts, and shows differences between the two years.
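Indexing a relationship file makes those changes easy to query. The sketch below assumes a CSV with GEOID00 and GEOID10 columns (the real relationship files use similar names, but check the layout for your vintage): a 2000 tract mapped to several 2010 tracts was split, and a 2010 tract appearing under several 2000 tracts was merged.

```python
import csv
from collections import defaultdict

def tract_changes(relationship_csv):
    """Index a Census tract relationship file by 2000 tract ID.

    Returns {2000 GEOID: sorted list of 2010 GEOIDs}. More than one
    2010 ID under a 2000 tract means that tract was split; an unchanged
    tract maps to itself.
    """
    mapping = defaultdict(set)
    with open(relationship_csv, newline="") as f:
        for row in csv.DictReader(f):
            mapping[row["GEOID00"]].add(row["GEOID10"])
    return {old: sorted(new) for old, new in mapping.items()}
```

With the index in hand, any multi-year comparison can flag tracts whose boundaries changed instead of silently comparing mismatched geographies.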

Currently the project covers only Illinois, but the code is available on GitHub and ready for you to fork and contribute your own state.

Join us at EcoHack this weekend

Thu, 11/13/2014 - 14:00


Toxic algal bloom in Lake Erie

To address climate change and promote environmental justice, we need better tools to understand our changing planet. That's why we are delighted to help host the DC EcoHack with WRI at the Mapbox Garage.

EcoHack is an event that brings together a diverse community of scientists, hackers, designers, and others who want to tell stories and create tools to protect our environment. The event is open to people of all skill levels. As long as you're interested in using technology to improve and better understand our natural environment, we'd love to see you there.

If you're in DC, register and come join us at the Mapbox Garage. Our friends at Mapbox are also hosting the San Francisco EcoHack, and EcoHacks will also take place in Sydney, Cambridge, New York, and Madrid.

Drew and Marc will be using some of the time to work on landsat-util, an open source tool that makes it easier to work with open satellite imagery. We are also keen to help on projects using green energy investment data and tracking natural disasters.

Hope to see you there!

International Conference of Crisis Mappers storms NYC

Thu, 11/06/2014 - 14:00


Satellite imagery over Monrovia, an area that has seen a significant burden of Ebola cases.

Today, Marc and I are heading up to New York to attend the International Conference of Crisis Mappers. Open crisis mapping is growing up. We are seeing greater demand for maps and data for crisis response and preparation. ICCM 2014 will be an important place to discuss how we can grow to handle this need, how we better generate real collaboration from data, and how we build infrastructure that is usable and inclusive.

We're looking forward to talking about OpenStreetMap, satellites, and open data; topics that are critical in the midst of the response to the Ebola outbreak in West Africa. Reach out to @nas_smith or @kamicut on Twitter if you want to chat.

Getting to vote

Tue, 11/04/2014 - 14:00


US voters: find your polling location

In the US, and around the world, it can be confusing figuring out where you should vote and which races you are eligible to vote on. In the US, going to the wrong polling station is a hassle. This hassle can be prohibitive, particularly if you are disabled or rely on public transportation. In countries coming out of conflict, going to the polling station can be a brave and risky act. You'd better be at the right place when you get there.

Open data is helping to get voters the information they need to participate. By opening up data about voting locations and processes, states can involve private actors, such as The Pew Trusts and Google, as partners in providing accurate information about where to vote. The VIP (Voter Information Project) is an embeddable tool, based on an open source stack, that relies on an open API. This allows other groups to repackage and distribute this information to their audiences.

By opening up the data, states are no longer solely responsible for getting voters to the right polling place. Voters should be tripping over this information in every Google search, Foursquare check-in, community message board, and favorite blog.

"Usability and Voting" by ericgundersen

Hey there, designers.

Sun, 11/02/2014 - 10:00


At Development Seed, design is not about pushing pixels or passing a perfect mockup to the next person. It is about truly understanding — and sometimes defining — a problem, then working out a systemic solution with visual and interactive components. You will brainstorm solutions with our strategists and turn them into sketches, websites, and data visualizations with fellow developers.

We build new ways to help people make decisions — impacting policies and creating transparency on all levels. We are hiring a designer who is a doer and a thinker, eager to join our mission.

You are:
  • an artist; you have a favorite medium to express your ideas, be it ink, paint, vector or gif
  • excited about the web, particularly how the representation of information on screens can inform people’s decisions
  • eager to work with data and the patterns it leads to
  • curious and hungry to learn new subjects and skills
You know how to:
  • ask good questions and get to the heart of a problem
  • illustrate abstract concepts and workflows in visual forms
  • use the right font and color at the right time, knowing that aesthetics is derived from your communication goal, the information, and the medium
Experiences with any of the following will be a plus:
  • Web maps (such as a map made with MapBox Studio)
  • Responsive web frameworks
  • Git
  • D3.js for visualization (bar chart counts)
  • Static site generation (Jekyll, Flask, etc)
  • SASS

But don't let any of that scare you. If your design chops are good, we will work with you to tech up on everything you need to know.

To apply:

Please send your portfolio site, and links to three projects, to jobs@developmentseed.org with “Designer” in your subject.

Dauria Geo completing new design specs

Thu, 10/30/2014 - 12:00


Dauria Geo is just completing new design specs for their Perseus satellite constellations. Perseus-O, a constellation of 8 satellites, will provide daily global coverage of all arable land at 22-meter resolution (meaning each pixel represents 22 meters on the ground). With the same spectral bands as Landsat, this new imagery will be able to measure crop health and flooding. The Perseus-HD constellation of 20 satellites will provide daily images of all urban and arable land at 2.5-meter resolution -- showing roads, buildings, ships, and fields.

In addition to having their own satellites, Dauria Geo will make Landsat and MODIS open data sources accessible through their API. They have established partnerships with industry leaders like Deimos in Spain, EIAST in Dubai, and Eye Innovation in China to provide a variety of resolution, coverage, sensor, and freshness of imagery, offering a unique balance between resolution and timely revisit. This is super exciting for our team as we work to expand where we source imagery for NGOs to process in their pipelines.

The new technical specs mean Dauria Geo moves into the next build phase, and is on target to begin launching the Perseus constellation in 2015. We're collaborating with Dauria Geo now as they build out integration and visualization tools -- from antenna to API -- that make image acquisition, analysis, delivery, and integration easier for both NGOs and enterprise. Dauria Geo is building an API to empower developers to access fresh and historical imagery, compute needed data layers in the cloud, and harvest data in a ready-to-use format. Their cloud platform can do heavy analysis and feed data directly into applications. By directly integrating with the Mapbox API, we can quickly deploy sophisticated and beautiful applications, from agriculture to disaster response, using the platforms that developers are already building on.

We'd love to see more satellite providers compete on ease of integrating their data. We'll be helping Dauria Geo review their API to make it developer friendly, and we will build open source tools on top of Dauria's API. These tools will serve as open templates for integrating Dauria with tools like Mapbox to quickly build powerful, data-rich sites. This is a really positive move for the industry and for users.

Marhaba Marc Farra

Tue, 10/28/2014 - 15:00


Marc Farra has joined the Development Seed team. Marc loves to experiment with image processing, Arduino sensors, and data infrastructure. He is going to help us explore new ways to collect and process data.

I first ran into Marc in Beirut. At the time he was running Lamba Labs, a hacker space in Beirut that was sowing the seeds of Maker culture and open data advocacy in Lebanon. A year later, he took the Afghanistan polling station locations we posted on GitHub and started to build a mobile app for Afghans to locate their nearest polling station.

Say "Hello", "Salut", or "Marhaba" to Marc on Twitter and GitHub.

Getting Green into Green Energy

Tue, 10/28/2014 - 13:00


Reversing climate change means investing in green energy and, as the sustainable sector grows, ensuring that it grows in both developing and developed countries. Today the Fondo Multilateral de Inversiones and Bloomberg New Energy Finance are launching a vastly expanded Climatescope to provide open data about green energy investment in 55 countries. The data provided by Climatescope creates an information-rich environment for green energy investors. It also provides valuable data on clean energy policies for activists and policymakers.

Opening Climate Investment Data

We worked on the Climatescope website with Flipside, a smart, new open source technology shop based in Lisbon.

The site takes a very thoughtful approach to opening information. All the data powering the Climatescope site is available through an open API, which you can easily integrate into your own applications. The full dataset is also available for analysis. On almost every page lives a download button that provides a CSV file containing whatever you happen to be viewing.

Most importantly, FOMIN got the licensing right. The data is licensed CC-BY: it can be used (with attribution) by anyone, even for commercial purposes. This is critical when you want data to encourage commercial activity. Moreover, the website itself is also open and is licensed GPL 3.0. The entire site can be forked by other open source projects.

Dynamic Static Websites

Like many of the sites we (and Flipside) build these days, Climatescope is a fully interactive site without a database or a heavy CMS. Climatescope users can manipulate, interrogate, and download the data on any device, even in low-bandwidth environments. The site uses Jekyll, Angular, and D3 (among other tools) and is hosted on GitHub. Read more on our approach to CMS-free websites.

Customized weighting

People have different priorities when evaluating the environment for clean energy. The site is designed for a range of users, from activists to journalists, politicians, environmentalists, and the curious. FOMIN is committed to giving Climatescope users full control over how much weight each metric carries. To accommodate this, we built simple, intuitive sliders. Movement in one slider spreads the difference evenly across the other three factors. You can lock any slider to make it easier to hit an exact breakdown.
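The rebalancing interaction can be sketched in a few lines. This is a plain-Python sketch of the behavior described above, not the production code (which runs in the browser): moving one slider spreads the difference evenly across the remaining unlocked sliders so the weights keep summing to 1.

```python
def rebalance(weights, index, new_value, locked=()):
    """Set weights[index] to new_value, spreading the difference evenly
    across the remaining unlocked sliders so the total stays constant.

    `locked` holds indices of sliders the user has pinned in place.
    """
    free = [i for i in range(len(weights)) if i != index and i not in locked]
    if not free:
        raise ValueError("no unlocked sliders to absorb the change")
    delta = weights[index] - new_value  # amount the other sliders must absorb
    out = list(weights)
    out[index] = new_value
    share = delta / len(free)
    for i in free:
        out[i] += share
    return out
```

Locking a slider simply removes it from the redistribution, which is what makes it easy to dial in an exact breakdown on the remaining factors.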

Hacking for the Planet

Have some data or coding skills? Care about the planet? Consider joining an EcoHack near you on Nov 15-16. We are hosting the DC EcoHack with WRI. EcoHacks are also happening in Sydney, Cambridge, New York, Madrid, and San Francisco.

Howdy Dan McCarey

Mon, 10/27/2014 - 17:00


Dan McCarey has joined Development Seed. Dan is going to help us turn complex data into compelling stories. He is an information designer and web developer who builds powerful websites. Dan is passionate about leaving the world better than he found it. That passion has drawn him to live and work in Nepal and Sudan.

Dan created the interactive "Mapping Cholera: A Tale of Two Cities", which recently appeared in Scientific American. This stunning interactive uses historical data and maps to track the spread of cholera in New York in 1832 and compares that to the spread of cholera in Haiti today.

Follow Dan on Twitter and GitHub.

Giving context to open spending data

Thu, 10/02/2014 - 15:00


Yesterday the government of Mexico launched datos.gob.mx to open up government data across all ministries. We built a mapping tool for ministries to quickly build rich maps from data on the site. The tool also makes it easier to combine government data with other open datasets. This provides context and meaning to complex government data.

The first dataset that we mapped was all 2013 funds for disaster response and reconstruction. The map plots thousands of reconstruction projects across 45 natural disasters, including Hurricanes Manuel and Ingrid, which together affected two-thirds of Mexico, killing 192 people and causing 75 billion pesos in damage.

This is an incredibly rich and complex dataset. But the data alone is not particularly helpful. We need context to understand why the government invested funds the way it did, to evaluate the effectiveness of these investments, and to plan for future events.

To provide that context we pulled open satellite data from the days following Hurricane Ingrid to better understand the extent of the flooding it caused. We used landsat-util to download the data, and produced a false color composite to highlight water.

The resulting map shows reconstruction projects in the context of the flooding.

View the government of Mexico's map building tool on GitHub. We used landsat-util to get the imagery and processed it with this script.

Thanks for Having Us, ONA

Thu, 10/02/2014 - 11:00


This past weekend Jue and I presented at the Online News Association Conference in Chicago. We shared what we've learned from creating mapping sites that toggle through a high volume of data: specifically, how we mapped tens of thousands of rows of data on a series of maps using a mostly-front-end stack.

We built this data browser with Backbone and Leaflet. Given the amount of data, we incorporated data binding and drawing methods from D3.js on top of Leaflet to boost the loading and interaction speed of the site. By writing Python code to output configuration and data JSON files, we made the site modular. We converted all the geography files into efficient TopoJSON -- some with necessary simplifications to control file size, and others with customized data fields to be joined with the rest of the datasets in the front end. From this experience we learned:

  • The good: we made a site that is hosted on GitHub. Even though the site handles tons of data, it will rarely go down. TopoJSON makes geography in the browser faster than ever.
  • The crazy: we spent a lot of brain cells preparing the data and its metadata in Python, in order to offload heavy calculations from the JavaScript.

As with other problems we encounter at Development Seed, we tried to tailor our approach using the most fitting technology. We are constantly assessing our tech stacks, which prepares us for challenges that come in all shapes and sizes. Although we managed the complexity of this project well, we are also excited about geographic databases such as PostGIS, which we may plug in the next time we come across a project of this size.

I also did a hands-on session about designing maps using D3 and TopoJSON. We provided several examples to help those starting to learn this workflow. You can find the talk here. We've found this workflow valuable, since it allows us to serve complex sites as static files. This is consistent with our approach to building interactive CMS-free sites.

We hope you find our presentations useful. Let us know if you have questions or comments.

Major Open Data Push by the Mexican Government

Wed, 10/01/2014 - 15:00


The Mexican Government is investing heavily in open data to make government more effective and the country more productive. Today, kicking off ConDatos, the regional open data gathering in Latin America, the Government of Mexico presented datos.gob.mx, a massive data portal with open public data from across the government.

Data must be accessible to be useful in driving innovation and participation. Datos.gob.mx addresses accessibility in two ways. First, all data is machine-readable and searchable, and so is the metadata about those datasets. A CKAN data portal provides data in bulk download and via an API. Second, datos.gob.mx puts a heavy emphasis on stories and tools that turn raw data into insight. Storytelling tools make the data immediately accessible and understandable to both citizens and policymakers.

Mapping open data

We worked with the Office of the President of Mexico to build a mapping tool that integrates directly with datos.gob.mx to provide rich storytelling ability. The President's Office worked with the Civil Protection Service to map all 2013 funds for disaster response and reconstruction. The map plots thousands of reconstruction projects across 45 natural disasters, including Hurricanes Manuel and Ingrid, which together affected two-thirds of Mexico, killing 192 people and causing 75 billion pesos in damage.

The mapping tool allows ministries to quickly stand up a rich interactive map from any dataset on datos.gob.mx through a single page of Markdown. The map generation tool anticipates many of the ways in which ministries will want to aggregate and display information, while also making it easy for advanced users to develop more sophisticated visualizations.

We leaned on Jekyll for the map templating ability and Mapbox for base layers. Datasets are pulled in over the CKAN API and rendered in real time. All the code for the map generation tool is open source, on GitHub, and available to other governments interested in mapping open data.
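Pulling a dataset over the CKAN API comes down to one call to the standard `package_show` action. The sketch below assumes datos.gob.mx exposes the CKAN action API at the conventional `/api/3/action` path and uses a hypothetical dataset ID; check the portal for the real endpoint and IDs.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumed base URL -- the conventional CKAN action API path.
CKAN = "https://datos.gob.mx/api/3/action"

def package_url(dataset_id):
    """Build a CKAN package_show request URL (standard CKAN action API)."""
    return f"{CKAN}/package_show?{urlencode({'id': dataset_id})}"

def resource_urls(response):
    """Pull downloadable resource URLs out of a package_show response."""
    if not response.get("success"):
        raise RuntimeError("CKAN call failed")
    return [r["url"] for r in response["result"]["resources"]]

def fetch_resources(dataset_id):
    """Network round trip: fetch the package and list its resources."""
    with urlopen(package_url(dataset_id)) as resp:  # requires connectivity
        return resource_urls(json.load(resp))
```

Because every CKAN portal speaks this same action API, the same three functions work against any government's instance by swapping the base URL.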

Development in the Time of Climate Change

Fri, 09/19/2014 - 14:00


International development is getting harder. Climate change, population strains, and conflict over resources threaten to undo many of the gains made toward the Millennium Development Goals. Doing development right means looking outside of country and sector silos to the bigger picture. Today Secretary Kerry announced the launch of the Global Resilience Partnership, a new partnership to address climate and population change through more coordinated and smarter action.

Achieving the vision of the Global Resilience Partnership will require fluid, fast, and open information that supports coordination and decision-making. We are proud to be working with the Global Resilience Partnership on building a data and technology infrastructure to support new ways of addressing global stresses and shocks. Whether it is building tools that connect food security workers with conflict mitigation experts, analyzing and opening satellite imagery after a flood, or helping municipal governments analyze complex data sets, Development Seed is excited to be part of a powerful approach to solving global challenges, forged on openness and collaboration.

The first part of this effort will be a global collaborative design challenge. Check out the site for more information on the Global Resilience Challenge. If you prefer viewing the source, you can find all the code on GitHub.

Photo credit: Melissa Hough

Flood Monitoring with Satellites

Mon, 09/15/2014 - 14:00


Last week monsoon rainfall caused flooding across India and Pakistan. Srinagar experienced severe flooding, and relief efforts are now underway. Satellite imagery offers a first-response look at the flooded area and shows where flood lines advance or recede.

We recently released landsat-util, a tool for easily processing open Landsat imagery. Using landsat-util and some additional processing we can quickly process imagery to view flood extent from above.

Swipe between the August 25 and September 10 Landsat images for a quick first glance at the geographic extent of the flood.

We used landsat-util to identify, download, and process Landsat imagery before and immediately after the floods. (The left image is Scene LC81490362014237LGN00 from August 25, 2014. The right image is Scene LC81490362014253LGN00 from September 10, 2014.)

With additional processing we can get a clearer view of the flood lines. Water reflects infrared light differently than land, and we can use this to clearly distinguish muddy water from muddy land. Working from the images we just downloaded, we created a false color composite by combining different near-infrared and mid-infrared bands (also known as a 5,6,4 band composite). The processed image clearly highlights flood lines.

A 5,6,4 false color image clearly distinguishes water from land to derive a flood line.
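The compositing step boils down to stretching each band to the displayable 0-255 range and stacking the three bands as red, green, and blue channels. This is a toy sketch of that idea using nested lists; a real Landsat pipeline would do the same math with GDAL or similar tooling on full scenes.

```python
def false_color(nir, swir, red):
    """Combine three bands into a 5,6,4-style false color composite.

    Each band is a 2-D list of raw pixel values. Every band is min-max
    stretched to 0-255, then stacked as one (R, G, B) tuple per pixel,
    so water (dark in the infrared bands) stands out against land.
    """
    def stretch(band):
        flat = [v for row in band for v in row]
        lo, hi = min(flat), max(flat)
        scale = 255.0 / (hi - lo) if hi > lo else 0.0
        return [[round((v - lo) * scale) for v in row] for row in band]

    r, g, b = stretch(nir), stretch(swir), stretch(red)
    return [
        [(r[i][j], g[i][j], b[i][j]) for j in range(len(r[0]))]
        for i in range(len(r))
    ]
```

A simple min-max stretch is the crudest possible contrast adjustment; real composites usually clip percentiles and apply gamma curves, but the band-stacking structure is the same.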

First response

Timely response is vital in disaster scenarios. When ground information is limited, satellite imagery can provide first responders with a clear picture of an area. With landsat-util we hope to make Landsat imagery accessible to more organizations. For more on image processing, see this great tutorial by the Mapbox satellite team and view the code for the false color comparison here.

Power tools for Satellite Imagery

Fri, 08/29/2014 - 16:00


Our love affair with Landsat is well documented. Today we are sharing the Landsat love with landsat-util, a command line utility that makes it easy to search, download, and process Landsat imagery. We hope these tools help NGOs, small government agencies, and researchers to benefit from open satellite data.

The Landsat Program has provided continuous imagery of the earth to the public since 1972. The newest Landsat satellite, Landsat-8, has sophisticated sensors like thermal infrared, which we use to detect fires, and near infrared, which we use to measure vegetation health. Landsat-8 has collected nearly two petabytes of open imagery data. This is an incredibly powerful data source for NGOs, researchers, municipal governments, and government agencies in developing countries. It is useful for everything from urban planning to detecting the effects of climate change.

Landsat data is still difficult and time-consuming to work with. The same NGOs and small government agencies that stand to benefit most from Landsat data often lack the specialized technical expertise to process it. Over the past few months we've built tools to automate our own work with satellite imagery. It once took us all day to collect, georeference, composite, color correct, and pan-sharpen imagery. Now we can do it in a matter of minutes. We've packaged our processing scripts into a command line utility. Landsat-util makes it easier for other developers and organizations to work with open satellite imagery.

Landsat-util does three things well:
  • It searches loads of Landsat metadata.
  • It makes downloading easier.
  • It processes the data, with natural color correction and pansharpening, and gets it ready for use in Mapbox Studio or your tool of choice.

Searching

Using our landsat-api, you can search all Landsat-8 metadata and find the images you are looking for. You can limit your search to specific date ranges, filter by cloud coverage, and look within specific rows and paths.

Landsat-util also makes it easier to find imagery for a specific area. You can point it to a local shapefile and landsat-util selects all images that cover your shapefile. If you give it a country name, it selects all images that cover that country.

Downloading

Landsat-util uses imagery from Google Storage to download results faster than USGS Earth Explorer. Google, in partnership with USGS and NASA, stores Landsat imagery on its Google Earth Engine servers and offers it to the public for free. Landsat-util automatically downloads all of the scene IDs that fit your search.
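Those scene IDs are self-describing. A pre-Collection Landsat ID packs the sensor, satellite, WRS path/row, and acquisition date (year plus day-of-year) into fixed positions, which is how a tool can turn a search result straight into a download. A small decoder:

```python
from datetime import date, timedelta

def parse_scene_id(scene_id):
    """Decode a pre-Collection Landsat scene ID (LXSPPPRRRYYYYDDDGSIVV).

    e.g. LC81490362014237LGN00 -> Landsat 8, path 149, row 36,
    acquired on day 237 of 2014 (25 August).
    """
    return {
        "sensor": scene_id[1],          # C = combined OLI/TIRS
        "satellite": int(scene_id[2]),  # 8 = Landsat-8
        "path": int(scene_id[3:6]),
        "row": int(scene_id[6:9]),
        "date": date(int(scene_id[9:13]), 1, 1)
                + timedelta(days=int(scene_id[13:16]) - 1),
    }
```

Running it on the Srinagar flood scene above confirms the August 25 acquisition date quoted in that post.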

Processing

Landsat-util can do much of the processing required to make Landsat images useful in your project. It generates natural color images that are ready to be used in mapping tools such as TileMill and Mapbox Studio. All images are adjusted for quality, color, and contrast, and have incredible detail (pansharpening doubles the pixel resolution). They are georeferenced to Web Mercator (EPSG:3857) and can easily be added as a layer to web-based maps.

The power of the command line

If you know exactly what you are looking for you can search, download, color-correct, and pansharpen all with one command.

landsat search --download --imageprocess --pansharpen --cloud 4 --start "january 1 2014" --end "january 10 2014" pr 009 045

Turks & Caicos Islands, British West Indies

You can preview images before you download. Search commands provide a link to a thumbnail for each image.

landsat search --cloud 4 --start "August 1 2013" --end "August 25 2014" country 'Vatican'

Using the --pansharpen flag will take longer to process but will produce clearer images.

landsat search --download --imageprocess --pansharpen --cloud 4 --start "august 11 2013" --end "august 13 2013" pr 191 031

Vatican/Rome, Italy - slide between images to check out pansharpening in action

You can also perform all processing on images that you previously downloaded.

landsat download LC81050682014217LGN00
landsat process --pansharpen /your/path/LC81050682014217LGN00.tar.bz

Gurig National Park, Australia

landsat download LC82310622014187LGN00
landsat process --pansharpen your/path/landsat/zip/LC82310622014187LGN00.tar.bz

Manaus, Brazil

For more possibilities, check out the documentation.

Limitations

Landsat-util uses a number of image processing tools that are very powerful but also very resource hungry. The image processing functions consume a good amount of memory and might not work on computers with less than 6GB of RAM. Landsat-util also requires several other applications and libraries, such as GDAL, ImageMagick, and Orfeo Toolbox.

Mac users can install landsat-util and all dependencies through a simple brew command. We have provided a walkthrough for Ubuntu users. For other systems we provide a list of required dependencies.

Open Source

Landsat-util helps us in our own satellite imagery work, and we believe it could help others run smarter, faster, and better analysis and research using satellite products. Let us know what you think and contribute to the repo.

Exploring space faster

Thu, 08/21/2014 - 16:00


Exploring events from space is going to get a little easier. Below is a sneak peek of a simple and smart utility we're working on to save ourselves some time searching, downloading, and processing Landsat imagery.

Turks & Caicos (path: 09, row: 045, id: LC80090452014008LGN00)

We regularly use satellite imagery to better understand global events. Landsat-8 is a satellite we love because it is open data that is regularly updated. We think this tool will help other developers and organizations to work with open satellite data. Check out below a few of our other favorite spots we've processed through the tool.

Palestine and Israel (path: 174, row: 038, id: LC81740382014188LGN00)

Yamal Peninsula, Siberia (path: 168, row: 010, id: LC81680102014178LGN00)

Amazon Delta, Brazil (path: 225, row: 060, id: LC82250602013174LGN00)

Karachi, Pakistan (path: 153, row: 43, id: LC81520432014018LGN00)

Extracting building height from Lidar

Thu, 08/07/2014 - 11:00


Lidar is similar to radar...but with lasers. Lidar produces incredibly accurate and specific spatial information, making it great for measuring building heights and modeling the impact of floods and rising sea levels. Advances in lidar technology are making lidar data more available. Humanitarian drones may make it possible to collect up-to-date lidar data in the midst of a humanitarian crisis.

Lidar data is still complicated to work with due to its size, but can be managed entirely through open source tools. We extracted building heights for a neighborhood in San Francisco to demonstrate how this can be done using open lidar data and open source tools.

Extracted building heights rendered in 3D using TileMill.

Understanding vertical growth of a city is as important as understanding its horizontal growth. Building heights are crucial data for everything from disaster response to measuring economic growth. Here we demonstrate how to create a building-height footprint using open data and open source tools.

Open Source Tools and Open Data

We used libLAS, PostGIS, and QGIS to extract building heights in the Richmond District of San Francisco.

You can download open lidar data from USGS Earth Explorer. In this example we combine that with building footprints from OpenStreetMap.

Processing Lidar

We used libLAS, an open source lidar library, to process a lidar point cloud downloaded from USGS and to convert the data into a .txt file for import into a PostGIS database. PostGIS is great for handling processing-intensive data like lidar.

We then pulled building footprints of San Francisco from OpenStreetMap and loaded this data into our PostGIS database. By joining the lidar data with OSM building footprints we can determine accurate elevations of the top of each building.

Lidar data is extremely dense. There are thousands of data points in the space representing a city road between buildings and thousands more in each building footprint.

But we want building height. As it is, our data won't distinguish between a short building on a hill and a tall building in a valley. So we construct a two-metre buffer around every building footprint and determine the lowest elevation within that buffer. By subtracting that lowest elevation from the elevation of the building top, we can determine the building height and append it to each building footprint. This is how we convert raw data into usable information.
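The buffer trick reduces to one subtraction per building. Here is the core of it as a plain-Python sketch (the real workflow does this as a spatial join in PostGIS; the elevation lists here stand in for lidar z-values already assigned to a footprint and its buffer):

```python
def building_height(roof_points, buffer_points):
    """Estimate building height from lidar elevations.

    `roof_points` are lidar z-values (metres) falling inside a building
    footprint; `buffer_points` are z-values within the 2 m buffer around
    it. Height = top of roof minus lowest nearby ground point, so a
    short building on a hill is not mistaken for a tall one.
    """
    roof_top = max(roof_points)   # highest return on the roof
    ground = min(buffer_points)   # lowest ground point beside the building
    return roof_top - ground
```

Using the minimum of the buffer (rather than a citywide datum) is what anchors each building to its own local ground level.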

Here are our step-by-step notes that you can replicate to extract building heights from your own lidar data.

Once we have the building heights we can render the buildings in 3D using TileMill and Mapbox. We use TileMill's building symbolizer to visualize building height and produce a 3D render of each building. This Mapbox tutorial provides an excellent overview of getting started with visualizing buildings in 3D.

Measuring Building Height

As rapid global urbanization continues in cities around the world, the need for intelligent city design will be crucial in accomodating growing populations. Vertical sprawl can help us understand the needs and surplus of vertical growth across a city. The Burj Khalifa stands at 830 metres and is regarded as the 'Building of the Century'. How could this vertical space have been better distributed to better suit the needs of residents? Lidar can help us understand the urban landscape, and begin to question vertical sprawl as we do horizontal sprawl.