Sending off the year 2015, we present to our readers the mapmakers who contributed their work to the 2015 GeoHipster calendar.
Q: Tell us about yourself.
A: I’m an Engineering Geologist with the California Department of Water Resources in the Division of Integrated Regional Water Management’s North Central Region Office, in the Geology and Groundwater Investigations Section. I like helping people collaborate and work together better, GeoHipster, URISA, @womeningis, volunteer teaching deaf folks, upright bass, competitive strongman lifting, live music, coffee, and photography!
Q: Tell us the story behind your map (what inspired you to make it, what did you learn while making it, or any other aspects of the map or its creation you would like people to know).
A: I made this map years ago during my very first GIS class! This project taught me to be creative with my data sources. I found this data through an obscure place for people to upgrade their GPS unit data. It was fun to play with color and textures for this map!
Q: Tell us about the tools, data, etc., you used to make the map.
A: I used publicly available data and the latest version of ArcMap at the time (9.3).
Martin Isenburg received his MSc in 1999 from UBC in Vancouver, Canada, and his PhD in 2004 from UNC in Chapel Hill, USA — both in Computer Science. Currently he is an independent scientist, lecturer, and consultant. Martin has created a popular LiDAR processing software called LAStools that is widely used across industry, government agencies, research labs, and educational institutions. LAStools is the flagship product of rapidlasso GmbH, the company he founded in 2012. Martin’s ultimate goal is to combine high-tech remote sensing and organic urban farming in a “laser farm” that promotes green projects as hip and fun activities for the iPad generation.
Q: Thanks, Martin, for taking the time to have a chat with GeoHipster! Tell us about your ideas on “Front yard chickens”. Chickens are so awesome. We’ve heard that your chickens were about to be equipped with lasers. How did that go?
A: Happy to talk geospatial chickens. See, in the backyard, chickens are a fun way to be green. But put three (not twenty!) in the front yard, and they create green communities. You meet your neighbors (dragged to your fence by their kids), and soon you are bartering eggs for kale because Lori and Dan across the street now have “front yard veggies”. And why lasers? Not for arming or roasting the chickens (a common mistake), but for filming them in real-time 3D. Sort of like Radiohead’s video “House of Cards”, but better. A “happy feed” of urban farm bliss to lure folks beyond this neighborhood into green fun. Troubles over “cluster-bombing” the homeland with garden-fresh zucchinis forced me to put this project on hold.
Q: Can you give us an overview of LiDAR and how it works?
A: Fire a really short burst of laser light, and measure the exact duration until its reflection comes back. That allows you to compute the distance to whatever object was hit. Record the exact position from where and the exact direction in which you fired the laser, and you can calculate the exact position of these hits. Repeat this several hundred thousand times a second with an airborne LiDAR whose laser beam sweeps across the terrain, and you get enough information to model ground, buildings, and vegetation in 3D.
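The answer above boils down to a few lines of arithmetic. Here is a sketch in Python; the pulse timing, sensor origin, and (unit-length) firing direction are made-up illustrative values:

```python
# Time-of-flight ranging: the laser pulse travels to the target and back,
# so the one-way distance is (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """One-way distance to the object that reflected the pulse."""
    return C * round_trip_seconds / 2.0

def hit_position(origin, direction, round_trip_seconds):
    """Position of the hit: sensor origin plus the computed distance
    along the (assumed unit-length) firing direction."""
    d = tof_distance(round_trip_seconds)
    return tuple(o + d * u for o, u in zip(origin, direction))

# A pulse that returns after 1 microsecond hit something ~150 m away.
print(round(tof_distance(1e-6), 1))  # 149.9 (meters)
# Sensor at 500 m altitude firing straight down: the hit is the ground point.
print(hit_position((0.0, 0.0, 500.0), (0.0, 0.0, -1.0), 1e-6))
```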
Q: How has LiDAR data storage evolved over the years?
A: The LiDAR points were first stored and exchanged as plain text files: x, y, z, intensity. But ASCII becomes inefficient as a storage format as point numbers go up. Several industry players got together and created a simple binary data exchange format — the LAS format — that was eventually donated to the ASPRS. LAS has become a huge success, and everybody supports it. Nowadays the specification is maintained by the LAS Working Group (LWG) of the ASPRS. That sounds fancy, but is really just a dozen or so email addresses that get cc-ed when an issue is discussed.
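As a rough illustration of why a fixed-size binary record beats ASCII, here is a sketch in Python using the 28-byte layout of LAS point data record format 1 (scaled integer x/y/z plus intensity, return/flag bits, classification, scan angle rank, user data, point source ID, and GPS time). The field values and the 0.01 coordinate scale are assumptions for the example; in a real file the scale and offset come from the LAS header:

```python
import struct

# LAS point data record format 1: a fixed 28-byte binary record.
POINT_FMT = "<lllHBBbBHd"          # little-endian, packs to 28 bytes
assert struct.calcsize(POINT_FMT) == 28

# Pack one return. x/y/z are stored as 32-bit integers; real-world
# coordinates are recovered via the header's scale/offset (0.01 assumed).
record = struct.pack(POINT_FMT, 123_456, 987_654, 12_345,
                     180, 0b0001_0001, 2, -7, 0, 1, 284_512.25)
print(len(record))                 # 28 -- constant, unlike an ASCII line

x_i, y_i, z_i, intensity, *_mid, gps_time = struct.unpack(POINT_FMT, record)
print(x_i * 0.01, y_i * 0.01, z_i * 0.01)  # scaled coordinates
```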
Q: You explained that LiDAR data is huge. How much data are we talking about?
A: One LiDAR return (as the hits are called) is typically 28 bytes. An airborne survey with 4 shots per square meter may average 8 LiDAR returns per square meter. For a small area of 100 square kilometers this is over 20 GB of raw LAS. Subsequent processing steps often create multiple copies of this data. Nowadays many countries either have nationwide LiDAR coverage or are working toward it. So many terabytes of LiDAR are already out there, and petabytes are going to come.
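The arithmetic behind these numbers is easy to check. A quick sketch in Python, using the figures from the answer plus a made-up larger survey area for scale:

```python
# LiDAR volume estimate from the figures above:
# 28 bytes per return, ~8 returns per square meter, 100 km^2 survey.
bytes_per_return = 28
returns_per_m2 = 8
area_m2 = 100 * 1_000_000              # 100 km^2 expressed in m^2

raw_bytes = bytes_per_return * returns_per_m2 * area_m2
print(raw_bytes / 1e9)                 # 22.4 -- "over 20 GB" of raw LAS

# At the same density, a (made-up) 250,000 km^2 nationwide survey
# already lands well into the terabytes.
print(raw_bytes * 2_500 / 1e12)        # 56.0 (TB)
```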
Q: You developed the compressed LAZ format. Can you give us some background?
A: I spent years of graduate work on compressing polygon meshes, but few people have such data. When I stumbled upon folders of LAS files, I figured having a compressor for these point clouds may actually be useful. I wrote the LASzip prototype mainly to supplement an academic paper, but people found it on my web pages and used it. In 2010 I was asked to release LASzip with an open license to defeat a proprietary format that federal agencies feared would make compressed LiDAR costly. Eventually the US Army Corps of Engineers (USACE) sponsored the open-sourcing of LASzip.
Q: What is the development process that you use for making changes to the compressed LiDAR format?
A: I am very careful with changes, and try to be as transparent as possible about them. First I seek community input on new features via the “LAS room” and the “LAStools” forums. Once discussed, the new feature is implemented as a prototype for testing. If the prototype proves itself over time, it is moved into the next release. But maybe the time has come to make LASzip an official standard with a committee overseeing future changes.
Q: The release of LAS 1.4 means new point types. This is a disruption of the LAS format in general. What opportunities does this present for LASzip?
A: I have held back extending LASzip to LAS 1.4. Like you say, the new point types in LAS 1.4 are a “natural break” in the format that offers the opportunity to improve the compressor without creating incompatibilities. An open “call for input” was issued to get feedback on features the community wants to see in the next generation of LASzip.
However, LASzip can already compress LAS 1.4 content. NOAA stepped up to sponsor the “LAS 1.4 compatibility mode” where new point types are re-coded into old ones by storing their new attributes as “extra bytes”. Added bonus: many older software packages can read re-coded LAS 1.4 content without upgrade.
Q: You released a simpler interface to LASzip in 2013. How did that turn out?
A: When Esri came knocking, saying the LASzip code was too complicated, I asked them to sponsor the effort for a simpler DLL. But then I decided to create this DLL without further delay. LAZ was the de-facto LiDAR compression standard, and I wanted to remove any possible hurdle for its adoption. It is disappointing that Esri has still not added LAZ support to ArcGIS. The new DLL was essentially written for them.
Q: Recently Esri announced a variation of the open LAS standard called “Optimized LAS”. Can you describe the changes to how LAS files are supported?
A: The name is rather misleading. “Optimized LAS” is a closed format that compresses the content of a LAS file into a proprietary file. The resulting ‘*.zlas’ files are very similar to the ‘*.laz’ files produced by LASzip, which is why the new Esri format is also known as the “LAZ clone”.
Q: So from your view, how do they stack up?
A: The technical differences between LASzip and “Optimized LAS” are minimal. In terms of compression and speed, the two are almost identical. In terms of features, Esri includes spatial indexing information into the *.zlas files whereas we had been storing it as separate *.lax files. It took just a couple of hours to “upgrade” LASzip to match the feature set of “Optimized LAS” by adding one option for spatial sorting and another option for integrating the spatial index into the file. The argument that Esri could not use LASzip due to missing features is obviously a dud. The “LAZ clone” was created to tie LiDAR folks long-term to the ArcGIS platform.
Esri likes to point out that their format contains point statistic summaries. This is so trivial that any developer could add it in an afternoon. Such summaries are a good idea, and I encourage Esri to propose a new “Variable Length Record” for that purpose as an addendum to the LAS specification. After all, they are part of the ASPRS LAS Working Group.
Q: Some readers may have seen this post back in 2014, believing the LASzip controversy was resolved. The post was your April Fools’ Day prank. Why did you do it?
A: I modified LASzip just a few days before April 1st 2014 to feature-match the “LAZ clone”. The triviality of these modifications made it obvious that further technical reasoning with Esri was moot. My last hope was to show Esri management how much applause they would garner from working with the community. So I wrote a prank press release, stating that Esri and rapidlasso were developing a joint compressor. Almost everything in this press release was true, except that Esri had not agreed yet to such cooperation. The response was incredible as the collected comments show…
Q: What are the ramifications of dueling data formats? What’s the point the entire GIS community at large should take home?
A: The instant loser is the user who will have to convert data back and forth. The instant winners are companies that provide data converters. Hey wait, that includes me! 🙂 The long-term loser is the GIS community that will find more and more LiDAR locked to a single platform. The long-term winner is the provider of this platform (or so they hope).
Q: You just mentioned processing LiDAR in a web browser. Is that a new thing?
Q: The term geohipster is bestowed affectionately. With your urban farming and your front yard chickens, I feel like using Steven Feldman’s geohippy term. Do you feel more like a geohipster or a geohippy?
A: The original idea behind downtownfarm — the mash-up of chickens and lasers — was to get away from the granola-hippy image of urban farming and make green more trendy and cool. How about geoyuppy?
Q: Before we let you go, is there anything you’d like to add for our GeoHipster readers?
Mano Marks is a Staff Developer Advocate on the Google Developer Platform team. He works to help developers implement Google’s APIs in their applications. He has a Master’s in History, and another in Information Management and Systems. His career has taken him from database management at non-profits, to keynote addresses at Google Developer Days around the world. Mano has been with Google for 8.5 years, and was the founding member of the Maps Developer Relations team, working back then with KML and then the Maps API. Now he works across the Google Developer Platform. You can find him on Google+, Twitter, and GitHub.
Interviewer’s note: In 2013 CalGIS had the privilege of getting Mano Marks (@ManoMarks) to speak at our conference. Since then, I’ve found out how much more of a geohipster he was than I realized at the time. Thanks, Mano, for spending some time answering questions for the GeoHipster readers!
Q: You have degrees in history as well as in information management and systems. How did you get into the geospatial universe?
A: Of course I’ve always loved maps. Who doesn’t? When I was a kid, I had a subscription to National Geographic, and I pored over the maps trying to understand them. I was really into games, role-playing games and board wargames, which were heavily map-related. Match that with my Master’s in History, where I focused on Eastern Europe, where the map was constantly changing, and I was set up to crave knowledge of the world from a spatial point of view. I just never considered it from a career point of view.
I got my Master’s from the School of Information at UC Berkeley in 2006. At the time, XML was the major data interchange format, and I spent a lot of time understanding the XML universe and document construction. So when I started at Google on what became the Developer Relations Team, they had me work on KML. So I backed into it, but as soon as I was there, I started learning everything I could.
Q: One of the neat things about the geohipster community is how diverse we are. You’ve been with Google for more than eight years now; what do you do there?
A: I work on the developer relations team, helping developers learn how to use Google’s developer platform in their applications. This resulted in spending a lot of time on the road for a few years, talking to tens of thousands of developers around the world. On one trip in 2011, I literally flew around the world over the course of a month, from San Francisco to China to Australia, Tel Aviv, several stops in Europe, and then home to San Francisco.
Recently, I’ve worked more internally, helping out other members of the team and working on code samples. I helped out on this project, which shows developers how to create sites using JSON-LD, Web Components, and Schema.org markup. Of course there’s a strong mapping component to it.
Q: In times past you have functioned as a liaison between developers and geofolk. If you could give advice on how these two groups could better interact together, what would you say?
A: Honestly, I’d say to geofolk: it’s time to learn how to code. There will always be a place for people who are GIS specialists. And more and more, GIS-only folks are getting left behind by focusing on just using complex applications to create a map that is divorced from everything around it. The map is important — it’s a star in whatever platform you’re using. But it’s just a piece of what’s going on. Location, identity, interaction, and more are where people are spending their time. The vast majority of developers using maps don’t want to know how the maps technology works; they want to know that it’ll be stable and provide their users with what they need.
Q: “Over the last decade, what Google has done to build up the public understanding and awareness of maps and mapping, particularly through the web, has been priceless for GIS. They made the inaccessible accessible, and produced a common point of reference to be able to communicate about GIS. ‘It’s a little like Google Earth’ may be one of the most effective GIS conversation starters ever. Whatever may happen to that technology in the future, it will have left an indelible cultural impact.”
She’s right; it was a change in our culture. What do you think is going to be the next thing imprinted on our culture? Any upcoming developments that you’d like to leak on GeoHipster first?
A: Ha ha, yeah…unfortunately I can’t leak anything. But I can say that the core technologies our platforms are built on are evolving at a rapid pace. We carry around these supercomputers in our pockets. I’m using a Nexus 6 right now, which is akin to having a small laptop in your pocket, both in power and size. People have talked for years about “location-based apps”, and that time has come.
And what’s amazing to me is how much people just expect it. It’s a little like the early days of Google Earth, when people would say to me “My Google Earth is broken. I left my car in the driveway but it doesn’t show up when I zoom in on my house.” People now get confused when there’s a new business that hasn’t shown up yet in their app. I think we’re going to see a lot more of, well, I wouldn’t say “real-time” data in maps, but more up-to-date data.
Q: You put Mountain View on the GeoHipster map. I think of Mountain View as the home of Shoreline Amphitheatre, but I drive by Google every time I’m going into the parking lot there. Silicon Valley has been the driver for tech and geo trends, and now I might even extend that sphere to the entire San Francisco Bay Area. Do you think your region is going to continue to drive tech and geo trends into the future?
A: I absolutely think that it’ll be a big driver of world tech. Fortunately for Google, smart people who like to work are everywhere. I just spent a year in the Zurich office and loved it. I think you’ll increasingly see developers in Mexico, Brazil, Kenya, and other countries contributing to driving tech.
A: Hmmm…I definitely think that there is a coolness/hipness factor to many new technologies. I don’t think that means they are not important or really good at what they do, but remember when XML was the big thing? Sure, it’s still used a lot, but it’s not growing dramatically. Or PHP? There’s a language whose time in the sun is gone. What I wonder about instead is: what is the next HTML? That was the most important game changer: it made creating a presentation super easy. KML did that for geospatial data, to an extent. I’ve seen a lot of people who were not developers create KML files and really get into it. But what’s the next thing that someone who doesn’t really understand programming can get into? What can they use to create something that communicates with millions? That’s the real game changer.
Q: I’ve seen you post cool pictures and photo spheres from your travels. Many of the most hip of the geohipsters have passion projects that they’re able to either incorporate into their work or they work on outside of work. What are you working on right now?
A: You know, the last thing I worked on was the semantic markup plus web components project. I wrote a small reference Node.js app to take arbitrary data from a MySQL database and return it as a JSON-LD feed in Schema.org markup. Yes, Node.js is very hipsterish right now :-). I think the question of transforming data to semantic markup in non-XML format is not well settled. There aren’t great libraries for it — in part because JS developers have so many frameworks already, I think they’re afraid of something complex and potentially slow. Especially if it smacks of XML.
That question interests me, but that specific project is wrapping up, at least on my end. So I’m not sure. I am really interested in photography, games, and old maps. One thing I wish someone would do is develop a really good way to OCR old maps to capture location data that we don’t have any more. I’m not sure that’s me, but if anyone has any ideas that would be great.
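For the curious, the kind of transformation Mano describes can be sketched in a few lines. This is a Python stand-in (not his actual Node.js app), and the row fields and the Schema.org `Place` mapping are invented for illustration:

```python
import json

# A row as it might come back from a MySQL query (invented fields).
row = {"name": "GeoHipster HQ", "street": "123 Main St",
       "city": "Mountain View", "lat": 37.3861, "lng": -122.0839}

def row_to_jsonld(row):
    """Map a flat database record onto Schema.org Place markup as JSON-LD."""
    return {
        "@context": "https://schema.org",
        "@type": "Place",
        "name": row["name"],
        "address": {
            "@type": "PostalAddress",
            "streetAddress": row["street"],
            "addressLocality": row["city"],
        },
        "geo": {
            "@type": "GeoCoordinates",
            "latitude": row["lat"],
            "longitude": row["lng"],
        },
    }

print(json.dumps(row_to_jsonld(row), indent=2))
```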
Q: Last question, while you’ve got the ear of the geohipster community — do you have anything you’d like to share?
A: Pity the poor developer. Remember that creating a new data format doesn’t solve all your problems. Chances are it just creates more.
Most geohipster types I know code, but if you don’t code, start. And spread the word.
The 2015 GeoHipster Wall Calendar makes a great holiday gift for the geogeek on your list, so pick up a few. The proceeds from the calendar sales will help GeoHipster offset our operational costs, stay ad-free, and maintain independence.
The 2015 GeoHipster Calendar is available for purchase from CafePress. All calendars are made to order, so be sure to specify January 2015 as the starting month (rather than the default setting, which is the current month).
The calendar features maps from the following map artists (screenshots below):
John Van Hoesen
IMPORTANT! The screenshot below is intended ONLY to give an overview of the overall layout — which map goes on which page, etc. When you order the 2015 calendar, you will get the 2015 calendar. You can verify this by reviewing each individual page online before you order.