Internalizing LIDAR Data Processing

At GeoCue Group we are involved with customers across the mapping industry, from hardware designers through data collectors and data analysts to end users, so we often get asked the question, “How much data processing should I do myself?”  It is a great question.  How much of any given business process you decide to internalize must be a key part of your overall growth strategy.  Unfortunately, we often see companies making one of two classic mistakes when approaching the question of bringing LIDAR data processing in-house.

The first mistake is to decide to do something just because you can, and being smart engineers and scientists, we all believe we can do LIDAR data processing!  In fact, from a practical point of view, this is probably true.  Most engineering, survey and mapping firms already have the technical capability and skills on staff, or can acquire them by hiring experienced people, to take on LIDAR data processing.  LIDAR data is no more complex than many of the other geospatial data types companies routinely process in-house.  It has some unique aspects, but the workflows, tools and techniques are very teachable and can be learned, although there is no substitute for experience.  But just because you can do a thing does not mean you should do that thing.  For LIDAR data processing, a compelling business case must exist to justify internalizing the process.

Let’s consider the case of a company that is currently subcontracting out all their LIDAR data production.  Typically, they will be receiving geometrically correct, fully classified point clouds as a deliverable.  There are usually two questions such companies ask when looking at what, if any, of that work would be better done internally. First, do we want to and can we afford to get into the data collection business by buying hardware? Second, if we don’t buy a sensor and continue to pay somebody else to collect, how much of the data processing should we do ourselves?  The hardware question is usually driven by larger business considerations than we are discussing here, given the level of capital investment required.  There is also a clear difference between taking on work that involves field data collection and all the logistics that go along with those activities and taking on what is essentially another back-office data processing workflow.  We usually recommend that if you aren’t already doing field work, don’t decide to get into it by starting with LIDAR data collects.  But what to do about the back-office data processing is always an interesting question for any company.  The advantages of bringing LIDAR data processing in-house are often characterized in terms of cost-savings – our subs are charging us way too much! – and schedule control – our subs are always late!

The cost-saving argument can be a strong one, but it requires careful analysis.  When we discuss standing up a LIDAR data production team of three to five staff, we recommend companies allocate an estimated $65,000 to $95,000 for software licenses, classroom training and updating their IT infrastructure.  The minimum investment, for the smallest operations (single technician, existing IT hardware, limited training), is still going to be in the $20,000 – $25,000 range.  The annual lifecycle cost to maintain this capacity will likely run around 20% of the initial investment per year, covering software maintenance, support, and annual training.  So, the five-year capital investment for our five-person team is going to be around $175,000, or approximately $35,000 per year.  Labor is going to be the big variable cost; if you have enough work to keep your new production team busy full-time doing LIDAR data processing, the salary and overhead for a five-person team for the year will likely be significantly larger than your capital investment in the software tools.

Unfortunately, it is here that many companies get side-tracked.  They see the large up-front capital investment required for the software and training and struggle to get over that hurdle – because usually someone must be convinced to sign an actual purchase order for this amount! – even though in the long run it is likely the labor costs that will determine the profitability of the venture, not the initial set-up costs.  We often hear from companies that want very detailed breakdowns of pricing and technical capabilities of the software to support their business case but can’t tell us how many people they plan to have working on the data processing or what the annualized labor burden will be.  They focus too much on the software price and not enough on putting the software investment in the context of an overall business case.  Ultimately, the financial determination in this case is straightforward: if the company is paying more than $35,000 + X per year (where X is the organization’s labor burden based on their projected workload) for LIDAR data processing, they can save money by bringing that data processing in-house.
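The break-even arithmetic above can be sketched in a few lines.  This is a back-of-the-envelope model only; the dollar figures and the example labor burden below are illustrative assumptions drawn from the estimates in this article, not quotes:

```python
# Hypothetical internalization break-even model (illustrative figures only).

def five_year_capital_cost(initial_investment, lifecycle_rate=0.20, years=5):
    """Initial software/training/IT outlay plus recurring annual lifecycle
    costs (maintenance, support, training), estimated here at ~20% of the
    initial investment per year after the first year."""
    recurring = initial_investment * lifecycle_rate * (years - 1)
    return initial_investment + recurring

def breakeven_subcontract_spend(annualized_capital, annual_labor_burden):
    """Internalizing saves money only if current annual subcontract spend
    exceeds the annualized capital cost plus the projected labor burden
    (the 'X' in the article)."""
    return annualized_capital + annual_labor_burden

initial = 95_000                              # upper-end estimate, 3-5 person team
total = five_year_capital_cost(initial)       # roughly the $175,000 figure above
annualized = total / 5                        # roughly $35,000 per year

# Assumed example: with a $300,000/yr labor burden, you come out ahead only
# if you currently spend more than ~$334,000/yr on subcontracted processing.
threshold = breakeven_subcontract_spend(annualized, 300_000)
```

The model makes the article’s point concrete: the threshold is dominated by the labor term, not the software term, which is why a credible business case has to start from projected workload and headcount.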

Control of the data processing, especially schedule control, is the other common justification for internalizing LIDAR data processing.  However, our experience has shown this is often a red herring.  Poor performance on past projects is more likely to indicate a problem with the choice of subcontractor than a process issue.  Internalizing LIDAR data processing does nothing, by itself, to improve on best practices; if you do decide to internalize, getting trained on best practices is critical!  We work with the best LIDAR data producers in the world, and by applying best practices, being rigorous about workflow management and pursuing constant quality improvement, they all produce great products on time and on budget.  We firmly believe any company willing to invest in the proper software tools and well-trained people can achieve the same results by internalizing the process.  Controlling the data processing does offer the potential to build efficiency improvements into your processes over time that can help shorten delivery schedules, but any credible subcontractor will be doing the same and passing those savings on to their customers anyway.

The second common mistake we see companies make in building their business case for internalizing LIDAR data processing is to delay full implementation or adopt a slow rollout strategy.  LIDAR data processing is one of those activities that benefits greatly from economies of scale and “doing the work.”  Achieving a critical mass of expertise on staff and having a constant workload is very important to a successful internalization program.  A plan where staff will work on LIDAR part-time, or only at certain times of the year, or only on a certain customer’s projects is usually a very high-risk choice.  Even if the business case appears financially strong, we often caution customers that if they aren’t going to truly prioritize LIDAR data processing as a core competency and build a sustainable pipeline of work from Day 1, they may be better off staying with a subcontractor.  Rather than slowly ramping up to a successful deployment, these companies often end up slow-walking down a dead-end path that leaves them with only a bare minimum of internal capability, despite having invested heavily in the software tools and training.  In the worst-case scenario, these are the companies that we see exit the LIDAR data processing business after 18-36 months with little to show for their investment.  The best way to mitigate the risk of a stalled or under-utilized deployment is to avoid a piecemeal deployment plan; if the financial business case for internalizing LIDAR data processing is there, then be aggressive!

Get the PDF – Internalizing LIDAR Data Processing


GeoCue Launches GetLidar.com to Support Hurricane Recovery

When natural disasters occur, one of the more pressing needs of disaster recovery teams is access to trustworthy, pre-event data. Typical needs are for recent aerial images and elevation data, preferably orthoimages and LIDAR data. It can be difficult to find sources for this data that can be easily accessed from any location, that are trustworthy with respect to data integrity and accuracy, and that provide a simple, straightforward interface to extract and deliver data to local computers for processing. GetLidar.com provides such an access point for data relevant to the areas in Florida, Texas and Puerto Rico heavily damaged by hurricanes Harvey, Irma and Maria.

GetLidar.com provides free and direct access to pre-event imagery and Lidar data including:

  • LIDAR data in both American Society for Photogrammetry and Remote Sensing (ASPRS) LAS format as well as compressed LAZ format.
  • 50 cm orthophotography in US Geological Survey (USGS) quarter quad format for the Harris County area, provided by the Texas Natural Resources Information System (TNRIS) and collected on behalf of the Houston-Galveston Area Council (H-GAC)
  • 2004 US Army Corps of Engineers (USACE) LIDAR and 2015 National Oceanic and Atmospheric Administration (NOAA) NGS Topobathy LIDAR data in LAS or LAZ format from NOAA for Puerto Rico
  • US Department of Agriculture National Aerial Imagery Program (NAIP) for all areas
  • Landsat 8 data for all areas

Read Complete Article: GeoCue Launches GetLidar.com to Support Hurricane Recovery

Terrasolid: The Workhorse is Still a Valuable Tool in LIDAR Production Shops

As the North American reseller for Terrasolid’s software suite, we get to work with the majority of the LIDAR production shops in the US and Canada.  The Terrasolid suite – TerraScan, TerraModeler, TerraMatch and TerraPhoto – continues to be commonplace on the production floor regardless of the type: airborne, mobile or terrestrial.  And increasingly we see UAV operators deploying Terrasolid to assist with their own point cloud workflows, whether LIDAR or imagery based.  The focus of the industry is often on what is new and different and exciting, on the “latest and greatest,” so this week we thought we’d step back from the hype and hoopla and check in with a long-time user of Terrasolid to see how this old workhorse of the LIDAR production shop is doing these days.

We spoke with Amar Nayegandhi, Vice President of Geospatial Technology Services at Dewberry.  Dewberry has been using LIDAR commercially since 1998 – yes, 1998; Dewberry received the first LIDAR task order from the USGS under the Cartographic Services Contract (CSC) – and is well-known and well-respected in the industry.  GeoCue Group sold our first seat of GeoCue and Terrasolid software to Dewberry more than 10 years ago back in 2007.  Dewberry is also a major user of our LP360 software along with many other commercial software tools that are available on the market; basically, they know their stuff when it comes to LIDAR software.

What is the biggest benefit you get from using Terrasolid in your business?

One of the biggest benefits of Terrasolid software is we can integrate the entire LIDAR workflow into our MicroStation CAD environment.  Our geospatial and engineering professionals have a very good understanding of the CAD environment, which enables us to perform point cloud processing (TerraScan), surface modeling (TerraModeler), and sensor calibration (TerraMatch) directly in the CAD environment.

Of the four modules, TerraScan is the primary point cloud analysis tool; where do you see it helping you the most?

When we first started working with LIDAR data, just being able to load millions of points into our CAD software was a challenge that TerraScan solved for us.  Now data sets are in the billions of points and expectations of basic point cloud functionality have evolved with the times.  Still, the core functions we use TerraScan for haven’t changed much over the years – our biggest benefit is the automatic bare earth filtering using our proprietary macros developed through years of experience in processing LIDAR data in various environments. Some of the newer tools in TerraScan, like Groups for spatial object classification or the newer surface classifications for pulling ground from noisy UAV data, are really helping as well.  Project and data management tools are also big time-savers we often take for granted.

After TerraScan, what module do you find the most critical for your production?

Probably TerraMatch.  Sensor manufacturers have come a long way in having calibration and geometric correction built right into their pre-processing software, but TerraMatch gives us the ability to independently verify and correct the fit of the data.  We often use TerraMatch to calibrate data in a project area that includes multiple “lifts” because sensor-manufacturer software does not always produce the best fit over lifts that have variable GPS/IMU trajectory solutions. It is also vital for working with older data sets or subcontractor-provided data where we may have no visibility into the calibration process – TerraMatch gives us an independent verification of goodness of fit.  For mobile LIDAR data, with the GPS outage concerns and other aspects particular to driving around in a car as opposed to flying over in an aircraft, having a set of tools like TerraMatch for calibrating the laser scanners and the cameras is absolutely mandatory.

Dewberry is a major LIDAR production shop in the US, certainly one of the biggest.  That is a lot of staff and over the years, staff turnover is inevitable.  How do you find the learning curve for Terrasolid for new users?

Well, like most engineering software, there are many, many buttons to learn and concepts to get straight in your head.  We are processing more than 100,000 square miles of LIDAR data this season, and though we don’t see a lot of turnover in staff, our staff has almost doubled in the past two years due to increased workload. So, we do face this issue of training our new staff, not just in Terrasolid, but also in understanding our entire production workflow. We have noticed that most new users come up to speed pretty quickly as we have them undergo an intensive one to two weeks of training and practice immediately after they are hired.  It’s a huge plus if the new hires are already comfortable with the MicroStation environment.  I would say a new user is productively working unsupervised after 30 days.  They won’t be using the power tools or doing the complex workflows such as developing macros, but they will be productive with the basics like doing a bare earth extraction and editing the point cloud.  And one of the hidden advantages of Terrasolid is that, unlike 10 years ago, you can find many candidates in the employment pool with significant hands-on Terrasolid experience already.

Do you see an alternative or any new contenders you might want to incorporate in your production to replace Terrasolid?

Well, we do keep an eye on alternative software, and we do have other tools in our shop, which we use extensively; but for now we see no benefit to changing our workflow where we use Terrasolid.  With our investment in the suite of bare-earth extraction macros developed by our analysts for various types of data densities, sensors, vegetation, above-ground features, and terrain, as well as the new and interesting features added regularly to the Terrasolid suite, we believe that Terrasolid is reliable, robust and simply does what we need it to do.

What’s the most interesting or unusual feature in Terrasolid you personally haven’t had a chance to use but would like to?

TerraStereo?  Viewing point clouds directly in stereo seems like it might have some interesting benefits.

January 2016

Well, first of all, Happy New Year! May you have a happy and prosperous 2016!

As I mentioned in the December 2015 issue of our newsletter, we are streamlining and focusing our business this year. One of the things we are changing is this newsletter. We were trying to reach way too many disparate audiences and not being effective with any, I fear. We included general industry information related to the geospatial arena, aimed at decision makers. In the same issue, we included tool tips for LP360. This is just too broad.

We have decided to make this a newsletter for our user community (hence the change in name). We have reduced the distribution list to members of organizations who own or subscribe to our products and/or services. We will now be focused on useful information about our tools, consulting services, hosted solutions and the like. We will generally keep our content organized by:

LIDAR Production Solutions – Tools and services for folks who collect kinematic LIDAR data and do primary data processing. Within our solution set, this includes:

  • The GeoCue production software suite
  • Terrasolid products

Point Cloud Exploration Solutions – Tools and services for users who exploit LIDAR point clouds and who collect/exploit point clouds from imagery. Within this solution set are:

  • LP360 in its various incarnations
  • LIDAR Server
  • Pix4D Mapper
  • Agisoft PhotoScan

AirGon – This is the area where we are focused on our CONTINUUM solutions for executing complete small Unmanned Aerial Systems (sUAS) metric mapping missions. Technologies included in this solution area include:

  • AV-900 Metric Mapping Kit
  • Reckon
  • AirGon Sensor Package (our RTK/PPK solution)
  • AirGon mine site mapping services

Of course, there is a lot of cross talk between these solution areas. LP360 is often used by LIDAR production companies both for specialized tools such as breakline digitizing and for managing LAS 1.4 generation and quality checking. The point cloud tools within our Point Cloud Exploration area are key to the AirGon solutions, and so forth.  We also do a fair bit of custom development that relates to our key technologies, as well as solution-specific consulting services. These tend to span multiple solution areas.

We will move our general marketing (where we are trying to get new customers interested in our technology) out of GeoCue Group User News and move toward channels such as LinkedIn, general advertising and so forth. This will allow us to provide much more specific value to you, our users.

I am very excited about 2016. We have been doing a lot of product and solution planning focused on the above areas. You, our users, will benefit from solutions that are clearly focused on our target areas and offer best of breed technology.

I wish you a good start to 2016!

November 2015

First of all, I have to apologize for us not releasing LP360 at the end of October as I promised in the last issue! We needed to add a feature to LP360 to assist with very dense data sets. This new “Classify by Statistics” point cloud task can be used for a variety of functions, among them data thinning. We also took the time to tune a number of different performance bottlenecks, including clipping contours to a project boundary.

We have entered the services business in a small way. We have encountered a number of mine operators who want to collect maps and volumes of their sites using small unmanned aerial systems (sUAS) but they are not yet ready to internalize the process. To assist with this transition, we now offer flying and data processing services to these customers. We do the bulk of this production in LP360 with a bit in our GeoCue workflow products.

One thing this foray into production is teaching us is the value of removing “clicks” from the production process! We are usually so focused on adding advanced features to our products that we overlook the simple things that can greatly improve the speed of a workflow. For example, we changed the destination class selector in the profile classification tool in LP360 to remember the destination class. This is a simple change that took a developer about 1 hour to incorporate. It now saves a data production technician many clicks in the classification process. You can be assured that we will have a renewed focus on basic productivity going forward!

We are in the midst of product planning for 2016. I think we have some pretty exciting developments in the pipeline. I will highlight a few here; you will be hearing details of these as 2016 rolls out.

On the GeoCue workflow software front, we intend to focus some energy on simplifying the product. When we first brought GeoCue to the market in 2004 (we started building the product in 2003), the average production shop employed technicians who were accustomed to “tool box” software with very complex features that a user could stitch together into a workflow that suited their particular needs (sounds a lot like the ArcGIS desktop products, right?). Now we find many organizations constructing workflows that they would like to be more “black box” and that just work out of the box. It is not unusual to see this happen as a technology such as LIDAR matures.

Our cloud-hosted products, LIDAR Server and Reckon, will see major capability additions in 2016. LIDAR Server will continue to be the premiere solution for managing and delivering LIDAR data in point cloud format, whereas Reckon is the life cycle management and storage environment for mine site sUAS mapping. Already we have added the ability for Reckon to serve as a WMS server for clients such as ArcMap and Autodesk. This allows a mine site engineer to bring up-to-date site imagery and vectors into these environments without the need to take physical delivery of this voluminous data.

Reckon is our first “subscription-only” product. Hosted in Amazon Web Services, this data portal is evolving into a system not only for reviewing mine site mapping by site and mission date but also for annotating stockpiles and planning the next mission. The huge advantage of this over more traditional means, such as trying to use Google Earth imagery, is that the mine engineer can use the most recent view of the mine (for example, last month’s flight). This is nearly a requirement since the topography of these sites is so dynamic.

We are also modernizing the display architecture of LP360 to take advantage of advanced features in workstation and laptop video hardware. Advanced capabilities such as hardware rendering that were once found only in high-end video cards are now commonplace, even in lower-end laptops. The 2016 software base will “discover” graphics features and use whatever hardware capabilities it finds in the discovery process. We will, of course, support fallback to software algorithms for those machines lacking advanced features.

2015 will be the final release year in which GeoCue Group software products support Windows XP and 32-bit operating systems (other than our 32-bit LP360 extension for ArcMap). The “experimental” and final releases of 2016 will support Windows 7 and beyond, in 64-bit only. It has been quite some time since Microsoft ended support for Windows XP. We no longer receive development support for XP in Microsoft Visual Studio (our development environment) and thus must retire support for this venerable operating system.  Many other vendors, such as ESRI, have already ended support for XP, so those still on this system should plan accordingly.

For those of you in the United States, we wish you a very enjoyable Thanksgiving holiday!

Best Regards,

Lewis

October 2015

A special thanks to our customers who attended the LP360 software training that we held at our offices in September. As this core group of customers can attest, a few days invested in training on the latest features and techniques can save weeks of time in production and analysis. I think we all particularly enjoyed the evening social at the Blue Pants Brewery!

Several of us have just returned from a whirlwind three weeks on the road. We attended the American Society for Photogrammetry and Remote Sensing (ASPRS) unmanned aerial system conference in Reno, Nevada at the end of September. We conducted (along with Dr. Qassim Abdullah of Woolpert) the UAS Workshop on the day prior to the conference. We had over 110 participants so the interest in sUAS mapping is only growing.

We next attended the inaugural Commercial UAV Expo hosted by Diversified Communications (the folks who bring you ILMF and SPAR) in Las Vegas, Nevada.   This show had well over 100 exhibitors and about 2,000 attendees. We presented a paper on some of the practical aspects of stockpile volumetrics (sort of a lessons learned overview). I was pleasantly surprised at the number of potential end users who attended this conference. We were constantly busy at our booth discussing mine site mapping with quarry and stockpile owner/operators.  Hopefully it was a mere coincidence but the booth next to ours was a company selling automatic parachutes for multi-rotors!

Many companies using point clouds extracted from camera-carrying drones are realizing that workflow tools beyond those supplied within the point cloud extraction software are needed to efficiently extract products. We have been seeing a nice uptake of LP360 for sUAS by this set of production companies. Our 2015.1 release (by the end of October, I promise!) includes a few new tools, such as an automatic stockpile toe extractor, that really speed up these processes.

We have been very heavily involved in collecting mine site surveys using our AV-900 sUAS platform. These engagements have been very enlightening in terms of informing us of the tools that can really make a difference in this type of work. One thing we have paid particular attention to is the frequency with which we are denied access to site areas for placing survey control. Fortunately we have our initial version of a Real Time Kinematic (RTK) positioning system on the AV-900 (we actually use this in Post-Processed Kinematic mode). This allows us to collect mine site data with no control at all (we usually do place some checkpoints to verify accuracy). We have come to realize that this is not a nicety for mine site surveys but rather a necessity.

On the LIDAR front, the USGS 3DEP program continues to gain momentum with a number of new projects underway. An interesting aspect of 3DEP is that the deliveries are required to be compliant with the ASPRS LAS 1.4 format. Both GeoCue and LP360 have been compliant with LAS 1.4 for some time now and offer workflows to realize these delivery requirements.

As we move well into the fourth calendar quarter of 2015, we are heavily engaged in product planning for 2016. I see a continued uptake in the use of small unmanned aerial systems for local area surveys and hence we will continue our rapid pace of tool development for this market. LIDAR continues to be a major data source for base mapping, with ever increasing expectations on data density and accuracy. We intend to keep LP360 at the forefront of technology for processing and deriving value from these data. I see cloud-based services as a technology that promises to provide a means of controlling capital expenditures as data densities expand. While data transfer speeds remain a problem (i.e., they are much too slow), we are developing some clever ways to use hybrid deployments to reduce this impact.

Until next time, enjoy some fine fall weather!

FAA Exemption, LIDAR Server and a New GeoCue Workflow

We have had a lot of things going on in August from finalizing a new client interface for LIDAR Server to receiving our Section 333 Exemption from the FAA. In addition we have started a new workflow integration project that will see a major image processing system hosted in Amazon Web Services (AWS).

In addition to a number of new software developments, we are embarking on offering cloud-hosted subscription services in several different areas. Earlier this year, we introduced Reckon, our AWS-hosted data management and access system for stockpile volumetrics and mine site mapping. This is a subscription service based on sites and data volumes. It relieves local quarry owners from the burdens of managing on-premises servers for housing digital mine site mapping data. Via a web interface, mine operators can rapidly view site data, download reports and analyze site changes over time. We are very pleased with this system and have already begun to host customer data. You can have a look through a demonstration site at www.airgon.net.

We have also just completed a major update to LIDAR Server. LIDAR Server allows you to store, visualize and distribute point cloud data via a rich JavaScript web interface. LIDAR Server can be hosted on a resident server or in a hosted environment such as Amazon Web Services. LIDAR Server is available as a purchased server software package or as a subscription service. We will soon be enhancing the client side of LIDAR Server with direct launching of LP360, our workstation-based point cloud exploitation solution. If you are a local government that is receiving LIDAR data (perhaps via the USGS 3DEP), hosted LIDAR Server should be a serious consideration. Your data are securely hosted in AWS and managed by GeoCue. You pay a simple monthly subscription based on the amount of data that we are managing. You can test drive LIDAR Server at www.lidarserver.com. By the way, LIDAR Server is the technology selected by the US Department of Agriculture for their nationwide LIDAR data storage, browsing and dissemination.

On the AirGon front, we have decided to offer mine site volumetrics and topographic mapping services. In support of this, we applied for a Section 333 small Unmanned Aerial System (sUAS) Exemption from the Federal Aviation Administration (FAA) to enable us to fly mine sites. I am pleased to announce that this exemption was approved in August for our very own AV-900 Metric Mapping Kit. If you are a services provider, you may look at us and say “why should I purchase an AV-900 MMK from you? It looks as if you are going to be competing with me!” In reality, our goal is to evangelize sUAS technology to the surface mining community for efficient data collection. We are more than happy to turn collection services over to our service provider customers! It is simply that we have realized that mine site owner/operators want some proof that the technology actually works and a clear path to migrating from their current techniques. In fact, we have a very attractive revenue sharing program with Reckon for our service provider partners.

On the workflow front (the original core business of GeoCue), we have just been awarded a new project to build a complex image ordering, processing and discovery system in Amazon Web Services (are you beginning to see a pattern here?). This system will allow geographically dispersed users to participate in all aspects of the workflow. We are honored and excited to have been selected for this development. We have built a number of cloud-hosted data management systems. This new project will prove that the time is now for cloud-hosted processing systems. You will be hearing much more about this project as it develops.

Finally, we intend to do the formal release of LP360 at the beginning of October. We have added a few new capabilities since the EXP release as well as polished a few interfaces. Immediately following the release of 2015.1, we will be embarking on a major rework of the display subsystem of LP360. We now routinely encounter point clouds with high Z extent as well as very high densities (hundreds to thousands of points per square meter). We are working hard to ensure that we remain the most responsive visualization platform for this type of data.

Well, this month I see I have focused entirely on our technology (I can’t help it – this is exciting stuff!). Next month we’ll talk a bit of business again.

Best Regards,

Lewis

Development, Windows 10, and EXP LP360 2015.1

We are entering the hottest and most humid part of the year here in Alabama so, like January, this is a good time to stay indoors and do things like system design!

We do a lot of system engineering and development work here at GeoCue. This ranges anywhere from customizations of our GeoCue workflow tools to new (“green field”) developments. I have noticed that, more and more frequently, we consider cloud-hosted environments such as Microsoft Azure and Amazon Web Services (AWS) as our solution platform. Besides elastic scaling (add more when you need it, remove it when no longer needed, and pay for only what you use), we really like the data storage options. Archiving all of your production data in the cloud was prohibitively expensive just a few years ago, but it is now well within reach. For example, AWS offers its Glacier archival storage for less than $125 per Terabyte per year. There is just no way to match, on-premises, the level of assurance against data loss that you get with AWS. At any rate, cloud-deployed solutions make more and more sense as this paradigm matures. It is rather ironic, since when my great granddaddy started in this business he was renting access to a remote time-share system. The more things change, the more they stay the same!
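As a quick sanity check of that $125 figure, here is the back-of-envelope arithmetic in Python. The $0.01 per GB per month rate is my assumption based on Glacier's published pricing at the time; rates change, so treat this as illustrative only.

```python
# Illustrative cost check only; the per-GB rate is an assumption,
# not a quote from the AWS price list.
def glacier_annual_cost(terabytes, rate_per_gb_month=0.01):
    """Estimated yearly archival cost in USD for a given data volume."""
    gigabytes = terabytes * 1024
    return gigabytes * rate_per_gb_month * 12

# One Terabyte archived for a year:
print(glacier_annual_cost(1))  # about 122.88, under the $125/TB/year figure
```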

I just upgraded my workstation-class laptop to Windows 10. Since I was moving from Windows 8.1, this has been a positive experience. I have not yet had a lot of experience with the various capabilities, so the jury remains out on which is better for workstation use (no, Microsoft, we don’t do image and LIDAR processing on tablet PCs!): Windows 7 or Windows 10. There is a detailed story of my installation experience in this newsletter.

If you are an LP360 customer on maintenance (thank you very much!), you may have already installed our 2015.1 EXP release (yes, it finally went out the door!). The purpose of the Experimental release is to provide you with an early look at some of the features we are adding to the official release. Examples in the current EXP release include three new major tools:

  • Live View – A completely redesigned, real time interface for filtering the display
  • A Ground Cleaner Point Cloud Task (PCT) – this new PCT (available at the Standard level) is a tool that allows you to very rapidly clean up areas of ground classification that are incomplete (a common problem in delivered LIDAR data)
  • An Automatic Stockpile Toe Extractor PCT – This is a tool still in beta form. It allows you to automatically create polygons at the base of a stockpile by simply selecting a point on the pile. This tool really speeds up volumetric analysis

If you are not currently on maintenance, contact Ashlee Hornbuckle at ahornbuckle@lp360.com and she can assist you with returning to the program.

We will soon be moving our licensing to a cloud-hosted solution. This will make self-service of common licensing operations possible. We’ll first move LP360 and then look at our other products. This will take a bit since we have to do this development in-house.

OK, enough said! I have to get back to work! Have a great remainder of the summer!

Best Regards,

Lewis

LP360 Testing, Metric Mapping Kits and a New LIDAR Server

I spent the week of the Fourth of July at my beautiful retreat on the lovely Tennessee River, ostensibly on holiday. In reality, I was sitting with my laptop at the kitchen table writing magazine articles. I am not complaining though – the view is fabulous!

I was also very busy delaying our experimental release of LP360. I did a complete run-through of our new Live View display filter, sending multiple suggestions for tweaks to the development crew and testing out their new builds. On some days we cycled three builds! This is a real advantage of working from Casa Rio. If I were at the office, they would probably knock me in the head! The effort will be worthwhile – this has turned out to be a very nice tool for quickly modifying the display of point cloud data.

I am pleased to report that sales of our Metric Mapping Kit are beginning to take off (pun intended). The AV-900 MMK is a bundle of all of the hardware and software needed to do local area metric mapping and volumetric analysis. We have now collected test data over a wide variety of sites with much effort expended on analyzing metric accuracy as a function of variable parameters such as control, RTK, stockpile toe definition and so forth. The results are truly stunning. sUAS technology will be a paradigm shift for this type of analysis.

I am also pleased to announce (there will be a press release in the next week or so) an option for filing FAA 333 exemptions for the MMK. If you purchase an AV-900 MMK, we (well, actually our attorney) will file your complete FAA 333 petition for a flat rate of US $1,295.

Speaking of stockpiles, the EXP release of LP360 (I promise we will release this by 15 July!) has a new point cloud task (PCT) for automatically digitizing the “toe” of a “clean” stockpile. Simply click a point on a pile and – voilà – a 3D stockpile toe! This tool is showing great potential and will be refined as we work on the final 2015.1 release. Our goal is to make stockpile collection as simple and repeatable as possible. This function will be available in the Windows (“standalone”) release of the Standard/sUAS level of LP360 in the EXP release and will be in the ArcGIS extension by the time we deliver the final 2015.1 release.

Have a look at the latest iteration of LIDAR Server. You can view demonstration data sets by visiting www.lidarserver.com. We have replaced the legacy Silverlight client interface with an all-new JavaScript browser client. LIDAR Server is a great technology for hosting county-wide LIDAR data deliveries, making them available to constituents for viewing and ad hoc deliveries. For example, if a county engineer needs LIDAR data in the vicinity of a road intersection, she can just digitize an area of interest in the LIDAR Server client and download the dynamically created LIDAR data set to her workstation. We are doing a lot of work on LIDAR Server so there will be more to come!

Speaking of servers, our Reckon volumetric results management system has reached what might be called “version 1.0” and is ready for use. Reckon is a hosted service, running in Amazon Web Services (AWS). It is aimed at both owner-operators of surface mines and service providers. Right now you can experiment with Reckon by contacting us for an account. By the end of this month, we will have a new Reckon web site up with a live, on-line demo; stay tuned!

Keep enjoying your summer – see you in August!

Lewis Graham, CTO GeoCue Group

3DEP, LP360 Toolbox and AirGon

I am looking for the month of May – it seems to have disappeared without a trace!

We recently visited with the Tennessee Office of Information Research (OIR) in beautiful Nashville, Tennessee. The OIR is the coordinating state agency for a USGS 3DEP LIDAR (3 acronyms in a row – not quite a record!) acquisition project. Under this program, the state of Tennessee will be flown at Quality Level 2 (2 points per square meter) over a four year period. The initial collection (slated for this fall) will encompass some 11,500 square miles, covering 27 counties.
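To give a sense of scale for that initial collection, here is the rough arithmetic implied by the figures above (11,500 square miles at 2 points per square meter). This is purely illustrative back-of-envelope math, not a project specification.

```python
# Rough scale of the initial Tennessee 3DEP collection described above.
SQ_METERS_PER_SQ_MILE = 1609.344 ** 2  # one statute mile is 1609.344 m

def estimated_point_count(square_miles, points_per_sq_meter=2):
    """Illustrative estimate of total points for an area at a given density."""
    return square_miles * SQ_METERS_PER_SQ_MILE * points_per_sq_meter

# 11,500 square miles at Quality Level 2 (2 pts/m^2):
print(f"{estimated_point_count(11_500):.2e}")  # on the order of 6e10 points
```

Roughly sixty billion points for the first fall alone, which is why the data management and hosting questions discussed below matter so much.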

3DEP is an excellent opportunity for state and local government agencies to pool their financial (and often technical) resources to obtain point cloud data. By spreading the cost across a spectrum of stakeholders, a surprisingly large amount of data collection can be accomplished.

Our discussions with the OIR led naturally to a conversation about how LIDAR data are used in GIS and engineering departments. We covered the usual suspects such as flood plain analysis, basic 3D visualization, site planning and so forth. By the end of the conversation, I was convinced (as usual) that every single state and local government GIS workstation should have access to a current image backdrop and a current 3D backdrop (e.g. a LIDAR point cloud in LAS format). Why would anyone find it acceptable to be without a cross-sectional view of their municipal data on an ad hoc basis? Mainly because they have never had this level of information available. You never miss what you have never had!

When we returned to the office, we decided to put together, once and for all, a package of material for folks who are contemplating acquiring LIDAR data or who already have access to it. We will develop use cases and return-on-investment information for the range of applications that make sense for these data. If you have some novel ideas, and particularly case studies, please work with us. Obviously we want to sell more software, but we believe a rising tide lifts all boats. We need to get the tide (meaning the understanding and effective use of LIDAR data) rising first!

Speaking of software, we hope to have our experimental release (EXP) of LP360 available for download by the end of this month (June). The developers are doing fine; I am the one who always throws a wrench in the delivery schedule – “let’s get return selection added to the new Live View dialog before we release…” Speaking of Live View, this is a new dynamic filter in LP360 that lets you change class, return and flag filtering on the fly. You are really going to like this new feature!

While we try to make features in our tools easy to use, the LIDAR tools on the market still tend to be toolbox oriented rather than workflow specific. For this reason, it is very important to participate in training if you hope to realize a maximum return on your investment. We offer a range of training (and consulting) from web based to on-site. In addition, we have our Huntsville-based LP360 training coming up in the fall.

On the AirGon side of things, we have been talking to a lot of potential clients who can make immediate use of small Unmanned Aerial Systems (sUAS) mapping. We offer a complete helicopter-based metric mapping kit in the AV-900 MMK. This is garnering a lot of interest since it provides a turn-key solution of hardware, software and training for doing jobs that have an immediate high return on investment, such as stockpile volumetric analysis. However, we also offer just the piece parts for those who wish to assemble their own system. For example, if you have decided on a small wing-type sUAS such as the eBee from SenseFly, LP360 for sUAS is still your best option for extracting volumetrics (anyone who has tried to do a multi-pile site using the point cloud generation software shipped with these systems will readily agree!). In addition, AirGon Reckon is the best product in the market for hosting and delivering mine site orthos and volumetric reports. By hosting our volumetrics delivery system in Amazon Web Services, we relieve you of the need to worry about data delivery to multiple offices, data backup and security.

Summer promises to fly by just as quickly as the spring. We are attending a number of conferences such as the ESRI meeting and the Transportation Research Board AFB-80 summer meeting. If you are attending one of these, please look us up. See you in July!