Updated: Aug 5, 2019
Unmanned aerial systems are a rapidly growing technology that is disrupting the landscape of aerial data acquisition. We can now quickly deploy LiDAR mapping systems and capture very rich topographic datasets that were previously unattainable due to cost or schedule. Today we investigate an unmanned LiDAR survey over a 2,000 acre job site.
The mission was to capture highly accurate, dense measurements of the bare earth. Trees, shrubs, buildings, and other obstacles needed to be removed from the dataset to reveal the ground surface. For this challenge we deployed LiDAR (Light Detection and Ranging), which uses laser pulses to measure distances with high accuracy. Hundreds of thousands of measurements are made every second while the drone scans the terrain. The results are recorded as a point cloud, which can be used to generate a bare earth Digital Elevation Model (DEM). In this article we discuss the equipment we used and compare the results to RTK ground control points captured throughout the 2,000 acre job site.
For our LiDAR sensor we used the LiDARSwiss Riegl miniVUX unit. This sensor captures 100,000 measurements per second, and each measurement can record up to five returns, which enables reliable ground measurements even through dense vegetation. The Riegl attains an accuracy of 15 mm and a precision of 10 mm. The LiDAR sensor is tightly coupled to a Novatel STIM 300 MEMS inertial measurement unit (IMU) and a Novatel GNSS system. For best results we also run a GNSS base station that records data for the duration of each flight. The base station data, aerial LiDAR, aerial GNSS, and aerial IMU data are then combined in post-processed kinematic (PPK) processing.
The drone we fly is the DJI Matrice M600+RTK UAS. The system has a triple-redundant autopilot and leverages real-time kinematic (RTK) positioning for a geospatial flight accuracy of 2 cm.
Data was captured on a 2,000+ acre site in southern Texas. The total area was covered by 30 flights over three days of data acquisition. Each flight began and ended with a calibration flight path to ensure highly accurate results from post processing. All data is in State Plane coordinates. Thirteen ground control points were captured prior to the flights by an independent local surveying company. The key features we use to demonstrate our findings are four GCPs placed at the corners of a tennis court in the center of the scanned area.
Using post processing software, the raw measurements from the LiDARSwiss miniVUX system were used to generate the LAS point cloud dataset. Using the LiDAR360 software package (Green Valley Int'l), the individual trajectories were aligned and corrected for any systematic errors present during data acquisition. Once these initial processing steps were complete, the ground points were classified, and all unwanted objects, such as vegetation, buildings, and structures, were removed from the point cloud. From the bare earth model, we project into the local coordinate system and use a nearest-neighbor approach to calculate the difference in Z (height, in meters) between each control point and the LiDAR point cloud.
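The nearest-neighbor comparison described above can be sketched in a few lines of Python. This is a minimal illustration, not the production workflow: the function name `gcp_height_differences` and the synthetic arrays are our own, and we assume both datasets are already in the same projected coordinate system with XYZ columns in meters.

```python
import numpy as np
from scipy.spatial import cKDTree

def gcp_height_differences(cloud_xyz, gcp_xyz):
    """For each GCP, find the nearest cloud point in plan (X, Y)
    and return the vertical offset dz = z_cloud - z_gcp (meters).

    cloud_xyz : (N, 3) array of classified ground points
    gcp_xyz   : (M, 3) array of surveyed ground control points
    """
    tree = cKDTree(cloud_xyz[:, :2])      # spatial index on X, Y only
    _, idx = tree.query(gcp_xyz[:, :2])   # nearest cloud point per GCP
    return cloud_xyz[idx, 2] - gcp_xyz[:, 2]
```

In practice one might average several nearby points rather than take a single nearest neighbor, to reduce sensitivity to individual noisy returns.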
We find that the vertical accuracies are all within 2 cm RMSEz. Below is a table showing the height differences in the Z direction between the surveyed ground control points and the unmanned LiDAR mission. Each difference is calculated by finding the LiDAR point closest to the GCP in plan and projecting along the vertical axis to measure the height offset.
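For reference, the RMSEz statistic quoted above could be computed from the per-GCP height differences like this. The values in the example are illustrative placeholders, not the survey's actual residuals.

```python
import numpy as np

def rmse_z(dz):
    """Vertical root-mean-square error from per-GCP height
    differences dz (meters): sqrt(mean(dz^2))."""
    dz = np.asarray(dz, dtype=float)
    return float(np.sqrt(np.mean(dz ** 2)))

# Hypothetical residuals in meters; a result under 0.02 would
# correspond to the sub-2 cm RMSEz reported in the text.
example = rmse_z([0.012, -0.008, 0.015, -0.011])
```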