Version 1 (modified by mggr, 14 years ago)


LIDAR data delivery

Once the data has been processed, it needs to be put into a delivery directory. This is the final file structure in which it will be sent to the customer.

What should be included

  1. ASCII LIDAR pointcloud data
  2. pdf version of flight logsheet
  3. readme file describing the data set + copyright notice
  4. screenshot of mosaic of all lines (full resolution) and zoomed image with vector overlay
  5. data quality report and further processing scripts
  6. DEM of the LIDAR data - usually gridded to 2m resolution.
  7. Screenshot of DEM

Procedure for creating a LIDAR data set delivery

Semi-scripted method

  1. Create the delivery directory using <full path to main project dir> <year> <proj-code> <julian_day>. For this script, the .LAS files that have been QC'ed and classified for noisy points should be in the main project directory under leica/proc_laser/. Check that it has created everything correctly; if it fails, create the directory manually as described below.
  2. If it is a UK project then the DEM and screenshots will be created. A screenshot of the intensity image with vectors overlaid will be created if there are vectors in ~arsf/vectors/from_os/PROJ_CODE. If the project is outside the UK then this script will not generate the DEM or screenshots, so these need to be created manually (see the Manual Method below).
  3. Generate the readme using -d <main project directory> [-l <path to text logsheet>] [-s <path to delivery e.g. .../DATE/PROJ-CODE>]. For this script, an ASCII version of the logsheet needs to be in ./admin; if it is not there, create one. Go through the readme and edit as required (search for TODO).
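Before running the delivery and readme scripts, it is worth confirming that the inputs they expect are in place. The following is a minimal sketch, not part of the official procedure; it assumes classified .LAS files live under leica/proc_laser/ and that the ASCII logsheet in admin/ is a .txt file (the exact filename convention may differ).

```shell
#!/bin/bash
# Sanity-check a project directory before running the delivery scripts.
# Assumptions (hedged): classified .LAS files are in leica/proc_laser/
# and the ASCII logsheet in admin/ has a .txt extension.
check_delivery_prereqs() {
    local proj="$1"
    # The delivery-creation script reads QC'ed, classified LAS files here
    if ! ls "$proj"/leica/proc_laser/*.LAS >/dev/null 2>&1; then
        echo "No .LAS files found in $proj/leica/proc_laser/" >&2
        return 1
    fi
    # The readme generator needs an ASCII logsheet in admin/
    if ! ls "$proj"/admin/*.txt >/dev/null 2>&1; then
        echo "No ASCII logsheet found in $proj/admin/" >&2
        return 1
    fi
    return 0
}
```

If either check fails, fix the project directory first rather than letting the scripts fail partway through.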

Manual Method

  1. Copy the template directory over to the project directory. Template directory at ~arsf/arsf_data/2009/delivery/lidar/template
  2. Convert the binary LAS files into ASCII files, making sure all the appropriate fields are output:
    • run <lasdirectory>
    • OR run las2txt --parse txyzicra <lidarfilename> <outputfilename> for each file, outputting to the ascii_laser directory (this may not work with LAS 1.1).
    • If not already done, rename the files to follow the convention "LDR-PPPP_PP-yyyydddfnn.txt" (details in the readme).
  3. If a DEM was created for processing then include it in the delivery. Noisy points (those with classification 7) should not be included in the DEM; remove them using the point cloud filter (~arsf/arsf_data/2009/delivery/lidar/template/bin/pt_cloud_filter/linux64/pt_cloud_filter). Then create the DEM and include it in the delivery.
  4. Include a pdf version of the flight logsheet with the delivery
  5. Make sure the correct, up-to-date data quality report (pdf version) is included in the docs directory
  6. Create full resolution JPGs of a mosaic of all the LIDAR lines by intensity, a separate one of the intensity with vectors overlaid (if vectors are available), and one of the DEM, and put them in the screenshot directory
  7. Generate the readme using -d <main project directory> [-l <path to text logsheet>] [-s <path to delivery e.g. .../DATE/PROJ-CODE>]. For this script, an ASCII version of the logsheet needs to be in ./admin; if it is not there, create one.
  8. Go through the read_me file and edit as required
    • Enter all the project details into the table at the top
    • Fill in the contents. Add the lidar filenames on the left and line numbers (from logsheet) on the right.
    • Enter the projection and datum details - get these from ALS PP output dialog when processing.
      • UK flights should be - Horizontal Datum: "ETRF89" Vertical Datum: "Newlyn" Projection: "OSTN02 British National Grid"
    • Insert statistics for each line: run lasinfo --check <lidarfilename> then cut/paste the required information
    • Check the accuracy of the data against the vectors and estimate the average offset; also estimate the average elevation offset between neighbouring lines
    • Search for 'TODO' and change anything that remains to be done
    • Ensure there are no tab characters used (search for \t and remove any that are present)
    • Ensure the readme file is in Windows format (run unix2dos on it)
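The LAS-to-ASCII conversion in step 2 can be sketched as a shell loop. This is an illustrative sketch only: it assumes las2txt (libLAS) is on the PATH and that input files end in .LAS or .las; the renaming to the LDR-PPPP_PP-yyyydddfnn.txt convention is project-specific and not shown.

```shell
#!/bin/bash
# Convert every LAS file in a directory to ASCII in the ascii_laser
# delivery directory. txyzicra = time, x, y, z, intensity,
# classification, return number, scan angle.
convert_las_to_ascii() {
    local lasdir="$1" outdir="$2"
    mkdir -p "$outdir"
    local f base
    for f in "$lasdir"/*.LAS "$lasdir"/*.las; do
        [ -e "$f" ] || continue        # skip unmatched glob patterns
        base=$(basename "${f%.*}")
        # May not work with LAS 1.1 files (see step 2 above)
        las2txt --parse txyzicra "$f" "$outdir/$base.txt"
    done
}
```

Rename the resulting .txt files to the delivery convention afterwards, as described in the readme.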
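The final two readme checks (no tab characters, Windows line endings) can be combined into one hedged helper. unix2dos is assumed available; the sed fallback is a hypothetical equivalent that only holds for LF-only input.

```shell
#!/bin/bash
# Final readme checks: refuse files containing tabs, then convert
# LF line endings to CRLF (Windows format).
finalise_readme() {
    local readme="$1"
    # Fail if any tab characters remain (they must be removed by hand)
    if grep -q "$(printf '\t')" "$readme"; then
        echo "Tab characters found in $readme -- remove them first" >&2
        return 1
    fi
    if command -v unix2dos >/dev/null 2>&1; then
        unix2dos "$readme"
    else
        # Fallback: append CR to each line (assumes no existing CRs)
        sed -i 's/$/\r/' "$readme"
    fi
}
```

Run this as the very last step on the readme, after all TODOs have been resolved.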

If you have hyperspectral data to make into a delivery, go to the hyperspectral delivery page.

If not, or if you've done that already, the delivery is ready for checking.