LIDAR data delivery
Once the data has been processed, it needs to be put into a delivery directory. This is the final file structure in which it will be sent to the customer.
What should be included
- ASCII LIDAR point cloud data
- LAS point cloud data
- PDF version of the flight logsheet
- readme file describing the data set + copyright notice
- screenshot of mosaic of all lines (full resolution) and zoomed image with vector overlay
- data quality report and further processing scripts
- DEM of the LIDAR data - usually gridded to 2m resolution.
- Screenshot of DEM
For full waveform deliveries the following should also be included:
- Discrete LAS files
- Full waveform LAS files
- Navigation data (.sol file and .trj file)
- ASCII full waveform extractions - if requested by the PI
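For orientation only, the sketch below shows one possible top-level layout for a delivery, using directory names that appear in the procedure further down this page (flightlines/las1.0, flightlines/ascii, dem, docs, screenshot). The template directory and scripts described below define the actual structure, so treat these names as illustrative rather than definitive.

    # Illustrative layout only - the delivery template defines the real structure.
    mkdir -p <delivery>/flightlines/las1.0   # QC'ed, classified LAS point clouds
    mkdir -p <delivery>/flightlines/ascii    # ASCII versions of the point clouds
    mkdir -p <delivery>/dem                  # gridded DEM (+ *_wgs84_latlong.dem for aplcorr)
    mkdir -p <delivery>/docs                 # flight logsheet PDF, data quality report, readme
    mkdir -p <delivery>/screenshot           # intensity mosaic, vector overlay, DEM images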
Procedure for creating a LIDAR data set delivery
Semi-scripted method
- Create the delivery directory by running make_lidar_delivery.py in the main project directory (see the sketch after this list). The .LAS files that have been QC'ed and classified for noisy points should be in processing/als50/las-classified. Check that it has created everything correctly; if it fails, create the directory manually as described under "Manual Method" below.
- If you did not create the DEM and screenshots using the above script (the -m option), create them manually using lidar_intensity.sh.
You will also need to make a lat/long version of the DEM (for use with aplcorr) using convert_uk_dem.sh or convert_nonuk_dem.sh as appropriate.
- Generate the readme file:
- Create a config file for the readme using the generate_readme_config.py script.
- Edit the config file and check all the items are filled in:
- Any remarks about the data should be entered as a sentence in the "data_quality_remarks" section.
- If vectors have been used then the accuracy should be entered in "vectors" (e.g. '5-10' if they are accurate to within 5-10 m)
- line_numbering should contain a space-separated list of line names linking the logsheet to the processed files.
- las_files should contain the path to the processed LAS files (used to extract the per-flightline statistics)
- elevation_difference should contain a list of the elevation differences between overlapping flightlines. Enter the line numbers and the difference in cm, followed by a semicolon, e.g. 001 002 5; 002 003 4.5; etc.
- All "compulsory" items should contain data
- Create a TeX file using the script create_latex_lidar_readme.py -f <config filename>
- This file can be reviewed and edited in any text editor if needed
- Create a PDF file by running latex <tex_filename>
- Review the readme carefully and check that it looks OK, with all relevant information present
- Copy it to the delivery directory and remove any temporary files. It is recommended to keep the TeX file until after delivery checking, in case any edits are required
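Taken together, a semi-scripted run might look roughly like the sketch below. The script names are the ones listed above, but the working directory, options and filenames are illustrative, so check each script's own usage before running it.

    # Rough sketch of the semi-scripted workflow (options and filenames illustrative).
    cd <main project directory>
    make_lidar_delivery.py -m            # -m also creates the DEM and screenshots
    generate_readme_config.py            # creates the readme config file
    # ... edit the config file and fill in all compulsory items ...
    create_latex_lidar_readme.py -f <config filename>
    latex <tex_filename>                 # produces the readme PDF
    # copy the PDF into the delivery directory and remove temporary files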
Manual Method
- Copy the template directory over to the project directory (see the sketch after this list). The template directory is at ~arsf/arsf_data/2011/delivery/lidar/template
- Move the processed data to the delivery directory
- Move the LAS binary files into delivery/flightlines/las1.0
- REMEMBER THAT THESE LAS FILES SHOULD HAVE BEEN QC'ED AND CLASSIFIED FOR NOISY POINTS
- Rename the files following the convention "LDR-PPPP_PP-yyyydddfnn.LAS" (details in the readme).
- Run las2txt.sh <delivery/flightlines/las1.0> <delivery/flightlines/ascii>
- OR run las2txt --parse txyzicrna <lidarfilename> <outputfilename> for each file, outputting to the ascii_laser directory (may not work with LAS 1.1).
- You need to create a DEM from the LIDAR data to include with the delivery. Use lidar_aster_dem.sh and put the output file in a directory named 'dem'. Noisy points (those with classification 7) should not be included in the DEM, so remember to specify the -C option. You should also create a second DEM that is suitable for use in aplcorr: use either convert_uk_dem.sh or convert_nonuk_dem.sh depending on where your project is located. This DEM should be named *_wgs84_latlong.dem
- Include a PDF version of the flight logsheet with the delivery
- Make sure the correct, up-to-date data quality report (PDF version) is included in the docs directory
- Create full resolution JPGs of a mosaic of all the LIDAR lines by intensity, a separate one of the intensity with vectors overlaid (if vectors are available), and one of the DEM, and put them in the screenshot directory (lidar_intensity.sh can be used for this).
- Generate the readme as described in the semi-scripted method above
Note: Be sure that all the files output in the above steps conform to the file name formats specified here
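As with the semi-scripted method, the manual steps can be summarised as a rough command sketch. The paths and options below are placeholders drawn from the text above, not exact invocations; check each script's usage before running it.

    # Rough sketch of the manual method; paths and options are placeholders.
    cp -r ~arsf/arsf_data/2011/delivery/lidar/template <project directory>
    # (<delivery> below refers to the copied template directory)
    mv <classified LAS files> <delivery>/flightlines/las1.0/
    # rename the LAS files to LDR-PPPP_PP-yyyydddfnn.LAS (details in the readme)
    las2txt.sh <delivery>/flightlines/las1.0 <delivery>/flightlines/ascii
    lidar_aster_dem.sh -C ...            # DEM excluding class 7 (noise) points, into dem/
    convert_uk_dem.sh ...                # or convert_nonuk_dem.sh for non-UK projects
    lidar_intensity.sh ...               # intensity, vector overlay and DEM screenshots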
Additional Steps for Full Waveform Deliveries
Create the following folders in the delivery directory (a sketch of the folder creation follows this list):
- discrete_laser - containing two folders: ascii_files and LAS_files
- discrete_laser/ascii_files - move the ascii_laser folder to discrete_laser/ascii_files
- discrete_laser/LAS_files - move the discrete LAS files here - naming convention LDR-PPPP_PP-yyyydddfnn.LAS
- fw_laser - move the full waveform LAS files here - naming convention LDR-FW-PPPP_PP-yyyydddfnn.LAS
- navigation - copy the .sol and .trj files here - naming convention PPPP_PP-yyyy-dddf.*
- fw_extractions - this should contain the following:
- A folder for each requested area containing the relevant ASCII extractions
- A text file describing the information given in the ASCII files. Template in ~arsf/arsf_data/2010/delivery/lidar/template/fw_extractions/extractions.txt. Use summarise_area.py to extract the extents for each area to enter into readme. Naming convention PPPP_PP-yyyy-dddf_extractions.txt
- A jpg showing the location of the areas on an intensity image. Naming convention PPPP_PP-yyyy-dddf_extractions.jpg
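A minimal sketch of creating the extra full waveform folders is given below (run inside the delivery directory); the placeholders PPPP_PP, yyyy, ddd, f and nn follow the naming conventions above.

    # Create the additional full waveform folders inside the delivery directory.
    mkdir -p discrete_laser/ascii_files discrete_laser/LAS_files
    mkdir -p fw_laser         # full waveform LAS: LDR-FW-PPPP_PP-yyyydddfnn.LAS
    mkdir -p navigation       # .sol and .trj files: PPPP_PP-yyyy-dddf.*
    mkdir -p fw_extractions   # one folder per requested area, plus
                              # PPPP_PP-yyyy-dddf_extractions.txt and .jpg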
The readme will need to be edited to include the above information. A template of the full waveform readme can be found at ~arsf/arsf_data/2010/delivery/lidar/template/fw_readme.txt
If you have hyperspectral data to make into a delivery, go to the hyperspectral delivery page.
If not, or if you've done that already, the delivery is ready for checking.