Delivery Checking

Once a dataset has been prepared for delivery, a second person should go over it and perform the checks below. These should be performed as your own user (rather than arsf or airborne), except where necessary. This minimises the chances of data being unintentionally overwritten or deleted.

Note: if this is your first time doing a delivery check, please get a third (experienced) person to perform a second check when you're done.

Hyperspectral Data

Includes Eagle, Hawk, Fenix and Owl

  1. Verify we have the correct PI (check against application, call Ops if unsure)
  2. Check against the logsheet that all expected data are present.
    • Number of files and sensible file sizes
  3. Check that all other folders and contents are present
    • Only the PDF read me file should be present (no .tex, .out, .aux, .log files)
    • Check that there are ASCII view vector files in the sensor FOV dirs for each sensor (multiple ones if different binning has been used)
    • Note that there should not be a flightlines/fodis directory for deliveries 2013 and later
    • No '*.aux.xml' files (created by GDAL) should be present.
  4. Check for typos, etc. in the documentation.
  5. Check that the xml files in project_information and line_information look OK in a browser, and that the information within them looks correct. If Firefox is not displaying them correctly, type about:config in the address bar and set privacy.file_unique_origin to False.
  6. Ensure the text files are Windows compatible (use the 'file' command on the .txt files: they should be ASCII with CRLF terminators)
    • Running find . -name "*.txt" | xargs file from the delivery directory will help with this (a standalone sketch of this check appears after this list).
  7. Test the level 3 processing command given in the readme for all of the level 1 files: run check_apl_cmd -S -c <path to hyp_genreadme config file> in the hyperspectral delivery directory. This outputs tif files to a tmp_cmd_check directory; check the tif files in gtviewer or tuiview. If they are OK, remove 'tmp_cmd_check'.
  8. Check each of the screenshots
  9. Run proj_tidy.sh -c -p <path_to_project>. Check if any of the problems listed apply to your delivery.
  10. Run check_bil_size.py -d <deliverylocation>. This will list any BIL files that don't match the size computed from the respective HDR file (see the sketch after this list for how that size is derived).
  11. All level 1 files should be looked at visually using fastQC.
    1. If any pixels appear to be constantly bad (cf. the 2009 dust-on-lens problems), return the data to the processor for reprocessing of the calibration with a list of additional bad pixels to mask (see aplcal -qcfailures).
    2. In this case, additional information should be added to the ReadMe to explain why the pixels are being masked.
    3. Check that the underflows and overflows tabulated in the ReadMe are sensible. Rerun autoQC with different settings or use fastQC if not.
    4. Check that the z profile (spectral profile) over some green vegetation looks plausible, for each level 1 file. For Fenix data you can easily and quickly run Py6S for every flightline and check it by following the instructions on the wiki, or you can look for some vegetation in fastQC (see here for example spectra).
  12. Check that all of the level 3 mapped files open and look OK (you don't need to go through all the bands)
    1. Make sure that all bands have been mapped (not just 3). You can use check_num_bands.py *bil to check the number of bands for all files.
  13. Check the projection used is correct - especially the correct UTM zone for the data
  14. If the delivery is ready to deliver, update the dataset on the Processing status page to "Ready to deliver".
  15. If everything is ready to deliver, i.e. no more changing of ReadMe files, then in the sensor delivery directory run the script zip_mapped.sh.
    1. This should zip each mapped file and its header into a zip file. This will be done on the grid. If it fails, an email will be sent to the nerc-arf-code address. You should also check the log files named zip* in <project directory>/logfiles before delivering the data.
  16. Add a comment on the ticket saying what problems there are / things that need resolving / things that you have resolved yourself / whether the data is ready to deliver.
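
The CRLF check in step 6 can also be scripted. A minimal Python sketch (the recursive walk from the current directory is illustrative; run it from the delivery root):

  # Flag .txt files that contain bare LF line endings (i.e. not Windows CRLF).
  import os

  for root, _, files in os.walk("."):
      for name in files:
          if not name.endswith(".txt"):
              continue
          path = os.path.join(root, name)
          with open(path, "rb") as f:
              data = f.read()
          # After removing CRLF pairs, any remaining LF is a bare Unix terminator.
          bare_lf = data.replace(b"\r\n", b"").count(b"\n")
          if bare_lf:
              print(f"{path}: {bare_lf} line(s) not CRLF terminated")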
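
For step 10, a minimal sketch of the size comparison check_bil_size.py performs: the expected BIL size is samples x lines x bands x bytes per sample, all taken from the HDR file. The simplified header parsing and the flightlines glob are assumptions, not the script's actual implementation.

  import glob
  import os

  # Bytes per sample for the common ENVI data type codes.
  DTYPE_BYTES = {1: 1, 2: 2, 3: 4, 4: 4, 5: 8, 12: 2}

  for hdr in glob.glob("flightlines/**/*.hdr", recursive=True):
      fields = {}
      with open(hdr) as f:
          for line in f:
              if "=" in line:
                  key, _, value = line.partition("=")
                  fields[key.strip().lower()] = value.strip()
      try:
          expected = (int(fields["samples"]) * int(fields["lines"]) *
                      int(fields["bands"]) * DTYPE_BYTES[int(fields["data type"])])
      except KeyError:
          continue  # not a complete ENVI header, or an unusual data type
      bil = os.path.splitext(hdr)[0] + ".bil"
      if os.path.exists(bil) and os.path.getsize(bil) != expected:
          print(f"{bil}: {os.path.getsize(bil)} bytes on disk, expected {expected}")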

Additional Checks for Owl data

  • Currently there are no badpixelmethod files generated for the Owl.
  • Also check that the detector response is stable across the entire flight by examining the calibration (.rad) files (these are also BIL files). If there are dramatic changes in the calibration curves for a particular flight line, make sure it is not used to calibrate a different flight line. You will also need to double-check that the data processed for that flight line look correctly calibrated. A sketch of a quick stability check follows.
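
A minimal sketch of such a stability check, assuming the .rad files carry ENVI headers that GDAL can open; the file names and the 10% threshold are placeholders, not a project standard:

  import numpy as np
  from osgeo import gdal

  def band_means(path):
      """Mean detector response per band of a BIL calibration file."""
      ds = gdal.Open(path)
      return np.array([ds.GetRasterBand(i + 1).ReadAsArray().mean()
                       for i in range(ds.RasterCount)])

  a = band_means("line1.rad")  # placeholder file names
  b = band_means("line2.rad")
  rel_change = np.abs(a - b) / np.maximum(np.abs(a), 1e-9)
  for band, change in enumerate(rel_change, start=1):
      if change > 0.1:
          print(f"band {band}: {change:.1%} change in mean calibration response")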

LiDAR Data

  1. Verify we have the correct PI (check against application, call Ops if unsure)
  2. Check against the logsheet that all expected data are present.
  3. Check that all other folders and contents are present.
    • Only the PDF read me file should be present (no .tex, .out, .aux, .log files)
  4. Check for typos, etc. in the documentation.
  5. Ensure the text files are Windows compatible (use the 'file' command on the .txt files: they should be ASCII with CRLF terminators). You can use 'find . -name "*.txt" | xargs file' to check all files.
  6. Check all the LiDAR files open/load, look OK and fit together horizontally and vertically:
    • Use lag to view the data
  7. Run the script check_ascii_lidar, which will read in the ASCII files, check all points have the correct number of records (9), and report the min/max of the time/easting/northings and the number of points classified as noise (a sketch of this check appears after this list).
  8. Look at a couple of lines to check that obvious noise has been classified.
  9. Check the DEM contents look OK
    • Check the DEM in ENVI
    • Check the DEM header resolution is the same as in the Read_Me file; trim the quoted resolution if it is in metres with unnecessary precision (e.g. 2.000124342 metres should be 2.0)
  10. Check the lidar separation from original ASTER data
    • run demcompare.py --lidar /ascii/folder/ -d lidar_patched_dem.dem -l UKBNG/UTM
    • measurements are in metres; check the masked statistics aren't too big (>2 metres for UK data, >8 metres for UTM). A sketch of this comparison appears after this list.
  11. Check the projection used is correct - especially the correct UTM zone for the data
  12. Check that the coverage of the DEM is sufficient for processing the hyperspectral data.
  13. Check the screenshots look OK
  14. Run proj_tidy.sh -c -p <path_to_project>. Check if any of the problems listed apply to your delivery.
  15. Add a comment on the ticket saying what problems there are / things that need resolving / things that you have resolved yourself / whether the data is ready to deliver
  16. If the delivery is ready to deliver, update the dataset on the Processing status page to "Ready to deliver".
    Full waveform deliveries only
  17. Check some of the full waveform LAS files. Either:
    • Use WaveViewer (this needs to be installed on your VM, see Procedures/WindowsBoxSoftwareInstallation for instructions), scroll through a few files and check that the waveform exists and the peaks follow the discrete points.
    • Test the files for waveform data using laszip:
      • To run on an individual file use laszip -i <input file> -waveform -o <output file>
      • If successful, two files will be produced for each las1.3 file and no errors will be reported. A sketch for looping this over a delivery appears after this list.
  18. If the fw_extractions folder is present: check the *_extractions.txt file and ensure all the listed folders/files are present. Check a couple of the ASCII files to ensure they are readable. Check the *_extractions.jpg looks OK.
  19. Use lasinfo to check the AGC value has been saved in the 'user_data' field (min and max values should be non-zero).
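
For step 7, a minimal sketch of the same checks check_ascii_lidar makes. The column positions (time, easting, northing first; classification last) and the input glob are assumptions; verify them against the ASCII format described in the readme.

  import glob

  NOISE_CLASS = 7  # standard LAS class code for low-point noise

  for path in glob.glob("flightlines/ascii_laser/*.txt"):  # placeholder path
      times, eastings, northings = [], [], []
      bad_records = noise = 0
      with open(path) as f:
          for line in f:
              fields = line.split()
              if len(fields) != 9:
                  bad_records += 1
                  continue
              times.append(float(fields[0]))
              eastings.append(float(fields[1]))
              northings.append(float(fields[2]))
              if int(float(fields[-1])) == NOISE_CLASS:
                  noise += 1
      print(f"{path}: {bad_records} malformed records, {noise} noise points")
      if times:
          print(f"  time {min(times)}-{max(times)}, "
                f"easting {min(eastings)}-{max(eastings)}, "
                f"northing {min(northings)}-{max(northings)}")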
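
For step 10, a sketch of a masked separation check in the spirit of demcompare.py: difference the patched lidar DEM against the ASTER DEM and report statistics over pixels valid in both. It assumes the two rasters share the same grid and that zero marks no-data; file names are placeholders.

  import numpy as np
  from osgeo import gdal

  lidar = gdal.Open("lidar_patched_dem.dem").ReadAsArray().astype(float)
  aster = gdal.Open("aster_dem.dem").ReadAsArray().astype(float)  # placeholder

  mask = (lidar != 0) & (aster != 0)  # assumes 0 marks no-data in both rasters
  diff = lidar[mask] - aster[mask]
  print(f"masked separation: mean {diff.mean():.2f} m, "
        f"std {diff.std():.2f} m, max abs {np.abs(diff).max():.2f} m")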
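
For step 17, a sketch that loops the laszip waveform test over a delivery's LAS 1.3 files; the input glob and output directory are placeholders.

  import glob
  import os
  import subprocess

  for las in glob.glob("flightlines/las1.3/*.LAS"):  # placeholder path
      out = os.path.join("/tmp", os.path.basename(las))
      result = subprocess.run(["laszip", "-i", las, "-waveform", "-o", out],
                              capture_output=True, text=True)
      if result.returncode != 0:
          print(f"{las}: laszip reported an error:\n{result.stderr}")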

Digital Photography Data

  1. Verify we have the correct PI (check against application, call Ops if unsure)
  2. Check against the readme that all expected data are present.
  3. Check that all other folders and contents are present.
    • Only the PDF read me file should be present (no .tex, .out, .aux, .log files)
  4. Check for typos, etc. in the documentation.
  5. View all of the thumbnail images. Look for any that are very over- or under-exposed, or not relevant to the project. This could include: very bright images, very dark images, images of the instrument bay door, or images not overlapping with any of the Eagle/Hawk data
  6. Check one (or more) of the photographs for tagging information: exiftool photographs/FILENAME
  7. View one of the photographs to check it opens OK.
  8. Check the file sizes of the photographs look reasonable (224M for RCD105, ~750 MB for Phase One iXU-RS) and that there are no _tmp photo files. For the RCD105, this bash command will list the distinct photograph sizes in descending order: ls -l | awk '{print $5}' | sort -rn | uniq. Only the 224M size should appear in the output.
  9. Run proj_tidy.sh -c -p <path_to_project>. Check if any of the problems listed apply to your delivery.
  10. Open the kml file in Google Earth and check it looks good and that thumbnails are displayed when the push pins are clicked.
    • If thumbnails are not visible, open the kml file in a text editor. Check the image source is the thumbnails directory and that image filenames are correct (a sketch of this check appears after this list).
  11. Check filenames correspond. If more than one project has been flown, check that the Julian day has a letter after it (e.g. 298b) in the photograph, thumbnail and eventfile directories. Also check that filenames in the eventfile include the Julian day letter.
  12. Add a comment on the ticket saying what problems there are / things that need resolving / things that you have resolved yourself / whether the data is ready to deliver.
  13. If the delivery is ready to deliver, update the dataset on the Processing status page to "Ready to deliver".
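
For steps 10 and 11, a minimal sketch that checks the image references in the kml point at existing thumbnails. The kml file name and the <img src="..."> convention inside the placemark descriptions are assumptions about how the delivery kml is built.

  import os
  import re

  kml = "photographs.kml"  # placeholder name
  with open(kml) as f:
      content = f.read()

  for src in re.findall(r'<img[^>]+src="([^"]+)"', content):
      if "thumbnails" not in src:
          print(f"image not sourced from the thumbnails directory: {src}")
      elif not os.path.exists(os.path.join(os.path.dirname(kml) or ".", src)):
          print(f"missing thumbnail file: {src}")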