= Delivery Checking =

Once a dataset has been prepared for delivery, a second person should go over it and perform the checks below. These should be performed as your own user (rather than arsf or airborne), except where necessary. This minimises the chances of data being unintentionally overwritten or deleted.

'''Note: if this is your first time doing a delivery check, please get a third (experienced) person to perform a second check when you're done.'''

== Hyperspectral Data ==

 1. Verify we have the correct PI (check against the application; call Ops if unsure).
 1. Check against the logsheet that all data that should be present is.
   * Number of files and sensible file sizes
 1. Check that [wiki:Processing/FilenameConventions#HyperspecDel all other folders] and contents are present.
   * Only the PDF read me file should be present (no .tex, .out, .aux or .log files)
   * Check that there are ASCII view vector files in the sensor FOV directories for each sensor (multiple ones if different binning has been used)
   * '''Note that there should not be a flightlines/fodis directory for deliveries 2013 and later'''
   * No '*.aux.xml' files (created by GDAL) should be present
 1. Check for typos, etc. in the documentation.
 1. Check that the xml files in project_information and line_information look OK in a browser, and that the information within them looks correct.
 1. Ensure the text files are Windows compatible (use the 'file' command on the .txt files: they should be ASCII with CRLF terminators).
   * Running {{{find -name "*.txt" | xargs file}}} from the delivery directory will help with this.
 1. Test the read me level 3 processing command for all of the level 1 files: run {{{check_apl_cmd -c }}} in the '''__hyperspectral__''' delivery directory. This outputs tif files to a tmp_cmd_check directory; check the tif files in gtviewer or tuiview. If they are OK, remove 'tmp_cmd_check'.
 1. Check each of the screenshots.
 1. Run {{{proj_tidy.sh -c -p }}}. Check if any of the problems listed apply to your delivery.
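The Windows line-terminator check can be scripted rather than eyeballed. A minimal sketch, assuming GNU {{{file}}} and bash; the function name is illustrative, not an existing delivery script:

```shell
# check_crlf DIR - list .txt files under DIR that are NOT Windows
# compatible, i.e. where 'file' does not report CRLF line terminators.
# Any file printed here still needs converting (e.g. with unix2dos).
check_crlf() {
    find "$1" -name '*.txt' | while read -r f; do
        file "$f" | grep -q 'CRLF' || echo "NEEDS CONVERTING: $f"
    done
}
```

Files it flags can be converted with {{{unix2dos}}} and then re-checked.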
 1. Run {{{check_bil_size.py -d }}}. This will list any BIL files that don't match the size computed from the corresponding HDR file.
 1. All level 1 files should be looked at visually using fastQC.
 1. If any pixels appear to be consistently bad (c.f. the 2009(?) dust on lens problems) then return to the processor for re-processing the calibration using a list of additional bad pixels to mask (see aplcal -qcfailures).
   * In this case additional information should be added to the read me to explain why they are being masked.
 1. Check that the underflows and overflows tabulated in the read me are sensible. Rerun autoQC with different settings or use fastQC if not.
 1. '''Check the z profile''' (spectral profile) over some green vegetation; it should look plausible. Check this for each level 1 file. You can easily and quickly run [wiki:Processing/Py6S_vs_Hyperspectral Py6S for every flightline and check it by following the instructions on the wiki], or you can look at some vegetation in fastQC ([wiki:Procedures/DeliveryChecking/ExampleFenixSpectra see here for example spectra]).
 1. Check that all of the level 3 mapped files open and look OK (no need to go through all the bands).
 1. Make sure that all bands have been mapped (not just 3). You can use {{{check_num_bands.py *bil}}} to check the number of bands for all files.
 1. Check the projection used is correct - especially the correct UTM zone for the data.
 1. If the delivery is ready to deliver, update the dataset on the [https://nerc-arf-dan.pml.ac.uk/status/edit/edit.php Processing status page] to "Ready to deliver".
 1. If everything is ready to deliver, i.e. no more changing of read me files, then in the sensor delivery directory run the script `zip_mapped.sh`.
   * This should zip each mapped file and its header into a zip file. This will be done on the grid. If it fails, an email will be sent to `nerc-arf-code@pml.ac.uk`. You should also '''check the log files named zip* in /logfiles before delivering the data'''.
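For reference, the comparison that the BIL size check makes can be reproduced by hand. A sketch only, assuming the usual ENVI convention that a BIL file's size is lines × samples × bands × bytes per sample (the real check_bil_size.py may handle more cases, and the numbers below are purely illustrative):

```shell
# Expected BIL size in bytes, from the HDR fields: lines, samples,
# bands, and the bytes per sample implied by the ENVI 'data type'
# code (e.g. data type 12 = unsigned 16-bit integer = 2 bytes).
# Compare the result against the on-disk size from 'stat -c %s'.
expected_bil_size() {  # args: lines samples bands bytes_per_sample
    echo $(( $1 * $2 * $3 * $4 ))
}
expected_bil_size 1000 620 361 2   # -> 447640000
```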
 1. Add a comment on the ticket saying what problems there are / things that need resolving / things that you have resolved yourself / whether the data is ready to deliver.

== LiDAR Data ==

 1. Verify we have the correct PI (check against the application; call Ops if unsure).
 1. Check against the logsheet that all data that should be present is.
 1. Check that [wiki:Processing/FilenameConventions#Lidardeliveries all other folders] and contents are present.
   * Only the PDF read me file should be present (no .tex, .out, .aux or .log files)
 1. Check for typos, etc. in the documentation.
 1. Ensure the text files are Windows compatible (use the 'file' command on the .txt files: they should be ASCII with CRLF terminators). You can use {{{find . -name "*.txt" | xargs file}}} to check all files.
 1. Check all the LiDAR files open/load, look OK and fit together horizontally and vertically. Either:
   * Use lag to view the data
 1. Run the script check_ascii_lidar, which will read in the ASCII files and check all points have the correct number of records (9), report the min/max of the time/easting/northings and the number of points classified as noise.
 1. Look at a couple of lines to check that obvious noise has been classified.
 1. Check the DEM contents look OK.
   * Check the DEM in envi
   * Check the DEM header resolution is the same as in the Read_Me file - trim the read me hdr resolution if in metres and of unnecessary precision (e.g. 2.000124342 metres should be 2.0)
 1. Check the lidar separation from the original ASTER data.
   * run [wiki:Processing/demcompare demcompare.py] --lidar /ascii/folder/ -d lidar_patched_dem.dem -l UKBNG/UTM
   * measurements are in metres; check they aren't too big (>2 metres for UK data, >8 metres for UTM) on the masked statistics.
 1. Check the projection used is correct - especially the correct UTM zone for the data.
 1. Check that the coverage of the DEM is sufficient for processing the hyperspectral data.
 1. Check the screenshots look OK.
 1. Run {{{proj_tidy.sh -c -p }}} and check if any of the problems listed apply to your delivery.
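The record-count and noise checks that check_ascii_lidar performs can be approximated with awk if you want to spot-check a file by hand. This is a sketch only, and it assumes GPS time in column 1 and the classification in column 9 with class 7 = noise; the real script's column layout and output may differ:

```shell
# check_points FILE - report records with the wrong field count (not 9),
# the time range, and the number of points classified as noise
# (class 7 assumed) in one ASCII point file. Records with a bad field
# count are excluded from the time range.
check_points() {
    awk 'NR == 1   { tmin = $1; tmax = $1 }
         NF != 9   { bad++; next }
         $1 < tmin { tmin = $1 }
         $1 > tmax { tmax = $1 }
         $9 == 7   { noise++ }
         END { printf "bad=%d tmin=%s tmax=%s noise=%d\n",
                      bad + 0, tmin, tmax, noise + 0 }' "$1"
}
```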
 1. Add a comment on the ticket saying what problems there are / things that need resolving / things that you have resolved yourself / whether the data is ready to deliver.
 1. If the delivery is ready to deliver, update the dataset on the [https://nerc-arf-dan.pml.ac.uk/status/edit/edit.php Processing status page] to "Ready to deliver".

[[BR]]'''Full waveform deliveries only'''[[BR]]

 1. Check some of the full waveform LAS files. Either:
   * Use WaveViewer (this needs to be installed on your VM, see [wiki:Procedures/WindowsBoxSoftwareInstallation] for instructions), scroll through a few files and check that the waveform exists and the peaks follow the discrete points.
   * Test the files for waveform data using laszip:
     * To run on an individual file use {{{laszip -i -waveform -o }}}
     * If successful, two files will be produced for each las1.3 file and no errors will be raised.
 1. If the fw_extractions folder is present: check the ***_extractions.txt file - ensure all the listed folders/files are present. Check a couple of the ASCII files to ensure they are readable. Check the ***_extractions.jpg looks OK.
 1. Use lasinfo to check the AGC value has been saved in the 'user_data' field (min and max values should be non-zero).

== Digital Photography Data ==

 1. Verify we have the correct PI (check against the application; call Ops if unsure).
 1. Check against the read me that all data that should be present is.
 1. Check that [wiki:Processing/FilenameConventions#Photographicdeliveries all other folders] and contents are present.
   * Only the PDF read me file should be present (no .tex, .out, .aux or .log files)
 1. Check for typos, etc. in the documentation.
 1. View all of the thumbnail images. Look for any that are very over/under exposed or not relevant to the project. This could include: very bright images, very dark images, images of the instrument bay door, or images not overlapping with any of the Eagle/Hawk data.
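The "only the PDF read me should be present" check is easy to automate. A minimal sketch (the function name is illustrative); anything it prints is a LaTeX build leftover that must be removed before delivery:

```shell
# find_tex_leftovers DIR - list LaTeX build leftovers (.tex, .aux,
# .out, .log) anywhere under DIR; only the PDF read me itself
# should ship with the delivery.
find_tex_leftovers() {
    find "$1" -type f \( -name '*.tex' -o -name '*.aux' \
                      -o -name '*.out' -o -name '*.log' \)
}
```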
 1. Check one (or more) of the photographs for tagging information: `exiftool photographs/FILENAME`
 1. View one of the photographs to check it opens OK.
 1. Check the file sizes of the photographs look reasonable (224M) and there are no _tmp photo files. This bash command will list the distinct sizes of the photographs in descending order: `ls -l | awk '{print $5}' | sort -rn | uniq` There should only be 224M in the output.
 1. Run {{{proj_tidy.sh -c -p }}}. Check if any of the problems listed apply to your delivery.
 1. Open the kml file in Google Earth and check it looks good and that thumbnails are displayed when the push pins are clicked.
   * If thumbnails are not visible, open the kml file in a text editor. Check the image source is the thumbnails directory and that the image filenames are correct.
 1. Check filenames correspond. If more than one project has been flown, check that the julian day has the letter after it (e.g. 298b). This applies to the photograph, thumbnail and eventfile directories. Also check filenames in the eventfile have the julian day letter included.
 1. Add a comment on the ticket saying what problems there are / things that need resolving / things that you have resolved yourself / whether the data is ready to deliver.
 1. If the delivery is ready to deliver, update the dataset on the [https://nerc-arf-dan.pml.ac.uk/status/edit/edit.php Processing status page] to "Ready to deliver".

----
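The photograph size check above can be made a one-liner that doesn't depend on parsing {{{ls}}} output. A sketch assuming GNU coreutils {{{stat}}}; the function name is illustrative and the directory is whichever photographs directory the delivery uses:

```shell
# distinct_sizes DIR - print each distinct file size (in bytes) found
# directly under DIR, largest first, one per line. A clean delivery
# should print a single value.
distinct_sizes() {
    stat -c '%s' "$1"/* | sort -rn | uniq
}
```

If more than one size appears, look for truncated photographs or leftover _tmp files.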