Changes between Version 103 and Version 104 of Procedures/EndUserDelivery

Sep 23, 2009, 5:07:39 PM (14 years ago)

= Data Delivery =
== Preparation ==
== Hyperspectral deliveries ==
=== What should be included ===
 1. level 1 data that has been mapped to level 3 successfully and verified as correctly geolocated (as far as possible)
 1. a readme file describing the contents of the delivery
 1. flight logsheet
 1. screenshots (level 3)
   * one of each line (no vectors)
   * a mosaic of all lines for a single sensor, one with and one without vectors overlaid
   * each screenshot should show typical or noteworthy characteristics of the image, e.g. any distortion or error, or a typical part of the scene
 1. (2008 onwards?) a DEM file, where we have one and can give it to the user
 1. any ancillary files (spheroid separation grids for non-UK flights, OS GB correction factors, etc.)
=== Scripted Approach ===
 1. Run the script to convert the level 3 TIFF files to JPEGs.
   * To convert all TIFF files in a directory to JPEGs (keeping the TIFFs as well):
     * `-d <dir containing tiffs> -a`
   * To convert only the file named filename:
     * `-f <filename> -a`
 1. Create the delivery directory using
   * `<main project dir> <year> <proj-code>`
   * e.g. `~mark/scratch/GB08_35-2008_Example 2008 GB08-35`
   * This will create the delivery directory and populate it with your lev1 directory files and the other usual directories. It will not copy the misc directory, separation grids or a DEM, as these are not supplied by default in the delivery; they can be copied manually from the template directories below. The lev1 files will be renamed as follows: if the filename is xxx_gpt0.1.zzz, the renamed file will be xxx.zzz. This is based on the default file name structure, for example e129a121b_gpt0.0.bil. File names with more than one '_' will be renamed incorrectly. (I'll make this more robust at some point.)
 1. Copy the misc directory or other files (such as the JPEGs) if required for the delivery.
 1. Ensure that an ASCII version of the logsheet exists in the project admin/ directory. To create one, open the logsheet in OpenOffice and save it in .txt format. Also ensure no other file in the admin directory has a .txt extension. Then run the ReadMe generation script:
   * `-d <main project directory> -D <DEM-TYPE>`
   * `<DEM-TYPE>` is the type of DEM used (in capitals).
   * This should create a file named ReadMe.txt in your delivery folder. It WILL still need to be edited: flight line numbers must be added to the table at the top, and any references to vectors changed as required. Please still read through the final ReadMe to check for errors.
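The lev1 renaming rule above can be sketched in shell. The directory, filenames and sed pattern here are illustrative assumptions (this is not the delivery script itself, whose name is not given above):

```shell
# Sketch of the lev1 renaming rule: xxx_gpt0.1.zzz becomes xxx.zzz.
# /tmp/lev1_demo and its files are hypothetical stand-ins; the sed
# pattern is our guess at the "_gpt<version>" token being stripped.
mkdir -p /tmp/lev1_demo
cd /tmp/lev1_demo
touch e129a121b_gpt0.0.bil e129a121b_gpt0.0.bil.hdr

for f in *_gpt*; do
    # e129a121b_gpt0.0.bil -> e129a121b.bil
    new=$(printf '%s\n' "$f" | sed 's/_gpt[0-9]*\.[0-9]*//')
    mv "$f" "$new"
done
ls
```

As the text warns, names containing a second '_' would not be handled correctly by this simple pattern either.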
=== Manual Approach ===
There are delivery template directories available (copy these and populate):
 2006::
  ~arsf/arsf_data/in_progress/2006/delivery_template/
 2007::
  ~arsf/arsf_data/in_progress/2007/delivery/template/
 2008::
  ~arsf/arsf_data/in_progress/2008/delivery/template/
The delivery data should be created in the project directory named '''`delivery/YYYYMMDD/PROJECTCODE/...`''' (e.g. delivery/20071215/GB07-05 for a delivery of GB07-05 created on 15/Dec/2007).
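The manual approach amounts to something like the following; the /tmp paths are hypothetical stand-ins for the real template and project directories:

```shell
# Manual approach sketch: copy the year's template into a delivery
# directory named delivery/YYYYMMDD/PROJECTCODE. All paths here are
# hypothetical stand-ins for the real ~arsf directories.
TEMPLATE=/tmp/template_demo    # stands in for .../2008/delivery/template/
PROJ=/tmp/GB08-35_demo         # stands in for the project directory
mkdir -p "$TEMPLATE/doc" "$TEMPLATE/screenshots"

DEST="$PROJ/delivery/20081215/GB08-35"   # delivery created 15/Dec/2008
mkdir -p "$DEST"
cp -r "$TEMPLATE"/. "$DEST"/
ls "$DEST"
```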
=== Necessary comments to make in the readme file ===
 1. Update anything marked TODO in the template
 1. Appropriate pixel size
 1. Appropriate bands for the sensor
 1. Comments on the quality of the data (accuracy vs vectors, any bad bands, etc.) and any specific problems encountered
 1. Include a tested example command to convert the data from level 1 to level 3
 1. Ensure the text file is in Windows format (run `unix2dos` on it if necessary)
 1. ''(list of other things to change)''
----------
== LIDAR deliveries ==
=== What should be included ===
 1. ASCII LIDAR pointcloud data
 1. flight logsheet
 1. readme file describing the data set, plus copyright notice
 1. screenshot of a mosaic of all lines (full resolution) and a zoomed image with vector overlay
 1. data quality report and further processing scripts
 1. DEM of the LIDAR data, usually gridded to 2m resolution; use SRTM to fill holes and extend lines
 1. screenshot of the DEM
=== Procedure for creating a LIDAR data set delivery ===
 1. Copy the template directory over to the project directory. The template directory is at `~arsf/arsf_data/2009/delivery/lidar/template`
 1. Convert the LAS binary files into ASCII files, ensuring all the appropriate information is output
   * run `las2txt --parse txyzicra <lidarfilename> <outputfilename>` for each file, outputting to the ascii_laser directory (may not work with LAS 1.1)
   * If not already done, rename the files to the convention "LDR-PPPP_PP-yyyydddnn.txt" (details in the readme)
 1. Create a full resolution JPEG of a mosaic of all the LIDAR lines, plus a separate one with vector overlays, and put them in the screenshot directory (with [wiki:Processing/CreateTifs])
 1. Go through the readme file and edit as required
   * Enter all the project details into the table at the top
   * Fill in the contents: add the lidar filenames on the left and the line numbers (from the logsheet) on the right
   * Enter the projection and datum details - get these from the ALS PP output dialog when processing
     * UK flights should be - Horizontal Datum: "ETRF89", Vertical Datum: "Newlyn", Projection: "OSTN02 British National Grid"
   * Insert statistics for each line: run `lasinfo --check <lidarfilename>` then cut/paste the required information
   * Check the accuracy of the data against vectors and estimate the average offset, and also the average elevation offset between neighbouring lines
   * Search for 'TODO' and change anything that remains to be done
 1. Ensure the readme file is in Windows format (run `unix2dos` on it)
 1. Make sure the correct, up-to-date data quality report (PDF version) is included in the docs directory
 1. Include the flight logsheet with the delivery
 1. If a DEM was created for processing, include it in the delivery; otherwise create one and include it. Create a screenshot of it also.
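The conversion and renaming steps above could look roughly like the loop below. The project code (GB08_35) and julian day (354) are invented, and `las2txt` is echoed rather than executed so the sketch runs without LAStools installed:

```shell
# Sketch: convert each LAS file to ASCII with las2txt, naming outputs
# to the LDR-PPPP_PP-yyyydddnn.txt convention. Input files, project
# code and date are hypothetical stand-ins; the las2txt command is
# echoed (dry run) rather than executed.
mkdir -p /tmp/lidar_demo/ascii_laser
cd /tmp/lidar_demo
touch line01.LAS line02.LAS            # stand-in input files

n=1
for las in *.LAS; do
    out=$(printf 'ascii_laser/LDR-GB08_35-2008354%02d.txt' "$n")
    echo las2txt --parse txyzicra "$las" "$out"
    n=$((n + 1))
done
```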
------------------------
== Verification ==
Once a dataset has been prepared for burning and delivery, a second person should go over it and:
=== For Hyperspectral Data ===
 1. Verify we have the correct PI (check against the application; call Gloucester if unsure) and address (application, Google, Gloucester)
 1. Check against the logsheet that all data that should be present is
 1. Check for typos, etc. in the documentation
 1. Ensure the readme and other text files are Windows compatible (use the `file` command on the .txt files: they should be ASCII with CRLF terminators; run `unix2dos ReadMe.txt` if not)
 1. Test out the readme's level 3 processing command for one or more of the level 1 files (definitely one per sensor)
 1. All level 1 files should be looked at visually (ideally in all bands, though this is only practical for ATM and casi)
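The Windows-format check can be scripted. This sketch assumes a hypothetical delivery directory, and falls back to GNU sed only if `unix2dos` is not installed:

```shell
# Sketch: check every .txt in a delivery is ASCII with CRLF terminators,
# converting if not. /tmp/delivery_demo and ReadMe.txt are stand-ins.
mkdir -p /tmp/delivery_demo
printf 'Data Delivery ReadMe\nLine two\n' > /tmp/delivery_demo/ReadMe.txt

for txt in /tmp/delivery_demo/*.txt; do
    if ! file "$txt" | grep -q 'CRLF'; then
        # Prefer unix2dos; fall back to GNU sed if it is absent
        unix2dos "$txt" 2>/dev/null || sed -i 's/$/\r/' "$txt"
    fi
    file "$txt"   # should now report CRLF line terminators
done
```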
=== For LIDAR Data ===
 1. Verify we have the correct PI (check against the application; call Gloucester if unsure) and address (application, Google, Gloucester)
 1. Check against the logsheet that all data that should be present is
 1. Check for typos, etc. in the documentation
 1. Ensure the readme and other text files are Windows compatible (use the `file` command on the .txt files: they should be ASCII with CRLF terminators; run `unix2dos ReadMe.txt` if not)
 1. Open all the ASCII LIDAR files in a GIS and check they open/load OK
 1. Check overlaps with neighbouring lines for obvious errors
 1. Check against vectors if available
 1. Use tail/head to check the ASCII LIDAR files have all the required columns in the order specified in the readme.txt
   * `head <lidarfilename>`
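The column check might look like this. The sample file and its values are invented, and 8 columns is assumed from the `txyzicra` parse string used earlier (t x y z i c r a):

```shell
# Sketch: verify each ASCII LIDAR file has the expected 8 columns on
# both its first and last lines. The demo file is a hypothetical
# stand-in with made-up coordinate values.
mkdir -p /tmp/lidar_check
printf '%s\n%s\n' \
    '429086.21 276301.50 123.40 101.2 12 1 1 -4' \
    '429090.73 276310.11 124.02 98.7 11 1 1 -4' \
    > /tmp/lidar_check/LDR-demo.txt

for f in /tmp/lidar_check/*.txt; do
    first=$(head -n 1 "$f" | awk '{print NF}')
    last=$(tail -n 1 "$f" | awk '{print NF}')
    echo "$f: first line $first columns, last line $last columns"
done
```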
Once the delivery has been checked (and corrected if necessary), the project directory (all of it, not just the delivery) should be rsynced back to the repository:
 1. su to arsf
 1. cd to the relevant project directory under ~arsf/arsf_data/
 1. Run `rsync -av --dry-run <processing_directory> ./`
   * This should give you a list of files that it thinks have been updated - check this looks sensible
 1. If there are any files in the list that shouldn't be rsynced back (e.g. screenshots in the wrong place, test scripts), take them out of the rsync by adding `--exclude <exclude_pattern>`. Then re-run the dry-run rsync and check that the excluded files are no longer in the list.
 1. Once you're happy with the list of files to be copied to the repository, re-run the rsync command without the `--dry-run` argument - this will actually copy the files.
 1. Put a comment in the ticket to say where you've rsynced the files back to.
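Steps 3-5 above can be sketched as follows, with /tmp stand-ins replacing the real processing and repository directories:

```shell
# Sketch of the rsync-back steps. Both directories are hypothetical
# stand-ins for the processing copy and the ~arsf repository copy.
PROC=/tmp/processing_demo
REPO=/tmp/repo_demo
mkdir -p "$PROC/delivery" "$PROC/test_scripts" "$REPO"
touch "$PROC/delivery/ReadMe.txt" "$PROC/test_scripts/scratch.sh"

cd "$REPO"
# 1. Dry run first and inspect the list of files rsync would copy:
rsync -av --dry-run "$PROC" ./
# 2. Exclude anything that should not go back, then copy for real:
rsync -av --exclude 'test_scripts' "$PROC" ./
```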
------------------------
== Final steps ==
If you wish to burn the data to DVD please see the notes at the bottom of this page.
=== Preparing hard disks for delivery ===
 1. Add a comment on the trac ticket for this project that this data has been delivered (describe what was sent and when).
 1. Update the status database to reflect the delivered data
 1. Update the [wiki:Internal/Disk_Locations page saying who has our disks] (if sent on disk)
Dear <PI>,

''(Use whichever of the following paragraphs matches the delivery medium.)''

This is to notify you that we have just dispatched your ARSF data on a USB hard disk formatted with the Linux ext3 filesystem. Please let us know when the data arrive so we know not to resend. We'd appreciate it if you could copy the data onto your system and return the disk to us (see below for address) for reuse.

This is to notify you that we have just dispatched your ARSF data on DVDs. Please let us know when the data arrive so we know not to resend.

This is to notify you that we have placed your ARSF data on an FTP server accessible at <FTPADDRESS>. Please let us know when you have downloaded the data so we can reuse the space.

The delivery comprises:
 - ATM flight lines taken on <DATE>
 - CASI flight lines taken on <DATE>
 - Specim Eagle flight lines taken on <DATE>
 - Specim Hawk flight lines taken on <DATE>
 - LIDAR flight lines taken on <DATE>

<NOTES - any other notes, including what data is held back>

Information about the processing of the project is included with the delivery, but our internal notes are also available at:
=== Next stage ===
 1. Set the ticket component to "Archiving" and put a comment in the ticket that it's ready for local and NEODC archiving.