
Data Delivery

Preparation

What should be included

  1. level 1 data that has been mapped to level 3 successfully and verified as correctly geolocated (as far as possible).
  2. a readme file describing the contents of the delivery
  3. flight logsheet
  4. screenshots (level 3)
    • one of each line (no vectors)
    • mosaic of all lines for a single sensor, one with and one without vectors overlaid
    • each screenshot should try to show typical or noteworthy characteristics of the image - e.g. any distortion or error, or a typical part of the scene
  5. (2008+?) a DEM file where we have one and can give it to the user
  6. any ancillary files (spheroid separation grids for non-UK flights, OS GB correction factors, etc)
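
As a purely hypothetical sketch of what a populated delivery might contain (the real layout comes from the delivery template directories below; the names here are illustrative only, not the template's actual structure):

delivery/20071215/GB07-05/
    ReadMe.txt       readme describing the contents (item 2)
    logsheet/        flight logsheet (item 3)
    level1/          level 1 data (item 1)
    screenshots/     level 3 screenshots and mosaics (item 4)
    dem/             DEM file, where available (item 5)
    ancillary/       separation grids, OS GB correction factors, etc (item 6)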

There are delivery template directories available (copy these and populate):

2006:: ~arsf/arsf_data/in_progress/2006/delivery_template/
2007:: ~arsf/arsf_data/in_progress/2007/delivery_template/
2008:: ~arsf/arsf_data/in_progress/2008/delivery_template/

The delivery data should be created in the project directory, under delivery/YYYYMMDD/PROJECTCODE/... (e.g. delivery/20071215/GB07-05 for a delivery of GB07-05 created on 15/Dec/2007).
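
A minimal sketch of copying the 2007 template into place (the project directory's location is assumed here and actual paths will vary):

# from within the project directory (location not specified on this page)
mkdir -p delivery/20071215
cp -r ~arsf/arsf_data/in_progress/2007/delivery_template delivery/20071215/GB07-05
# then populate and edit the copied files as described below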

Necessary comments to make in the readme file

  1. Update anything marked TODO in the template
  2. Give the appropriate pixel size
  3. Give the appropriate bands for the sensor
  4. Comments on the quality of the data (accuracy vs vectors, any bad bands, etc) and any specific problems encountered
  5. Include a tested example command to convert the data from level 1 to level 3
  6. Ensure the text file is in Windows format (run unix2dos on it if necessary; see the sketch after this list)
  7. (list of other things to change)
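
For item 6, a quick check-and-convert sketch (the wording of the `file` output varies between systems; the unix2dos usage matches the verification step below):

file ReadMe.txt        # "with CRLF line terminators" means it is already in Windows format
unix2dos ReadMe.txt    # otherwise, convert in place to CRLF line endings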

Verification

Once a dataset has been prepared for burning and delivery, a second person should go over it and:

  1. Check against the logsheet that all the data that should be present is actually there.
  2. Check for typos, etc in the documentation.
  3. Ensure the readme and other text files are Windows compatible (run unix2dos ReadMe.txt)
  4. Test the level 3 processing command on one or more of the level 1 files (at least one per sensor).
  5. Look at all level 1 files visually (ideally in all bands, though this is only practical for ATM and casi).

Final steps

Burning the data

  1. create an iso image with mkisofs -R -udf -verbose -o /tmp/dvd.iso -V <PROJECTCODE-JULIANDAY> <delivery_directory>
  2. test the iso with sudo mount -o ro,loop /tmp/dvd.iso /mnt/tmp - mounts to /mnt/tmp, go look at it, then unmount with sudo umount /mnt/tmp (yes, that is umount not unmount)
  3. if it's ok, burn to CD/DVD with sudo cdrecord dev=/dev/dvdwriter -sao fs=100m driveropts=burnfree -v /tmp/dvd.iso
  4. clean up with rm /tmp/dvd.iso
  5. check the dvd on another machine (ideally Windows); an optional checksum comparison against the image is sketched below
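
Optionally, the burned disc can be compared byte-for-byte against the image (a sketch only, using the same paths as above; if used, run it after burning but before step 4 removes /tmp/dvd.iso):

md5sum /tmp/dvd.iso
blocks=$(( $(stat -c %s /tmp/dvd.iso) / 2048 ))
dd if=/dev/dvdwriter bs=2048 count=$blocks | md5sum   # the two checksums should match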

Finalising hard disk

Set permissions and owner:

chmod a+rX,a-w -R .
chown root.root -R .
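
A quick sanity check of the result (a sketch; the -perm / form needs GNU find):

find . -perm /222     # should print nothing: no write bits remain
find . ! -perm -444   # should print nothing: everything is world-readable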

Cover letter

There should be a cover letter with all deliveries. Templates can be found at:

2006::
2007:: ~arsf/arsf_data/in_progress/2007/delivery_letter-2007-template.doc

Trac and website updating

Add a comment on the Trac ticket for this project noting that the data has been delivered (describe what was sent and when). Also add a quicklook to the ticket. Update the status database to reflect the delivered data.