
Data Delivery

Preparation

What should be included

  1. level 1 data that has been mapped to level 3 successfully and verified as correctly geolocated (as far as possible).
  2. a readme file describing the contents of the delivery
  3. flight logsheet
  4. screenshots (level 3)
    • one of each line (no vectors)
    • mosaic of all lines for a single sensor, one with and one without vectors overlaid
    • each screenshot should try to show typical or noteworthy characteristics of the image - e.g. any distortion or error, or a typical part of the scene
  5. (2008+?) a DEM file where we have one and can give it to the user
  6. any ancillary files (spheroid separation grids for non-UK flights, OS GB correction factors, etc)

Scripted Approach

  1. Run the script to convert the level 3 Tiff files to jpgs.
    • This will convert all tiff files in the directory to jpegs (keeping the tiffs as well)
      • gtiff2jpg.py -d <dir containing tiffs> -a
    • This will only convert the file named filename
      • gtiff2jpg.py -f <filename> -a
  2. Create the delivery directory using make_delivery_folder.sh
    • make_delivery_folder.sh <main project dir> <year> <proj-code>
    • e.g. make_delivery_folder.sh ~mark/scratch/GB08_35-2008_Example/ 2008 GB08-35
    • This will create the delivery directory and populate it with your lev1 directory files and the other usual directories. It will not copy the misc directory, separation grids or a DEM, as these are not supplied by default in the delivery; they can be copied manually from the template directories below. The lev1 files will be renamed as follows: if the filename is xxx_gpt0.1.zzz, the renamed file will be xxx.zzz. This is based on the default file name structure, e.g. e129a121b_gpt0.0.bil. File names containing more than one '_' will be renamed incorrectly. (I'll make this more robust at some point).
  3. Copy the misc directory or other files (such as the jpgs) if required for the delivery.
  4. Ensure that an ASCII version of the logsheet exists in the project admin/ directory. To create one, open the logsheet in OpenOffice and save it in .txt format. Also ensure that no other file in the admin dir has a .txt extension. Then run the readme generation script
    • create_readme.py -d <main project directory> -D <DEM>
    • This should create a file named ReadMe.txt in your delivery folder. It WILL still need to be edited: flight line numbers will need to be added to the table at the top of the readme, and any reference to vectors will have to be changed as required. Please still read through the final readme to check for errors.
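The renaming rule used in step 2 can be reproduced with plain shell parameter expansion. This is an illustrative sketch of the rule only, not the make_delivery_folder.sh script itself:

```shell
# illustrate the lev1 renaming rule: xxx_gpt0.1.zzz -> xxx.zzz
f="e129a121b_gpt0.0.bil"
base="${f%%_*}"        # everything before the first '_'
ext="${f##*.}"         # extension after the last '.'
echo "${base}.${ext}"  # e129a121b.bil
```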

Manual Approach

There are delivery template directories available (copy these and populate):

2006
~arsf/arsf_data/in_progress/2006/delivery_template/
2007
~arsf/arsf_data/in_progress/2007/delivery/template/
2008
~arsf/arsf_data/in_progress/2008/delivery/template/

The delivery data should be created in the project directory named delivery/YYYYMMDD/PROJECTCODE/... (e.g. delivery/20071215/GB07-05 for a delivery of GB07-05 created on 15/Dec/2007).
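The delivery path above can be generated on the command line; this sketch assumes GNU date and uses the example project code and date from the text:

```shell
# delivery path for project GB07-05 created on 15/Dec/2007
proj="GB07-05"
day=$(date -d "2007-12-15" +%Y%m%d)   # GNU date syntax
echo "delivery/${day}/${proj}"        # delivery/20071215/GB07-05
```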

Necessary comments to make in the readme file

  1. Update anything marked TODO in the template
  2. Appropriate pixel size
  3. Appropriate bands for the sensor
  4. Comments on the quality of the data (accuracy vs vectors, any bad bands, etc) and any specific problems encountered
  5. Include a tested example command to convert the data from level 1 to level 3
  6. Ensure the text file is in Windows format (run unix2dos on it if necessary)
  7. (list of other things to change)

Verification

Once a dataset has been prepared for burning and delivery, a second person should go over it and:

  1. Verify we have the correct PI (check against application, call Kidlington if unsure) and address (application, google, Kidlington)
  2. Check against the logsheet that all data that should be present is there.
  3. Check for typos, etc in the documentation.
  4. Ensure the readme and other text files are Windows compatible (use the 'file' command on the .txt files: they should be ASCII with CRLF terminators) (run unix2dos ReadMe.txt)
  5. Test out the level 3 processing command for one or more of the level 1 files (definitely one per sensor).
  6. All level 1 files should be looked at visually (ideally in all bands, though this is only practical for ATM and casi).
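If unix2dos is not installed, the line-terminator conversion and check in step 4 can be sketched with sed and file (the path here is an example):

```shell
# convert a unix-format text file to CRLF terminators (unix2dos equivalent;
# assumes the input does not already have CRLF endings)
printf 'line one\nline two\n' > /tmp/ReadMe.txt
sed -i 's/$/\r/' /tmp/ReadMe.txt
file /tmp/ReadMe.txt    # should report "ASCII text, with CRLF line terminators"
```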

Final steps

Burning the data

  1. create an iso image with mkisofs -R -udf -verbose -o /tmp/dvd.iso -V <PROJECTCODE-JULIANDAY> <delivery_directory>
  2. test the iso with sudo mount -o ro,loop /tmp/dvd.iso /mnt/tmp - mounts to /mnt/tmp, go look at it, then unmount with sudo umount /mnt/tmp (yes, that is umount not unmount)
  3. if it's ok, burn to CD/DVD with sudo cdrecord dev=/dev/dvd -sao fs=100m driveropts=burnfree -v /tmp/dvd.iso
  4. clean up with rm /tmp/dvd.iso
  5. check the dvd on another machine (ideally Windows)
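A further verification step is to compare checksums of the iso and the burned disc. The sketch below stands in a plain file copy for the disc; for a real check, substitute the DVD device (e.g. /dev/dvd) for /tmp/copy, reading only the iso's size so trailing device padding is ignored:

```shell
# stand-in for the iso and the burned disc (use the real device for a real check)
dd if=/dev/urandom of=/tmp/dvd.iso bs=1024 count=64 2>/dev/null
cp /tmp/dvd.iso /tmp/copy
size=$(stat -c%s /tmp/dvd.iso)
sum_iso=$(md5sum < /tmp/dvd.iso | cut -d' ' -f1)
sum_disc=$(head -c "$size" /tmp/copy | md5sum | cut -d' ' -f1)
[ "$sum_iso" = "$sum_disc" ] && echo "checksums match"
```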

Lightscribe

  1. open the CD template from ~arsf/arsf_data/in_progress/2007/delivery/cd_covers/arsf_cd_label_gimp_editable.xcf in gimp
  2. click on the text button in the toolbox, then click the GB... text and edit to reflect the correct project code
  3. save as a .bmp file somewhere, e.g. ~/ipy0708214ef.bmp
  4. on the lightscribe machine (currently pmpc974), run 4L-cli print /dev/sr0 ~/ipy0708214ef.bmp

If it appears to be stuck on "Starting up", it may still be burning - give it time to complete. For a dense full disk picture, wait at least 45mins!

If the drive seems wrong, try 4L-cli enumerate to list attached lightscribe compatible drives.

Preparing hard disks for delivery

You will need to have root permissions before you start.

To create a partition:

Unmount the disk, then run dmesg to find the device name (listed at the bottom if the disk was just plugged in)

run fdisk /dev/<device> (probably sdb)

enter 'p' to print partition table and check you have selected the correct disk
enter 'd' to delete the current partition
enter 'n' to create a new partition
enter 'p' to make the new partition a primary partition
enter 'p' to print new partition table – if all seems fine enter 'w' to write

To format the disk:

ensure disk is unmounted again

to be on the safe side, run dmesg again to make sure device name hasn't changed

run mke2fs -j /dev/<partition> (probably sdb1)

Change permissions:

make writable for everyone – chmod a+rwx /media/disk

Finalising hard disk

Set permissions and owner:

chmod a+rX,a-w -R .
chown root.root -R .
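The difference between 'X' and 'x' in the mode above matters: 'X' grants execute only on directories and on files that already have an execute bit, so data files end up read-only while directories stay traversable. A small demonstration with example paths:

```shell
mkdir -p /tmp/deliv/sub
echo data > /tmp/deliv/file.txt
chmod a+rX,a-w -R /tmp/deliv
stat -c '%A %n' /tmp/deliv/file.txt /tmp/deliv/sub
# file.txt becomes -r--r--r--, sub becomes dr-xr-xr-x
```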

Cover letter

There should be a cover letter with all deliveries. Templates can be found in ~arsf/arsf_data/in_progress/YYYY/delivery/

Trac and website updating

  1. Add a comment on the trac ticket for this project that this data has been delivered (describe what was sent and when).
  2. Also add a quicklook to the ticket.
  3. Update the status database to reflect the delivered data (http://www.npm.ac.uk/rsg/projects/arsf/status/addflight/editflight.php)
  4. Update the 'who has our disks?' page (if the data were sent on disk)

Email the PI

  1. email them that you're putting the disk in the post and ask for confirmation of receipt

Subject:

Notification of ARSF data delivery for <PROJECT CODE>

Body:

(note you need to fill in PI, DATE, NOTES, TICKETNO, also mention if you aren't sending all their data at once)

Dear <PI>,

--PICK ONE OF THE NEXT TWO SENTENCES (DVD or HDD)--
This is to notify you that we have just dispatched your ARSF data on a USB hard disk formatted with the Linux ext3 filesystem.  Please let us know when the data arrive so we know not to resend.  We'd appreciate it if you could copy the data onto your system and return the disk to us (see below for address) for reuse.  
This is to notify you that we have just dispatched your ARSF data on DVDs.  Please let us know when the data arrive so we know not to resend.

The delivery comprises:
 - ATM flight lines taken on <DATE>
 - CASI flight lines taken on <DATE>
 - Specim Eagle flight lines taken on <DATE>
 - Specim Hawk flight lines taken on <DATE>

<NOTES - any other notes, including what data is held back>

Information about the processing of the project is included with the delivery, but our internal notes are also available at:
 http://www.npm.ac.uk/rsg/projects/arsf/trac/ticket/<TICKETNO>

Regards,

ARSF Data Analysis Node.
Plymouth Marine Laboratory,
Prospect Place,
Plymouth.
PL1 3DH.
UK.

Email: arsf-processing@pml.ac.uk
Tel: +44 (0)1752 633432
Fax: +44 (0)1752 633101
Web (NERC): http://arsf.nerc.ac.uk/
Web (processing wiki): http://www.npm.ac.uk/rsg/projects/arsf/trac/