Version 90 (modified by tec, 9 years ago)
Archiving projects with NEODC
This page documents the procedure to follow when sending data to NEODC.
- Choose a project:
- A project is ready to be archived when all sensors have been delivered.
- Check the ticket and make sure nothing on it suggests something still needs to be done with the dataset (ask the processor if need be).
- Record in ticket that you are beginning to archive this flight.
- If there is a workspace version, move it into workspace/being_archived (pre-2011 only)
- Prepare the repository version for archiving:
- Run proj_tidy.sh to highlight any problems:
proj_tidy.sh -p <project directory> -c
Check the output. Delete hidden (except .htaccess or .htpasswd), temporary, and broken files. Fix incorrect file name formats and any other obvious errors. Everything in the delivery should be close to perfect, but don't worry too much about things in the main project.
- Make sure the delivery sensor data and the raw data (including navigation) are present. Run a quick eye over the rest of the deliveries.
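As an illustrative sketch only (the exact cleanup is a judgement call, and the paths below are made up), hidden files other than the web-access control files can be listed with find before deciding what to delete:

```shell
# Demonstration in a scratch directory (paths are illustrative only):
# list hidden files, excluding .htaccess/.htpasswd which must be kept.
demo=$(mktemp -d)
touch "$demo/.htaccess" "$demo/.DS_Store" "$demo/data.bil"
find "$demo" -type f -name '.*' ! -name '.htaccess' ! -name '.htpasswd'
rm -rf "$demo"
```

Review the list by eye before deleting anything; this only finds candidates.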
- Remove unwanted large files. The project should have been cleaned up by the processor, but often large files remain which are not needed. In particular, there are sometimes duplicates in processing/<sensor> of data which is included in the delivery. Free up as much space as possible by deleting unwanted large files. Don't delete anything in processing/kml_overview.
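One way to surface candidates for deletion (the 500 MB threshold is an arbitrary illustration, not policy) is:

```shell
# List files over 500 MB under processing/, skipping kml_overview,
# biggest first. The 500M threshold is illustrative; adjust as needed.
find processing -type f -size +500M -not -path '*kml_overview*' \
    -exec du -h {} + | sort -rh
```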
- Add a copy of the relevant trac ticket(s) and zip; run:
mkdir -p admin/trac_ticket
pushd admin/trac_ticket
wget --recursive --level 1 --convert-links --html-extension http://arsf-dan.nerc.ac.uk/trac/ticket/TICKETNUMBER
zip -r arsf-dan.nerc.ac.uk.zip arsf-dan.nerc.ac.uk
rm -fr arsf-dan.nerc.ac.uk
popd
- Set permissions:
- Remove executable bit on all files (except the point cloud filter and the run[aceh] scripts):
Note - if you are processing 2011 or later, you will need to run the below commands as both arsf and airborne.
find -type f -user `whoami` -not -wholename '*pt_cloud_filter*' -and -not -regex '.*/run[aceh]/.*sh' -and -perm /a=x -exec chmod a-x {} \;
- Give everyone read permissions (and execute if it has user execute) for the current directory and below:
find -user `whoami` -exec chmod a+rX {} \;
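A quick sanity check after the two chmod steps (assuming it is run from the project root) is to list any files that still carry an execute bit but should not:

```shell
# Lists files that still have an execute bit, excluding the point cloud
# filter and run[aceh] scripts which are meant to keep it.
# Expect no output if the permission fixes worked.
find . -type f -perm /a=x \
    -not -wholename '*pt_cloud_filter*' \
    -not -regex '.*/run[aceh]/.*sh'
```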
- If there are multiple deliveries for a sensor (apart from the APL reprocessing), then put all but the newest version in a subdirectory called previous_deliveries. Make sure the newest version is a complete delivery. Fill in missing data with hardlinks using cp -nl if necessary. Ask for help if not sure.
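For example, with hypothetical delivery directory names (both names below are made up), the older delivery can be moved aside and any files only it contains hard-linked into the newest one:

```shell
# Directory names are hypothetical examples.
mkdir -p previous_deliveries
mv hyperspectral-20130101 previous_deliveries/
# cp -n never overwrites existing files in the newest delivery;
# -l makes hard links instead of copies, so no extra disk is used.
cp -rnl previous_deliveries/hyperspectral-20130101/. hyperspectral-20130601/
```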
- Make sure the delivery directory is in the top level (otherwise use finalise_delivered_data.sh)
- If you move the delivery directory the KML overview links need to be updated to the new delivery folder location
- Delete the broken symlinks and run make_kmloverview.py
- Check the links in the processing/kml_overview folders, or use the link on the wiki, to open the Google Earth KML file. Check that the download links work and that there is data under each of the tabs.
- Upload the data to NEODC using neodc-archive.py. This will upload the data using rsync; if you are not running as airborne and haven't set up a password file, you will need to enter your password (listed on the Passwords page).
neodc-archive.py --year 2012 --jday 324
- If for any reason rsync fails you can also upload the data via FTP (note the general password, not the rsync one, is used) using:
lftp arsfdan@arrivals.ceda.ac.uk
> mirror -R -L /tmp/neodc_fEYiiE/ET12_14-2012_324_Boset_raw
- Notify NEODC that the data has been uploaded (current contact is: wendy.garland@…; cc arsfinternal).
- Record in the ticket that it has been uploaded to NEODC and include the date.
- When NEODC confirm they have backed up the data:
- Note in ticket that it has been backed up by NEODC.
- Change status page entry to "archived" for relevant sensors (if pre-2011 HS then only do this if both original and reprocessed HS have been archived).
- If workspace version present, delete from being_archived.
- Add the flight in the appropriate format and in the appropriate position in ~arsf/archived_flights.txt
- If all sensors have been archived (including reprocessing) then close the ticket. Otherwise note why the ticket is being left open.