Changes between Version 126 and Version 127 of Procedures/NewDataArrival
Timestamp: Oct 1, 2010, 12:26:37 PM
Procedures/NewDataArrival
* If happy, either re-run with --execute (and optionally --verbose)
* Each project directory should be re-formatted to the current standard

=== Non-scripting method ===

…

== Verification ==

=== Unpack File Check ===

In each project directory run `unpack_file_check.py -l <admin/logsheet.doc(.txt)>`
* This will convert the .doc logsheet to .txt, or use the .txt if one is available. NOTE: converting the .doc requires an ooffice macro.
* It will then run various checks of the data against the logsheet, as listed below. Information is output to the terminal; important (error) messages are printed again at the end.
  * Check file sizes against a 'suitable' size and also against the header file (Eagle + Hawk)
  * Check the number of files against the logsheet
  * Check the number of logsheets
  * Check GPS start/stop times in the header file (Eagle + Hawk)
  * Check .raw, .nav, .log and .hdr for each Eagle + Hawk line
  * Check for nav-sync issues - THIS IS BASED ON A FALSE ASSUMPTION AND PROBABLY IS USELESS

=== proj_tidy.sh ===

In the base of the project directory run proj_tidy:[[BR]]
`proj_tidy.sh -d ./ -e`[[BR]]
This will check that the directory structures look sensible and tell you where they don't. Correct anything significant that gets flagged up.

=== Final manual checks ===

Look at the logsheet and verify that we have copies of all relevant data mentioned there.
* In some cases the flight crew may fly two projects back-to-back but enter all the data onto a single logsheet. If so, you may need to split the project directory into two, particularly if there is a large time gap (navigation needs separate processing) or the PIs are different (different delivery addresses/tracking). If you do need to split a project, ensure both copies have copies of the common files (logsheets, rinex, etc.) but that non-common files are not duplicated (i.e. don't include Hawk data for part 1 in part 2). Also note in the ticket what was done, for tracking purposes.

…

Check the file sizes of all data files (Eagle, Hawk, ATM, CASI) to make sure none are zero bytes (or obviously broken in some way).
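As a quick aid for that file-size check, a minimal sketch along the following lines will list any zero-byte files under the project directory. This is an illustration, not part of the official procedure; the file extensions shown are taken from the Eagle/Hawk file types mentioned above and may need adjusting for ATM/CASI data.

{{{
# Hedged sketch: list zero-byte data files under the current project directory.
# Extensions are assumptions based on the file types listed in this section.
find . -type f \( -name '*.raw' -o -name '*.nav' -o -name '*.log' -o -name '*.hdr' \) -size 0 -print
}}}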
== Move to permanent PML data storage ==

…

2. Copy project structure with sym links to ~arsf/workspace. You can either do this manually or with:

 `fastcopy_arsf_proj_dir.sh <source project directory> <target project directory>`

Note: Be logged in as airborne when you create the directory and the initial files in the workspace, or you won't have write permission.
* '''Obsolete: Only do this to generate old-style scripts.''' For each project directory run `generate_qsub.py -d <project directory>`

Run the DEM generating script if a UK flight: `nextmapdem.sh`

If the flight is outside the UK you'll need to process the LiDAR data first or use the SRTM DEM (see http://arsf-dan.nerc.ac.uk/trac/wiki/Processing/SRTMDEMs).

Create the calibration sym link. This is done automatically if the unpacking scripts have been run.

(Not necessary any more) Run the times4grafnav.py script to create a text file of time stamps and put the output into the applanix directory. May be of use for the GNSS processing.
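For reference, a hedged sketch of the copy-and-prepare sequence above for a UK flight might look like the following. The `<project>` placeholders and the calibration target path are assumptions, not the definitive layout, and the sym link is normally created automatically by the unpacking scripts.

{{{
# Hedged sketch only -- placeholder paths, run as the airborne user.
fastcopy_arsf_proj_dir.sh <source project directory> ~arsf/workspace/<project>
cd ~arsf/workspace/<project>

# UK flights: generate the DEM.
nextmapdem.sh

# Calibration sym link (usually created automatically); the target path
# here is a hypothetical example.
ln -s <path/to/calibration> calibration
}}}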