= Eagle/Hawk Processing =

Once the [wiki:Procedures/NewDataArrival data have been unpacked] and the [wiki:Procedures/ProcessingChainInstructions/NavigationProcessing nav data have been processed], the Eagle/Hawk data need to be run through the [wiki:Processing/Flow AZ Systems processing chain] (primarily azspec, azimport, aznav and azgcorr).

== Creating the Config file ==

generate_runscripts.py is the script used to create the config file, which is in turn used to process the eagle and hawk lines. Before beginning, make sure the raw files are present, as well as a processed navigation (sbet) file in applanix/proc and a suitable DEM (see below). You may also need to create a text version of the logsheet.

Run with the -h flag to get full usage help. An example command is below:

`generate_runscripts.py -s s -n 5 -j 196 -y 2010 -l admin/196-10_Eufar10-03.txt`

Beware that the method is not robust and may fill in the wrong data details, especially for the per-line information.
For unknown variables (global or flight line specific) a "?" will be inserted in the config file and should be replaced by hand.

If no log sheet is available, one can use the keywords on the command line to specify the global variables, using quotes ("") if a value requires a space within the name:

`generate_runscripts.py -s s -n 4 --JDAY=200 --YEAR=2009 --PI="P.I. Name" --SITE="Over There" --PROJCODE=EX01_99`

Adding keywords to the command line and using a logsheet results in the command line keywords taking precedence over the logsheet keywords.
For a full list of keywords and their default values run: `generate_runscripts.py -h`.

To use the logsheet just for the global variables you can add --NOPERLINELOG to the command line. This will then use the logsheet for global variables, but not for per flight line values. For the per flight line values it will use liblogwriter.py and the raw eagle/hawk header and sbet files to extract average values for the speed/altitude/direction. Note that this method will name the scripts by the filename of the raw data rather than by flight line order on the logsheet. Use the --NOPERLINELOG parameter if you wish to manually add the per line data to the config file.

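For example, to reuse the earlier example command but take only the global variables from the logsheet (the project details are the same illustrative values as above):

`generate_runscripts.py -s s -n 5 -j 196 -y 2010 -l admin/196-10_Eufar10-03.txt --NOPERLINELOG`
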
Use the --EXCLUDE="n1 n2 n3 ... -1 m1 m2 m3... -1" parameter if you want to exclude line numbers from the config file.
In this case the linesNo given to -n can be the number of lines remaining once the excluded ones are left out.

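A made-up illustration (here flightlines 2 and 5 are excluded as a single -1-terminated group and -n is reduced from 5 to 3 to match; adjust the grouping to the "n1 n2 ... -1 m1 m2 ... -1" pattern your data needs):

`generate_runscripts.py -s s -n 3 -j 196 -y 2010 -l admin/196-10_Eufar10-03.txt --EXCLUDE="2 5 -1"`
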
== Processing the Lines ==

=== Easy Way ===

You should now have in the root of the project directory a .cfg file named with the year and julian day of the project. In order to run the project on the gridengine with default settings, run:

`specim_qsub.py <cfg_file>`

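For instance, for the illustrative 2010 / julian day 196 project used above the call might look like the following (the exact .cfg filename is whatever generate_runscripts.py created in the project root):

`specim_qsub.py 2010196.cfg`
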
By default this will do timing runs for each line on the gridengine using SCT offsets between -0.1 and 0.1.

If you have any problems, check the files created in logs. E.g.

EUFAR10-03_2010-196_eagle_-2.o293411[[BR]]
EUFAR10-03_2010-196_eagle_-2.e293411

The first one is an output file (hence the 'o') and the second is the error file ('e'). The last part of the name is the grid node job number.

Check these for errors (look for stars). Check the errors against the [#Problems example errors] below.

Now you need to find and record the correct SCT value for each flightline. Look for wobbles in straight lines and try to correct them. Once you have a value for each flightline, enter it as the start and end sct variables in the config file for that flightline. Run once more and the script will no longer delete the lev1s. These are the final product to go in the delivery.

Check against OS vectors if this is a UK project.
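
For instance, if an SCT offset of 0.2 gave the straightest features for a flightline, its config file entry would end up with matching start and end values, along the lines of (section name illustrative; the sct keys are the ones used in the config examples further down this page):

[eagle_-2][[BR]]
...[[BR]]
sctstart = 0.2[[BR]]
sctend = 0.2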

=== Old-fashioned way ===

If you're unlucky and the automated script fails for some reason, you may need to do at least some of the processing the old-fashioned way.

 1. cd to the project directory.
 1. Ensure that directories called "logs", "dem", "lev1" and "lev3" have been created (steps 2, 3 and 5 are sketched as shell commands after this list).
 1. If there isn't one already, create a "calibration" symlink to the calibration data: `ln -s ~arsf/calibration/<year> calibration`.
 1. Create a DEM for the project. If it's in the UK you can use [wiki:Processing/NextMapDEMs Nextmap data] (try running `nextmapdem.sh` in the project directory to do it automatically, otherwise see the [wiki:Processing/NextMapDEMs wiki page]). If it's non-UK, you'll need to use [wiki:Help/LeicaLidarDems LiDAR data] (you may wish to use this anyway if it's available - see [wiki:Processing/CreateTifs here] on how to do this using a script), or failing that [wiki:Processing/SRTMDEMs SRTM 90m data]. Copy it into the dem directory.
 1. Create a symlink to the SBET file if there isn't one already:
   1. `cd applanix`
   1. `ln -s Proc/sbet_01.out apa<year><jday>.sbet`
   1. `cd ..`
 1. Copy the sample config file from ~arsf/sample_scripts/<year> to the project directory - you need specim_qsub.py, process_specim_line.py and template_specim_config.cfg.
 1. Comparing the .cfg file with the logsheet, replace the entries that need to be replaced as appropriate. You should be able to see which bits these are in the sample scripts because they'll have keywords instead of values. You will need to create one config file section per flightline per sensor.
   * Note that dates must be of the form DD/MM/YY or DD/MM/YYYY (must use / as a separator)
   * Note that times must be of the form HH:MM:SS (must use : as a separator)
 1. Run the processing scripts. You can either do this via the gridengine (recommended) by running specim_qsub.py, or you can do it one line at a time on your own machine by running process_specim_line.py with appropriate arguments for each line/sensor combination from the root of the project directory. If you do the latter you should pipe the output to tee to ensure a log file is generated: `rune/e12301.sh 2>&1 | tee rune/e12301.log`.
 1. Check each set of flightlines to work out which has the best timing offset (i.e. has the straightest roads, etc). Make a note of the timing offset values in the ticket.
 1. Check against OS vectors.

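A minimal shell sketch of the setup in steps 2, 3 and 5, assuming you are already in the project directory (the year/julian day values are the illustrative ones used earlier on this page):

`mkdir -p logs dem lev1 lev3`[[BR]]
`ln -s ~arsf/calibration/2010 calibration`[[BR]]
`cd applanix`[[BR]]
`ln -s Proc/sbet_01.out apa2010196.sbet`[[BR]]
`cd ..`
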
'''dem?'''

To return sensible results you will need a DEM. One should have already been created in the unpacking stage. If not, it will need to be created now. If the project is in the UK you can use [wiki:Processing/NextMapDEMs Nextmap data] (try running `nextmapdem.sh` in the project directory to do it automatically, otherwise see the [wiki:Processing/NextMapDEMs wiki page]). If it's non-UK, you'll need to use [wiki:Help/LeicaLidarDems LiDAR data] (you may wish to use this anyway if it's available - see [wiki:Processing/CreateTifs here] on how to do this using a script), or failing that [wiki:Processing/SRTMDEMs SRTM 90m data]. Copy it into the dem directory. Once you've done this, include the DEM file in the config file by entering "dem=<dem_file_name>" under the DEFAULT section.

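For example, with an illustrative DEM filename (use the name of the file you copied into the dem directory):

[DEFAULT][[BR]]
...[[BR]]
dem=eufar10-03_dem.dem
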
== Problems ==

----

'''Sync'''

If you get something in the log file like:

** End of POSATT file with NO Specim sync found[[BR]]
**     no sync within 5.00 secs of raw file header time: 49915.66

then there is no sync info in the nav file for that line and the range of possible sct values will increase (up to a few seconds). You will need to include 'has_sync = false' in the config file line entry and input a wider range of scts, e.g.

[hawk_-11][[BR]]
...[[BR]]
...[[BR]]
...[[BR]]
has_sync = false[[BR]]
sctend = -1[[BR]]
sctincrement = -0.1[[BR]]
sctstart = 1

[hawk_-12][[BR]]
...

A less likely reason for the above output is that the raw header file in question contains invalid GPS times (either start time or end time), which also results in the line not lying within the times of the .nav file. If both the GPS Starting Position and GPS Start Time are missing (or, correspondingly, End Position/End Time), then the best guess is to replace the invalid time/position using the raw header file from the other sensor (eagle/hawk), making sure that it is the correct corresponding flightline. If only one of the two is missing (time or position), then the navigation files should be used to find the missing data. This is easier when the position is missing (open the .gpb file from applanix/extract) but still possible when the time is missing (.gpb again, but you will have to track the position).

----

'''Turns'''

If you get something in the log file like:

** flight line may have a turn in it **[[BR]]
** heading spread over approximately:  120 degs **

** run terminated due to turn **

then there is a bend in the line that is too sharp. You need to add a -bend flag to the azgcorr arguments for the line entry in the config file, e.g.

[eagle_-7][[BR]]
...[[BR]]
...[[BR]]
...[[BR]]
azgcorr_args_extra = -bend

[eagle_-8][[BR]]
...

Once processed, it is best to exclude the lines which are causing the problems by including a -l flag in azspec followed by the line numbers delineating the part of the line you want to keep, then reprocessing. E.g. if the bend is at the start of the line:

[eagle_-7][[BR]]
...[[BR]]
...[[BR]]
...[[BR]]
azgcorr_args_extra = -bend[[BR]]
azspec_args_extra = -l 50 2000

will exclude the first 50 lines. Make sure the second value takes dark frames into account: the number of lines given in the header file includes dark frames, so if you want to process to the end of the line you should specify the line number immediately before the figure given in 'autodarkstartline' (also in the header file).

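As a made-up worked example: if the header file gives autodarkstartline = 2000, then processing from raw line 50 up to the last line before the dark frames would mean an entry such as:

azspec_args_extra = -l 50 1999
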
----

For anything else, check: [wiki:Processing/KnownProblems Common or known problems]. This is now a bit out-of-date - if your solution isn't on there, please add it once you find it.

----

== Finished? ==

Once you're satisfied with the processed data, you need to [wiki:Procedures/DeliveryCreation create a delivery directory] for it.