Owl Processing Guide

Details on the Owl sensor are available here, and internal notes on Owl testing are available here (NERC-ARF-DAN only, login required).

Owl Pre-Processing Checks

These are normally done as part of unpacking.

1) Check for dark frames. From the top level directory run:

grep -rnw thermal/owl/*/capture/*.hdr -e "autodarkstartline"

Data cannot be processed without this keyword, so ensure that it is listed for each flightline. If dark frames are missing, they can be added following the instructions in the Dark Frames section.
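
For reference, the keyword appears in the ENVI-style header as a line similar to the one below. The value is the frame number at which the dark frames start; the number shown here is purely illustrative.

autodarkstartline = 10000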

2) Check for nav synchronisation:

grep -rnw thermal/owl/*/capture/*.hdr -e "GPS"

If there are no GPS times in the data header, the data cannot be geocorrected. Use owl_hdr_fix.py to add times derived from the file modification time, and check that the result is reasonable.

3) Check for calibration data:

ls thermal/owl/*/capture/T1* 
ls thermal/owl/*/capture/T2*

Make sure that there is at least one T1 and one T2 file present for the flight and that they are located in the same directory as the flight data. If there are double the number of expected flightlines, then the calibration data is probably stored in separate directories from the flight data. The unpacking scripts should move the raw data into numbered flightline directories and the T1 and T2 files into directories labelled by time.

Note that you will need to generate a config file to determine which calibration data is used for each flightline during radiometric calibration.
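
If there are many flightlines, a quick way to confirm which capture directories contain calibration files is a small check like the sketch below. This is a hypothetical helper, not one of the processing scripts; it assumes it is run from the top-level project directory, and a "MISSING CALIBRATION" result is only a problem if the calibration data is not stored in separate time-labelled directories.

import glob
import os

# Report how many T1/T2 calibration files sit alongside each Owl capture directory.
for capture_dir in sorted(glob.glob("thermal/owl/*/capture")):
    t1 = glob.glob(os.path.join(capture_dir, "T1*"))
    t2 = glob.glob(os.path.join(capture_dir, "T2*"))
    status = "OK" if t1 and t2 else "MISSING CALIBRATION"
    print("%s: %d x T1, %d x T2 [%s]" % (capture_dir, len(t1), len(t2), status))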

4) Check for dropped frames in the .log files. If there are more than a few, note this on the ticket. Make sure the log file has the same layout as the 2014 288 flight before starting processing.

5) Check the files and directories are correctly named.

File Naming

Our scripts require that data be named according to the standard convention, e.g. OWL219b-14-1, where 219 is the Julian day, b the sortie, 14 the year and 1 the flightline number. The files should have been renamed during unpacking, but if this is not the case, run owl_rename.py to rename everything.
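
As a rough illustration, a name can be checked against the convention with a regex like the one below. The pattern is an assumption based on the convention described above, not taken from owl_rename.py.

import re

# OWL + 3-digit Julian day + optional sortie letter + 2-digit year + flightline number.
NAME_PATTERN = re.compile(r"^OWL\d{3}[a-z]?-\d{2}-\d+$")

for name in ["OWL219b-14-1", "OWL135-18-1", "owl219-14-1"]:
    print(name, "matches" if NAME_PATTERN.match(name) else "does NOT match")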

Dark Frames

The radiometric calibration will fail if the dark frames are saved in a separate file from the data (Ops sometimes do this). Check the capture directory for any extra files labelled as if they might be dark frames and append them to the end of the data file. Remember to add the autodarkstartline key to the header file.

Original Owl data should not be modified, nor should ARSF run scripts in the raw directories. Use the batch script stitch_all_owl.py with a config file to automatically stitch the desired files into the processing/owl/flightlines/stitched directory. The config file should be a text file listing, on each line, a flightline number and the file to be stitched to it, separated by a comma, e.g.:

1, T1_OWL274-15_0844.raw 
2, T1_OWL274-15_0844.raw 
3, T1_OWL274-15_0853.raw

The script stitchOwl.py may be used for individual lines.
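
For reference, the stitching operation itself is conceptually simple. The sketch below is a hypothetical illustration (not the stitchOwl.py implementation): it appends a dark/calibration raw file to a copy of a flightline raw file and records the original frame count as autodarkstartline in the copied header. It assumes ENVI-style headers and that both files share the same samples, bands, data type and interleave.

import re
import shutil

def read_hdr_int(hdr_path, key):
    """Return an integer value (e.g. 'lines') from an ENVI-style header."""
    with open(hdr_path) as f:
        text = f.read()
    return int(re.search(r"^%s\s*=\s*(\d+)" % key, text,
                         re.MULTILINE | re.IGNORECASE).group(1))

def stitch(flight_raw, flight_hdr, dark_raw, dark_hdr, out_raw, out_hdr):
    flight_lines = read_hdr_int(flight_hdr, "lines")
    dark_lines = read_hdr_int(dark_hdr, "lines")

    # Copy the flightline raw file, then append the dark/calibration frames.
    shutil.copyfile(flight_raw, out_raw)
    with open(out_raw, "ab") as out, open(dark_raw, "rb") as dark:
        shutil.copyfileobj(dark, out)

    # Update the header copy: new total line count, and dark frames start
    # where the original data ended.
    with open(flight_hdr) as f:
        hdr_text = f.read()
    hdr_text = re.sub(r"^lines\s*=\s*\d+",
                      "lines = %d" % (flight_lines + dark_lines),
                      hdr_text, flags=re.MULTILINE | re.IGNORECASE)
    with open(out_hdr, "w") as f:
        f.write(hdr_text + "\nautodarkstartline = %d\n" % flight_lines)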

Specim's calibration tool actually only uses dark frames to detect blinkers; it does not use them to radiometrically calibrate the data. Therefore, if dark frames are completely missing a calibration file (e.g. T1) may be used instead.

When creating level 1 files remember to use the -s flag to use the files in the processing/owl/flightlines/stitched directory rather than the original raw files.

Radiometric Calibration

Specim have provided a tool to process the Owl raw data into calibrated data. Details on using this tool are located here if required.

To batch process the data use:

batch_cal_owl_proctool.py  -p <project directory>

To process individual lines use:

cal_owl_proctool.py -p <project directory> -f <flightline>

Note: This script does not currently work with Slurm and uses the old SGE grid.

Use the -s flag if the raw files did not have dark frames and have been stitched into the processing/owl/flightlines/stitched directory.

Sometimes the instrument is installed backwards and needs "flipping" with the --fliplr flag.

Note: If you are processing data collected in 2023, please refer to the following page for information regarding flipping and the possibility of having multiple boresights in one flight: https://nerc-arf-dan.pml.ac.uk/trac/wiki/Procedures/installation_summary_2023

If the T1 and T2 files are in different directories from the raw data, the new directories should have been named with the hour and minute recorded in their .hdr files. The format is OWLjjjs-yy_Thhmm (hh: hour, mm: minute). An example of this is:

OWL135-18-1
├── capture
│   ├── DARKREF_OWL135-18-1.hdr
│   ├── DARKREF_OWL135-18-1.log
│   ├── DARKREF_OWL135-18-1.raw
│   ├── OWL135-18-1.hdr
│   ├── OWL135-18-1.log
│   ├── OWL135-18-1.nav
│   └── OWL135-18-1.raw
├── manifest.xml
├── metadata
│   ├── OWL135-18-1.xml
│   └── OWL135-18-1.xsl
└── properties.xml

OWL135-18_T1150/
├── capture
│   ├── T1_OWL135-18_T1150.hdr
│   ├── T1_OWL135-18_T1150.log
│   ├── T1_OWL135-18_T1150.raw
│   ├── T2_OWL135-18_T1150.hdr
│   ├── T2_OWL135-18_T1150.log
│   └── T2_OWL135-18_T1150.raw
├── manifest.xml
├── metadata
│   ├── OWL135-18_T1150.xml
│   └── OWL135-18_T1150.xsl
└── properties.xml

In this case a config file should be used to specify which calibration files are used to process each Owl flightline. It is formatted like this:

[owl_-1]
black_body = T0853
[owl_-2]
black_body = T0855

Currently this is generated manually, but will soon be automated based on GPS timestamps. The config file is specified using the -t flag.
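
As an illustration of how this could be automated (a hypothetical sketch only, not the real tooling), the nearest calibration set can be chosen by minimising the time difference between the flightline start time and the Thhmm time in each calibration directory name. The HHMM inputs and the directory-name parsing are assumptions for illustration.

import re

def cal_minutes(dirname):
    """Extract the Thhmm time from a calibration directory name, as minutes and label."""
    hhmm = re.search(r"_T(\d{4})$", dirname).group(1)
    return int(hhmm[:2]) * 60 + int(hhmm[2:]), "T" + hhmm

def pick_black_body(flightline_hhmm, cal_dirs):
    """Return the Thhmm label of the calibration set closest in time to the
    flightline start time (both given as HHMM strings)."""
    target = int(flightline_hhmm[:2]) * 60 + int(flightline_hhmm[2:])
    best = min(cal_dirs, key=lambda d: abs(cal_minutes(d)[0] - target))
    return cal_minutes(best)[1]

# Example: a flightline starting at 08:50 with two calibration sets available.
print(pick_black_body("0850", ["OWL274-15_T0844", "OWL274-15_T0853"]))  # -> T0853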

The config file should also be used where a flightline's own T1 and T2 files are missing; an obsolete method used the -c auto option to automatically select calibration data from an adjacent flightline.

All output files are written to the flightline subdirectory of /processing/owl/flightlines/level1b and logs are written to /processing/owl/logfiles. The output files consist of the data (*_proc.bil) and, if not specified as inputs, the calibration (*_calibration.rad) and blinker (*blinkers.dat) files, each with their own header file. During processing a symlink appears in the output flightline folder to prevent simultaneous processing of the same line. You may need to remove this if processing is aborted before it is automatically removed.

Files named like tp3f35cf3d_7182_4f17_8294_425fa4dc2953.raw are temporary files and will be removed on completion. If they remain, check the logs for errors. A "No autodark found" error means the files need to be appended with dark frames.

Blinking Pixels/Vertical Stripes

During the radiometric calibration blinking pixels are tested for and removed from the data. It is likely that the default settings will not remove all the blinking pixels and the data will have vertical stripes. To remove these, reprocess with a lower value of pixstabi, e.g.:

batch_cal_owl_proctool.py -p <project directory> -f <flightline> -m pixstabi=0.005

The pixstabi modifier defines the standard deviation of the time series allowed for a normal pixel, so reducing it flags and removes more pixels. It is not expected that all blinking pixels can be removed, as the blinking pixel routine does not yet appear to be optimal. Do not make the value too low, as this will remove too many pixels and result in thick vertical stripes where the data has been replaced. It will also remove a considerable amount of spectral information, so check that the spectra still look reasonable. Somewhere between the first two images below is desirable.

No image "pixstabi238.png" attached to Procedures/OwlProcessing
Level 1b Owl data (2014 238) calibrated with different values of pixstabi (default 0.011, 0.005, 0.003, 0.0025, 0.002)

Dropped Frames

If there are dropped frames, they should be recorded in a .log file in the capture directory. Make sure they appear in the level 1b data file as empty lines before geocorrection so the flightline is mapped correctly.
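
As an illustration of what "empty lines" means here (a hypothetical sketch, not one of the processing scripts), a zero-filled frame can be inserted at each dropped-frame position. It assumes the level 1b data has been read into a (lines, bands, samples) array and that the dropped indices are positions in the completed file.

import numpy as np

def insert_empty_frames(data, dropped_indices):
    """Insert zero-filled frames at the given positions, applied in ascending order."""
    for idx in sorted(dropped_indices):
        blank = np.zeros((1,) + data.shape[1:], dtype=data.dtype)
        data = np.concatenate([data[:idx], blank, data[idx:]], axis=0)
    return data

# Tiny demonstration: 5 lines, 3 bands, 4 samples, one frame dropped at line 2.
demo = np.arange(5 * 3 * 4, dtype=np.uint16).reshape(5, 3, 4)
print(insert_empty_frames(demo, [2]).shape)  # (6, 3, 4)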

Geocorrection

Create a DEM in exactly the same way as for the Fenix. This should be saved in the processing/hyperspectral/dem directory, not the Owl directories, so only one copy is required, e.g. from the project directory run create_apl_dem.py --srtm.

Geocorrection is conducted in the same way as for Fenix data, using APL. However, if lines need to be split an extra step is required - see the "Splitting files" section below. Generate a config file using generate_apl_config.py -o processing/owl/o2014219b.cfg (make sure the name begins with an o) to keep processing separate from the Fenix.

You may need to add the boresight values to the config file (e.g. owl_boresight = 0.12 -0.32 0.32). You may also want to delete the Fenix lines, or set them to false, if only processing Owl data. Make sure the project directory structure is complete (the same as for the Fenix). Submit to the grid using specim_slurm.py <config_file> (more info in the "Processing on the Slurm Grid" section below) and calculate the SCTs. Once you have done this you can generate the mapped data (102 bands) by setting owl_bandlist = ALL and setting the SCT value for each line.

You will need to check the contents of the config file. You will need to specify the DEM origin (SRTM or ASTER) with the dem_origin key. Also check that the dem key contains the full path to the DEM.
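
An illustrative excerpt of the relevant keys is shown below; the path and values are placeholders only, and only keys mentioned on this page are included.

dem = <project directory>/processing/hyperspectral/dem/<dem file>
dem_origin = ASTER
owl_boresight = 0.12 -0.32 0.32
owl_bandlist = ALL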

Early Owl files have corrupt nav files, so if you get a nav error in the log such as 'Requested sync time index is out of bounds in GetSyncDelay()', set use_nav = false in the config file.

If there appears to be striping across the mapped files (bright and dark lines), or sections of a flightline do not match up, the sensor has probably dropped frames. These can be checked in the .log file in the capture directory. The missing frames need to be inserted into the level 1b data file as empty lines before geocorrection to map the flightline correctly.

generate_apl_config will check the processing/owl/stitched directory for modified raw files to use before defaulting to the original raw directory, so make sure this directory is present only if you intend to use it.

Note: If you are processing data collected in 2023, see here: https://nerc-arf-dan.pml.ac.uk/trac/wiki/Procedures/installation_summary_2023 The Owl was configured with a changeable viewing angle that was changed during flights; this should be recorded clearly on the flight logs. Each angle requires a different boresight. The simplest way to batch process a whole flight with more than one Owl boresight is to add owl_boresight = 0 0 0 (with the correct pitch, roll and yaw values) in each flightline section of the processing config for lines that do not have the nadir (0 degree) viewing angle, as in the example below.
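
For illustration only: the section naming here simply mirrors the pattern shown for the calibration config above and may differ in the real APL config, and the boresight values are placeholders, assuming lines 2 and 5 are the non-nadir lines.

[owl_-2]
owl_boresight = 0.45 -0.31 0.30
[owl_-5]
owl_boresight = 0.45 -0.31 0.30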

Processing on the Slurm Grid

specim_slurm.py will generate an sbatch file (by default into the logfiles directory). This sbatch file is used to submit jobs to Slurm; it will run locally if you run it like a bash script. If more than one flightline needs processing (i.e. more than one line has process_line = True in the APL config), the sbatch file is configured to submit an array job (one job containing multiple tasks).

To interact with Slurm you need the Slurm client installed and configured. It is easier to just ssh to the host rsg-slurm-login-1.

You can submit jobs using sbatch [PATH TO SBATCH SCRIPT].

Monitor jobs using squeue --me to view all your own jobs. A specific job can be monitored with squeue -j [JOB ID].

By default, array jobs will display as a single job, with additional tasks only displayed once they are running rather than queueing; the --array flag will expand this.

Remove a job with scancel [JOB ID]

Splitting files

APL splits files during the radiometric calibration stage, so it does not actually split Owl files, as it does not run this stage for the Owl. Files can instead be split line by line using the splitOwl.py script. You can either specify how to split the lines on the command line or use the APL config file, e.g.:

splitOwl.py -f (full path)/OWL174-15-2_proc.bil -l 1 25400 25300 50700 50600 74930 or
splitOwl.py -f (full path)/OWL174-15-2_proc.bil -c processing/owl/o2015174.cfg

Then proceed with specim_slurm.py (formerly specim_qsub) as normal. If you skip splitting the lines this way, the whole line will be processed for every split, resulting in duplicate large files and a very slow network.

To modify the split produced by generate_apl_config use the section_size flag. As a guide, you should divide the Owl fps by the Fenix fps and multiply by the default (10000), e.g. 100/46 * 10000 ≈ 21740.
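
For example, for a 100 fps Owl line flown alongside a 46 fps Fenix line, the call might look like the line below. The exact flag spelling is an assumption based on the flag name given above; check generate_apl_config.py --help if in doubt.

generate_apl_config.py -o processing/owl/o2015174.cfg --section_size 21740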

Making a delivery

For more information visit the page on creating thermal deliveries

Use the following script from the project directory, specifying owl as the sensor type:

make_arsf_delivery.py --projectlocation $PWD \
                      --deliverytype owl --steps STRUCTURE

If everything looks OK, run with --final

Once the structure has been generated run the other steps using:

make_arsf_delivery.py --projectlocation $PWD \
                      --deliverytype owl --notsteps STRUCTURE

Again pass in --final if the output all looks OK.

Readme

Generate the readme as you would for the Fenix (with -r hyper). The correct entries for the Owl will be automatically filled in. Expect there to be overflows in the highest bands (near band 100) only.

Checking

Follow the Fenix delivery checking procedure. Currently there are no badpixelmethod files generated for the owl.

Also check that the detector response is stable across the entire flight by examining the calibration (.rad) files (these are also BIL files). If there are dramatic changes in the calibration curves for a particular flightline, make sure that it is not used to calibrate a different flightline. You will also need to double check that the data processed for that flightline look correctly calibrated.
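
A quick way to compare calibration files across flightlines is to look at per-band statistics. The sketch below is a hypothetical helper, assuming numpy and the spectral package are available, that the .rad files are readable as standard ENVI files, and that the file and header names shown are placeholders.

import numpy as np
from spectral.io import envi

def band_means(hdr_file, rad_file):
    """Return the mean value of each band of a calibration (.rad) file."""
    img = envi.open(hdr_file, rad_file)                 # ENVI header + data file
    data = np.asarray(img.load(), dtype=np.float64)     # (lines, samples, bands)
    return data.mean(axis=(0, 1))

# Compare the calibration curves of two flightlines (paths are placeholders).
means_a = band_means("OWL219b-14-1_calibration.rad.hdr", "OWL219b-14-1_calibration.rad")
means_b = band_means("OWL219b-14-2_calibration.rad.hdr", "OWL219b-14-2_calibration.rad")
print(np.abs(means_a - means_b) / means_a)  # large relative differences warrant a closer look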

Problems

There are likely to be problems with this newly developed processing chain. Please add problems (and solutions!) below and they will be fixed in due course.

  • The boresight needs entering into the config file / adding to the table. This should not be a problem for project processing as we will always have a boresight.
  • Bad pixels remain - we will eventually look into a masking routine and/or get a better blinker detection routine from Specim.