CASI Reprocessed data

So you have to reprocess data... Let's see if we can help you out with this. Good luck!

Accessing the Data

First of all, if the data is not in our system, download the data from CEDA: https://data.ceda.ac.uk/neodc/arsf/ The password is here: https://rsg.pml.ac.uk/intranet/trac/wiki/Projects/ARSF-DAN/Passwords

You will also need to download the log files and any documentation you can find.

Create project directory and setup

Create a project directory in the appropriate location: </users/rsg/arsf/arsf_data/year/flight_data/country/project>

Choose an appropriate project name <ProjCode-year-_jjjs_Site> and create the project directories as the "arsf" user by running build_structure.py -p . Then change to the 'airborne' user and create the missing directories. If the year is not present in the folder structure, you can run build_structure.py from another year and simply edit or move the resulting directories by hand.
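As a rough sketch only (directory name taken from the example project used later on this page; it assumes -p takes the target directory, so check build_structure.py -h first):

{{{#!bash
# A sketch only; adjust year/country/project name to your flight.
cd /users/rsg/arsf/arsf_data/2005/flight_data/greece
mkdir 130_MC04_15 && cd 130_MC04_15
build_structure.py -p .    # run as the "arsf" user
# then switch to the "airborne" user and create anything missing, e.g.:
mkdir -p processing/hyperspectral/flightlines/level1b/hdf_files
}}}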

Once the structure is created, copy the hdf files to a new directory called processing/hyperspectral/flightlines/level1b/hdf_files.
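For example (the download location here is hypothetical; point it at wherever you saved the CEDA data):

{{{#!bash
# Hypothetical source directory; adjust to where the CEDA download landed
cp ~/downloads/*.hdf processing/hyperspectral/flightlines/level1b/hdf_files/
}}}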

If any project information you need is not in the log, you might be able to get it from the hdf files. First, activate the pyhdf virtual environment: source ~utils/python_venvs/pyhdf/bin/activate

Now you can get extra info from the hdf files, for example: get_extra_hdf_params.py e153081b.hdf

You might need to edit the information for some of the scripts. One example is: --base="Almeria" --navsys="D.Davies" --weather="good" --projcode="WM06_13" --operator="S. J. Rob" --pi="R. Tee" --pilot="C. Joseph"

If the project is not in the database or on the processing status page, it is better to add it (just follow the usual steps for unpacking a new project). Otherwise, many of the commands below will need that extra metadata supplied by hand or might simply fail.

Extracting navigation and setting up

Next, create the nav files by extracting the navigation information from the hdf files. The loop below runs the extraction for each line:

{{{#!bash
source ~utils/python_venvs/pyhdf/bin/activate
# LEV1_DIR is the hdf_files directory created above
LEV1_DIR=processing/hyperspectral/flightlines/level1b/hdf_files
for HDF in `find $LEV1_DIR -name "*.hdf"`
do
  extract_casi_atm_level1_data.sh -H $HDF -O
done
}}}

Move the files to the navigation post_processed directory and rename them. Note that this is the original navigation rather than truly "_post_processed" data, but the scripts will be looking for that keyword.
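A hedged sketch of the move and rename (the extractor's output names are an assumption here, so check what it actually wrote and adjust the glob; the "_post_processed" suffix is what the later scripts expect):

{{{#!bash
# A sketch only: adjust the glob to match the files the extraction produced
NAV_DIR=processing/hyperspectral/flightlines/navigation/interpolated/post_processed
mkdir -p $NAV_DIR
for NAV in processing/hyperspectral/flightlines/level1b/hdf_files/*nav*.bil
do
  BASE=`basename $NAV .bil`
  mv $NAV $NAV_DIR/${BASE}_post_processed.bil
  mv $NAV.hdr $NAV_DIR/${BASE}_post_processed.bil.hdr
done
}}}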

Once the navigation is extracted, you can automatically create a DEM by specifying the BIL navigation files directory and running: create_apl_dem.py --aster -b ./processing/hyperspectral/flightlines/navigation/interpolated/post_processed/

And you can create the specim config file by specifying all metadata information:

generate_apl_config.py --base="Sitia" --navsys="D.Davies" --weather="good" --projcode="MC04_15" --operator="S. J. Rob" --pi="R. Tee" --pilot="C. Joseph"  -y 2005 -j 130 --hdfdir ./processing/hyperspectral/flightlines/level1b/hdf_files

Processing flightlines

With the specim config file, you can easily run the processing for each line. First, edit the config file as you would for any other airborne request to add the projection, casi_pixelsize, DEM path, etc., and make sure all the information is correct. Then copy the casi fov files to the directory processing/hyperspectral/flightlines/sensor_FOV_vectors.
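As a rough illustration only (the key names and values below are guesses; edit whatever keys your generated .cfg actually contains rather than adding these verbatim):

{{{
# Hypothetical fragment of the generated .cfg; real key names may differ
projection = utm_wgs84N35
casi_pixelsize = 1
demfile = processing/hyperspectral/dem/casi_aster.dem
}}}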

Now you can process each line by running:

process_casi_atm.py processing/hyperspectral/2005131.cfg --sct 0.00 --gpt 0.00 -s casi -v processing/hyperspectral/flightlines/sensor_FOV_vectors/edited_to_use/casi_fov_vectors.bil -O -X

Remember to activate pyhdf first (see above). Note that the command needs the -O flag (original data) and -X to skip the aplxml metadata step, which will otherwise break. You will need to create the xml files manually at the end.

To create the xml files, you will need to match each flightline with its respective log, hdf, level1b and igm files and run a command for each one. A sample command:

{{{
aplxml.py --meta_type=i --sensor='casi' \
 --project_dir="/users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/" \
 --line_id=-1 \
 --config_file="/users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/2005130_from_hdf.cfg" \
 --lev1_file=/users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/flightlines/level1b/c130011b.bil \
 --raw_file=/users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/level1b/c130011b.hdf \
 --lev1_header=/data/nipigon1/scratch/arsf/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/flightlines/level1b/c130011b.bil.hdr \
 --logfile="/users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/logfiles/?_2005-130_casi_-1.o163231503411" \
 --output=/data/nipigon1/scratch/arsf/2005/flight_data/greece/130_MC04_15/processing/delivery/MC04_15-2005-130/flightlines/line_information/c130011b.xml \
 --sct=0 --maxscanangle=0 --area Sitia --piemail unknown --piname "G Ferrier" \
 --projsummary "-" --projobjective "-" --projcode="MC04_15" \
 --igm_header=/users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/flightlines/georeferencing/igm/c130011b_p_sct0.00_utm.igm.hdr
}}}
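Since every flightline needs its own call, it may be worth scripting the per-line substitutions. A rough sketch (the logfile selection below is a placeholder heuristic; the logfile and igm names do not follow a single pattern, so verify the match for each line by hand):

{{{#!bash
# A sketch only: wraps the per-line aplxml.py call above in a loop
PROJ=/users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15
SCRATCH=/data/nipigon1/scratch/arsf/2005/flight_data/greece/130_MC04_15
for LEV1 in $PROJ/processing/hyperspectral/flightlines/level1b/*.bil
do
  LINE=`basename $LEV1 .bil`
  # placeholder: pick the logfile that actually matches this line
  LOGFILE=`ls $PROJ/processing/hyperspectral/logfiles/*_casi_*.o* | head -1`
  aplxml.py --meta_type=i --sensor='casi' \
    --project_dir="$PROJ/" --line_id=-1 \
    --config_file="$PROJ/processing/hyperspectral/2005130_from_hdf.cfg" \
    --lev1_file=$LEV1 \
    --raw_file=$PROJ/processing/hyperspectral/level1b/$LINE.hdf \
    --lev1_header=$SCRATCH/processing/hyperspectral/flightlines/level1b/$LINE.bil.hdr \
    --logfile="$LOGFILE" \
    --output=$SCRATCH/processing/delivery/MC04_15-2005-130/flightlines/line_information/$LINE.xml \
    --sct=0 --maxscanangle=0 --area Sitia --piemail unknown --piname "G Ferrier" \
    --projsummary "-" --projobjective "-" --projcode="MC04_15" \
    --igm_header=$PROJ/processing/hyperspectral/flightlines/georeferencing/igm/${LINE}_p_sct0.00_utm.igm.hdr
done
}}}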

You can also create the xml file for the project information by simply running:

aplxml.py --meta_type p \
          --project_dir /users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15 \
          --output /users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/project_information/MC04_15-2005_130-project.xml \
          --config_file /users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/2005130_from_hdf.cfg \
          --lev1_dir /users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/delivery/MC04_15-2005-130/flightlines/level1b/ \
          --igm_dir /users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/flightlines/georeferencing/igm/ \
          --area Sitia --piemail unknown --piname "G Ferrier" \
          --projsummary "-" --projobjective "-" --project_code "MC04_15"

If all steps were successful and the data looks good, you are ready to create a delivery.

Delivery creation

You should create the structure first:

make_arsf_delivery.py --projectlocation <insert location and name> \
                      --deliverytype hyperspectral --steps STRUCTURE -c processing/hyperspectral/2005131_from_hdf.cfg 

If the info for this project is not found in the database, you need to specify --projectinfo 'year'=2005 'jday'=131 'projcode'=MC04 'sortie'=None (see the sketch below). If happy, then run again with --final.
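Put together, the STRUCTURE run might look like this (a sketch combining the flags above; check the output is sensible before repeating with --final appended):

{{{#!bash
make_arsf_delivery.py --projectlocation <insert location and name> \
                      --deliverytype hyperspectral --steps STRUCTURE \
                      -c processing/hyperspectral/2005131_from_hdf.cfg \
                      --projectinfo 'year'=2005 'jday'=131 'projcode'=MC04 'sortie'=None
}}}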

Many steps will fail for CASI at one point or another. To check the other steps, first run:

make_arsf_delivery.py --projectlocation <insert location and name> \
                      --deliverytype hyperspectral --notsteps STRUCTURE

Inspect the output of all steps. For anything that failed, you can move or copy the files manually and rename them accordingly, as in the sketch below.
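For instance, level1b files could be hand-copied into the delivery like this (delivery paths taken from the examples above; rename the files to match the delivery naming scheme):

{{{#!bash
# A sketch: copy level1b files into the delivery structure created by STRUCTURE
DELIV=processing/delivery/MC04_15-2005-130
cp processing/hyperspectral/flightlines/level1b/*.bil* $DELIV/flightlines/level1b/
}}}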

Once done, you can do the SCREENSHOTS step by running:

createimage.py --input /users/rsg/arsf/arsf_data/2005/flight_data/greece/130_MC04_15/processing/hyperspectral/flightlines/georeferencing/mapped --output /data/nipigon1/scratch/arsf/2005/flight_data/greece/130_MC04_15/processing/delivery/MC04_15-2005-130/screenshots/ --verbosity 0 --matchonly ^c.*.bil$ --mosaic casi_mosaic --bands 4 3 1

The project information xml should already have been created when you ran aplxml.py in the previous section. You will be able to do the remaining steps simply by running:

make_arsf_delivery.py --projectlocation <insert location and name> \
                      --deliverytype hyperspectral --steps COPYDOCS ASCIIVIEW 
                      

If everything went according to plan, the delivery should be nearly all done. You are likely to encounter new errors along the processing chain, so pay special attention to each step's error messages.

Delivery Readme file

The scripts for creating the readme file will not work for CASI. The easiest way is to copy the tex file from another reprocessed CASI delivery and edit it; projects 130 and 131 from 2005 can be used as examples. Edit the tex file and remove all references to the mask files and any other sections that do not apply. Complete the data quality section as required; you can use the reprocessed data quality report.
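Once edited, rebuild the PDF from the tex file (the filename here is hypothetical; use whatever the copied readme is actually called):

{{{#!bash
# Hypothetical readme name; substitute the real tex filename
pdflatex MC04_15-2005-130-readme.tex
}}}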
