Custom Query (418 matches)

Results (46 - 48 of 418)

Ticket #130 (fixed): Support: 22/Apr/2008, Ricardo Díaz-Delgado, EUFAR07/01 (owner: mggr, reporter: mggr)
Description

Ricardo contacted us yesterday with:

  Recently you offered your help with any general Eagle/Hawk/CASI/ATM processing issues, and we would like to consult you about our Eagle/Hawk data processed with azgcorr. As you may be aware, Doñana (SW Spain) was overflown in March last year and we have been using azgcorr to produce Level 3 geocorrected data, and have been able to apply atmospheric corrections with the great help of Andrew Wilson from CEH. However, managing the huge files of the geocorrected Level 3 data at full spatial and radiometric resolution is still a pain because of file size, especially in cases in which we have lots of ground-truth positions in different flight-lines (our main EUFARNet aim and goal is to map 2 alien plant species).

   That is why we are also trying to obtain IGM files for ENVI (X and Y coordinate files that help in overlaying ground-truth areas; more info at http://www.ittvis.com/services/techtip.asp?ttid=3816), and this is why we are testing the possibility of doing it with PARGE, since AZGCORR does not have this option (although the calculations must be embedded somewhere in the code, during the geocorrection process, as Andrew Wilson indicated). This might simplify the process of overlaying ground-truth on flight-lines before applying any geocorrection to the final product (mostly classification maps). Would it be possible to incorporate such an option into AZGCORR processing?

As this is not the only project we are working on, and we are not experienced hyperspectral data users, it is taking us longer than we would have liked to analyse the data.

The IGM files appear to be ENVI's method for specifying per-pixel coordinates. This is the same issue as #109.

Replying to Ricardo with the info we have on this and the current tentative plan for implementing it, pending availability of development time. Raising the importance of that issue too.
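
An ENVI IGM is just a two-band image giving the map X and Y coordinate of every pixel of the unregistered (level 1) image. A minimal sketch of writing one with numpy is below; the filenames, array contents and band order (X first, then Y) are illustrative assumptions, not the output of any ARSF tool, and the per-pixel coordinates themselves would still have to come from the geocorrection code.

    import numpy as np

    # Hypothetical per-pixel map coordinates for a level 1 image of shape (lines, samples).
    lines, samples = 2000, 1024
    xcoords = np.zeros((lines, samples), dtype=np.float64)   # easting/longitude per pixel
    ycoords = np.zeros((lines, samples), dtype=np.float64)   # northing/latitude per pixel

    # ENVI-style IGM: band 1 = X, band 2 = Y, BSQ interleave, 64-bit floats (data type 5).
    np.concatenate([xcoords, ycoords]).tofile('l1b_igm.bsq')
    with open('l1b_igm.bsq.hdr', 'w') as hdr:
        hdr.write('ENVI\n')
        hdr.write('samples = %d\nlines = %d\nbands = 2\n' % (samples, lines))
        hdr.write('data type = 5\ninterleave = bsq\nbyte order = 0\n')
        hdr.write('header offset = 0\nfile type = ENVI Standard\n')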

Ticket #42 (fixed): Support: 20/July/2007, Chris Hecker, WM2006/06 (owner: mggr, reporter: mggr)
Description

Contacted by Chris Hecker (who also previously asked about azgcorr versions in #38). Full email below, but the key points are:

  1. I want to use the ARSF/NERC LIDAR data that was recorded simultaneously with the hyperspectral data to improve geocorrection results. The data I want to correct are Eagle/Hawk data strips. The LIDAR data comes as point clouds in files like "Str_287.all". Can these point clouds be read directly into azgcorr (as what is referred to as an "external file"), or do I need to use other software (e.g. SCOP++) to make regularly gridded altitude data first?
  2. For the LIDAR data, I presume that I won't need any geoid-ellipsoid correction parameter, since the aircraft navigation data as well as the airborne LIDAR are both based on DGPS heights above the WGS84 ellipsoid. Is that correct?
  3. I saw that there are a lot of special CASI options mentioned in the manual. Are there any for the SPECIM sensors in this version already, and where would I find a list of these options?
  4. Now here comes the tricky one:
    In the help file there is the -E option, which allows preprocessing of the LIDAR point cloud into new coordinates. Since all our basemaps are in European Datum 1950, I wanted to pre-process the LIDAR cloud into this datum as well.
    Currently the data is in the standard format (UTM zone 30, WGS84), with the x coordinate including the UTM zone as a prefix.
    The output datum is again UTM zone 30, with the International ellipsoid and the following datum shift parameters: dx=-87, dy=-98, dz=-121. No rotation parameters are known to me.
    
    So I tried the following AZGcorr statement:
    azgcorr -ELpt f1 x -d7 0 -87 -98 -121 0 0 0 0 -mUTMZ 30 -e Str_287.all
    
    Which to my knowledge should have given me a properly referenced data cloud for my UTM / ED50 system. I can see from the coordinates that are displayed during the processing that it is not going OK. The conversion of the x coordinate to the longitude value is already going wrong. I also noticed that the central meridian (in this case -0.0396, but changing each time azgcorr is run) doesn't make any sense. Am I doing something wrong or is this a bug? I am running azgcorr v. 4.8.3 on a 486pc-linux system.
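
Regarding point 4: the same three-parameter shift can be checked independently of azgcorr. A minimal sketch using pyproj (an external tool used here purely for illustration, not part of the ARSF deliverables), applied to the first point of the screen dump later in this ticket:

    from pyproj import Transformer

    wgs84_utm30 = "+proj=utm +zone=30 +datum=WGS84 +units=m"
    # ED50-style target: UTM zone 30 on the International ellipsoid, with the
    # dx, dy, dz quoted above and no rotations or scale change.
    ed50_utm30 = "+proj=utm +zone=30 +ellps=intl +towgs84=-87,-98,-121,0,0,0,0 +units=m"

    tr = Transformer.from_crs(wgs84_utm30, ed50_utm30, always_xy=True)

    # First 'old' point from the screen dump below.
    x, y, z = 593792.77, 4095708.38, 131.27
    print(tr.transform(x, y, z))
    # The easting/northing should move by the order of a hundred metres or so
    # for this area, not by the ~260 km jump seen in the azgcorr output, which
    # supports the suspicion that something is wrong with the -E run.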
    

Full email:

Hi Mike;
I have started to work my way through the various manuals and the Linux-version help file of azgcorr.
I have some questions regarding the processing. Maybe you can help me out with this. I am sure they (i.e. the first three) are pretty straightforward for you.

A) I want to use the ARSF/NERC LIDAR data that was recorded simultaneously with the hyperspectral data to improve geocorrection results. The data I want to correct are Eagle/Hawk data strips. The LIDAR data comes as point clouds in files like "Str_287.all". Can these point clouds be read directly into azgcorr (as what is referred to as an "external file"), or do I need to use other software (e.g. SCOP++) to make regularly gridded altitude data first?

B) For the LIDAR data, I presume that I won't need any geoid-ellipsoid correction parameter, since the aircraft navigation data as well as the airborne LIDAR are both based on DGPS heights above the WGS84 ellipsoid. Is that correct?

C) I saw that there are a lot of special CASI options mentioned in the manual. Are there any for the SPECIM sensors in this version already, and where would I find a list of these options?

D) Now here comes the tricky one:
In the help file there is the -E option, which allows preprocessing of the LIDAR point cloud into new coordinates. Since all our basemaps are in European Datum 1950, I wanted to pre-process the LIDAR cloud into this datum as well.
Currently the data is in the standard format (UTM zone 30, WGS84), with the x coordinate including the UTM zone as a prefix.
The output datum is again UTM zone 30, with the International ellipsoid and the following datum shift parameters: dx=-87, dy=-98, dz=-121. No rotation parameters are known to me.

So I tried the following AZGcorr statement:
azgcorr -ELpt f1 x -d7 0 -87 -98 -121 0 0 0 0 -mUTMZ 30 -e Str_287.all

Which to my knowledge should have given me a properly referenced data cloud for my UTM / ED50 system. I can see from the coordinates that are displayed during the processing that it is not going OK. The conversion of the x coordinate to the longitude value is already going wrong. I also noticed that the central meridian (in this case -0.0396, but changing each time azgcorr is run) doesn't make any sense. Am I doing something wrong or is this a bug? I am running azgcorr v. 4.8.3 on a 486pc-linux system.
Below I copy a screen dump.

Thanks for the help,

Chris

His dump:

hecker@x7:~/in$ azgcorr -ELpt f1 x -d7 0 -87 -98 -121 0 0 0 0 -mUTMZ 30 -e Str_287.all
 
-----------------------------------------------------------------------
azgcorr  -- ver: 4.8.3-lin Jun 12 2007   (C) Azimuth Systems UK 1996, 2007
 

DEM pre processing...
 
Initialised transform for point cloud to WGS84 geogs...
 
Projection and datum shift details initialised....
Spheroid name: WGS84
  semi-major: 6378137.00000  semi-minor: 6356752.31425  e^2: 0.006694380 1/f: 298.257223563
Projection name: UTM
  origin  lat: 0.0000  long (central meridian) : -0.0396
  grid coords at origin  easting: 500000.00  northing: 0.00
  scale factor: 0.99960000  hemisphere: north
Datum shift name: none
 
Initialised transform for point cloud WGS84 geogs to local datum and projection...
 
Projection and datum shift details initialised....
Spheroid name: INTERN
  semi-major: 6378388.00000  semi-minor: 6356911.94613  e^2: 0.006722670 1/f: 297.000000000
Projection name: UTM
  origin  lat: 0.0000  long (central meridian) : -3.0000
  grid coords at origin  easting: 500000.00  northing: 0.00
  scale factor: 0.99960000  hemisphere: north
Datum shift name: SING
  old spheroid name: INTERN
  semi-major: 6378388.00000  semi-minor: 6356911.94613  e^2: 0.006722670 1/f: 297.000000000
  dx: -87.0000 dy: -98.0000 dz: -121.0000 metres  sc: 0.000000  ppm
  rx: 0.0000 ry: 0.0000 rz: 0.0000 secs
 

old x: 593792.77 y: 4095708.38  z: 131.270 lat: 37.0028547 lng: 1.0145276
new x: 857185.96 y: 4102748.87 z: -12.407
old x: 593792.77 y: 4095708.39  z: 131.310 lat: 37.0028548 lng: 1.0145276
new x: 857185.96 y: 4102748.88 z: -12.367
old x: 593792.79 y: 4095707.83  z: 131.270 lat: 37.0028497 lng: 1.0145278
new x: 857186.00 y: 4102748.32 z: -12.407
old x: 593792.79 y: 4095707.84  z: 131.290 lat: 37.0028498 lng: 1.0145278
new x: 857186.00 y: 4102748.33 z: -12.387
old x: 593792.82 y: 4095707.10  z: 131.300 lat: 37.0028432 lng: 1.0145280
new x: 857186.05 y: 4102747.59 z: -12.377
old x: 593792.82 y: 4095707.12  z: 131.340 lat: 37.0028433 lng: 1.0145280
new x: 857186.05 y: 4102747.61 z: -12.337
old x: 593792.91 y: 4095706.78  z: 131.310 lat: 37.0028403 lng: 1.0145290
new x: 857186.15 y: 4102747.27 z: -12.367
old x: 593792.90 y: 4095706.80  z: 131.370 lat: 37.0028404 lng: 1.0145289
new x: 857186.14 y: 4102747.29 z: -12.307
 
file: Str_287.all  converted points: 3910381
 
End of DEM run
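
On Chris's first question (whether the point cloud must be gridded before azgcorr can use it): if a regular gridded DEM does turn out to be needed, SCOP++ is not the only option. A minimal sketch of gridding a point cloud with scipy (illustrative only; the reader for the "Str_287.all" format is not shown, and the input is assumed to be a plain N x 3 array of easting, northing, height):

    import numpy as np
    from scipy.interpolate import griddata

    # Hypothetical ASCII dump of the point cloud: one "easting northing height" row per point.
    pts = np.loadtxt('points.txt')

    res = 2.0                           # grid resolution in metres (assumed)
    xi = np.arange(pts[:, 0].min(), pts[:, 0].max(), res)
    yi = np.arange(pts[:, 1].min(), pts[:, 1].max(), res)
    gx, gy = np.meshgrid(xi, yi)

    # Linear interpolation of the heights onto the regular grid; cells outside
    # the convex hull of the points are left as NaN.
    dem = griddata(pts[:, :2], pts[:, 2], (gx, gy), method='linear')
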
Ticket #129 (fixed): Support: 18/Apr/2008, Luke Bateson, BGS07/02 (owner: benj, reporter: benj)
Description

Detailed query from Luke Bateson regarding gappy/fuzzy Hawk images and projections. The main culprit seems to be that he's used a small pixel size (1-2 m for a flight at ~6000 ft), causing interpolation gaps. Fuzziness is largely caused (I think) by the fact that he's used bands 5, 3, 2 for Hawk (close together, all at the SW end of the spectrum, and at least one seems to have some bad pixels). Also some queries about projections and the geoid-spheroid file.
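
For context, a rough ground-pixel-size estimate shows why 1-2 m output pixels leave interpolation gaps at that altitude (the IFOV below is an assumed, illustrative value, not taken from the ticket or a sensor spec):

    altitude_m = 6000 * 0.3048      # ~6000 ft above ground, in metres
    ifov_rad = 1.0e-3               # assumed across-track IFOV of ~1 mrad

    gsd_m = altitude_m * ifov_rad   # small-angle approximation at nadir
    print("approx. native pixel size: %.1f m" % gsd_m)   # ~1.8 m

    # Asking azgcorr for 1 m output pixels when the sensor only samples the
    # ground roughly every 2 m means many output cells receive no input pixel,
    # hence the gaps; larger output pixels avoid them.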

Responded with answers, though I've said we're trying to clarify the circumstances in which a geoid-spheroid file is needed.
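
On the projection question (using -mTM with a central meridian of 9 rather than the UTMZ option; see the email below): a Transverse Mercator on WGS84 with central meridian 9 E, scale factor 0.9996 and a 500000 m false easting is exactly UTM zone 32N. A quick independent check with pyproj (used here purely for illustration; whether the particular -mTM arguments in his command set the scale factor to 0.9996 rather than 1.0 would still need checking against the azgcorr manual):

    from pyproj import Transformer

    tm_cm9 = "+proj=tmerc +lat_0=0 +lon_0=9 +k=0.9996 +x_0=500000 +y_0=0 +datum=WGS84"
    utm32  = "+proj=utm +zone=32 +datum=WGS84"

    to_tm  = Transformer.from_crs("EPSG:4326", tm_cm9, always_xy=True)
    to_utm = Transformer.from_crs("EPSG:4326", utm32, always_xy=True)

    lon, lat = 11.5, 42.5   # roughly the Latera area
    print(to_tm.transform(lon, lat))
    print(to_utm.transform(lon, lat))   # identical to the line above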

Original email below, with various attachments.

Hi,
 
I recently received data from the ARSF for an area in Italy called Latera. This was flown on day 248 of 2007.
 
I have been familiarizing myself with azgcorr using the example azgcorr commands given in the read_me.txt that was delivered with the data. The geocorrection is working, but I have a few questions:
 
Missing data in output image:

   1. If I try to output an image with 1 m pixels the result has a lot of data missing: see attachment (h2408013b_532_test_1mpix.jpg)
   2. If I do the same but with a 2 m pixel output image the result is much better, but there are still a few missing sections (h2408013b_532_2test.jpg)
   3. The command is based on the example given in the read_me file and uses the supplied SRTM DEM. I get a similar-looking result if I use a lidar DEM that I have for the area. The command used for the 1 m pixel image is:

                     azgcorr -v -mTM 3 0 0 9 1 0 500000 -dNO -eh latera.dem -es sphseplx.grd -bl 5 3 2 -1 -p 1 1 -1 h248011b.hdf -3 h248013b_532test.hdf
 

     * This command is found under the 'Third command' heading in the attached text doc. The resulting screen output can also be seen in the attached TXT document (hawk_geocorrection_v2.txt).
     * Is this missing data a function of me trying to get too small a pixel size, or is there another reason?
     * What is the maximum pixel size I can expect with the Hawk data? I.e. what would be the maximum I can specify in this process?

The 'colour' and appearance of the image

   1. I notice on the supplied screenshot jpegs (with a 4 metre pixel size) (h248013b.jpg) that the image appears to be of much better quality than the images that I have produced. Has any other processing been applied to these images before they were corrected (atmospheric correction etc.)? I have not yet applied anything else, so if so this could be a reason; if not, then I presume something has not worked correctly in my processing?

Projections

   1. You will see from the script that I am using TM with a central meridian of 9 and a datum code of 3 (WGS84). Is this a valid way to get the data into an equivalent of UTM zone 32? I tried using the UTMZ commands, but this caused a problem with the supplied SRTM DEM not enclosing the image, and the -dNO argument did not work either.
   2. The lidar DEM that I am using is in UTM zone 32; I presume that this is OK to use with the -mTM argument as shown in the above command.

Geoid-spheroid correction

   1. Does the file sphseplx.grd cover all possible geoid-spheroid corrections, or is it specific to this data?
   2. I presume that I need to use this file since I am projecting into UTM zone 32.

Additional Information
 
The attached 'myhdffilelisting' has info for an example processed file with 2 m pixels.
 
Processed on a new Linux PC with 222 GB free disk space, 4 GB RAM and 2 Xeon 2 GHz processors.
 
The Linux install information is:
Linux kwx14868.ad.bgs.ac.uk 2.6.18-53.1.13.el5PAE #1 SMP Tue Feb 12 13:33:01 EST 2008 i686 i686 i386 GNU/Linux
 
 
Apologies for such a long email.
 
Thanks
 
Luke Bateson.
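
On the geoid-spheroid question, the underlying relationship is simply that GPS/navigation heights are ellipsoidal while most DEM heights (e.g. SRTM) are orthometric, and the two differ by the local geoid-ellipsoid separation N. A minimal illustration (both values below are assumptions for illustration, not read from sphseplx.grd):

    # N is the geoid-ellipsoid separation at the point of interest; a value of
    # roughly 47 m is plausible for central Italy but is assumed here.
    N = 47.0
    H_orthometric = 350.0               # DEM height above the geoid (assumed)

    h_ellipsoidal = H_orthometric + N   # height above the WGS84 ellipsoid
    print(h_ellipsoidal)                # 397.0 m, consistent with the ellipsoidal navigation heights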