
Comments (6)

AlexeyPechnikov commented on June 3, 2024

This is a lazy 3D cube created by your code:

sbas_disps = sbas.open_grids(dates, 'disp', func=[sbas.los_displacement_mm])
sbas_disps_total_ll = sbas.intf_ra2ll(sbas_disps.cumsum('date'))

You saved it to NetCDF, but it cannot be saved to GeoTIFF because it is a 3D dataset:

delayed = sbas_disps_total_ll.to_netcdf(filename_nc, engine=sbas.engine, compute=False)
tqdm_dask(dask.persist(delayed), desc='Saving Total Displacement as NetCDF')

It is more effective to rearrange the commands and use a loop that adds the current raster to a running total, instead of the cumsum('date') call (a computationally expensive operation), when you just need a set of 2D rasters:

import pandas as pd

disp = sbas.open_grids(dates, 'disp', func=[sbas.los_displacement_mm])

# convert the coordinate to valid dates
disp['date'] = pd.to_datetime(disp.date)

# save the NetCDF files into a separate directory
dirname = f'{WORKDIR}.displacement'
!mkdir -p {dirname}
!rm -f {dirname}/displacement*.nc

# for faster processing, add each new grid to the materialized sum of the previous ones;
# do not call .cumsum() on the lazy stack, which would process all the grids at once
disp_total = None
for idx, disp_current in enumerate(disp):
    date = disp_current.date.dt.strftime('%Y%m%d').item()
    if disp_total is None:
        disp_total = disp_current.compute()
    else:
        disp_total += disp_current.compute()
    #print(idx, date)
    # geocode the running total and save it as a per-date NetCDF file
    filename = f'{dirname}/displacement.{date}.nc'
    sbas.intf_ra2ll(disp_total).to_netcdf(filename, engine=sbas.engine)
    print('.', end='')

We can optimize the code further by adding the sbas.intf_ra2ll call directly to the sbas.open_grids call (disp_ll = sbas.open_grids(dates, 'disp', func=[sbas.los_displacement_mm, sbas.intf_ra2ll])) and so on. Actually, this is not related to PyGMTSAR itself but to big data processing in Python in general, and I share examples on my Patreon: https://www.patreon.com/pechnikov
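A minimal sketch of that rearranged loop, reusing the dates, dirname and sbas.engine variables defined above, could look like this:

# geocode once while opening the grids; only the running sum and the
# per-date save remain inside the loop
disp_ll = sbas.open_grids(dates, 'disp', func=[sbas.los_displacement_mm, sbas.intf_ra2ll])
disp_ll['date'] = pd.to_datetime(disp_ll.date)

disp_total_ll = None
for disp_current in disp_ll:
    date = disp_current.date.dt.strftime('%Y%m%d').item()
    if disp_total_ll is None:
        disp_total_ll = disp_current.compute()
    else:
        disp_total_ll += disp_current.compute()
    disp_total_ll.to_netcdf(f'{dirname}/displacement.{date}.nc', engine=sbas.engine)
    print('.', end='')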

By the way, what is the reason to produce GeoTIFF files when NetCDF files can be opened in almost any GIS software, such as GDAL, QGIS, etc.?
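For instance, a saved 2D geocoded grid can be converted to GeoTIFF after the fact with GDAL's Python bindings; this is only a sketch, and the file name and the NetCDF variable name 'z' below are placeholders:

from osgeo import gdal

# open one of the per-date NetCDF grids as a GDAL subdataset and translate it to GeoTIFF;
# 'displacement.20240101.nc' and the variable name 'z' are assumptions
src = gdal.Open('NETCDF:"displacement.20240101.nc":z')
gdal.Translate('displacement.20240101.tif', src, format='GTiff')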


benwright151 commented on June 3, 2024

Thank you for your detailed suggestion; that definitely produces a result where none existed before. However, I'm a little concerned that the different methods used to output the NetCDF/GeoTIFF are somehow causing an offset. I carried out an export using the lazy grid methodology on a smaller batch of data and got the following result:
[image: before_shapefile]

However, when using the new methodology, I get a completely different output:
[image: after_shapefile]

I thought that it might be the coherence stack mask somehow causing less data to be available with a longer time series; however, on closer inspection it looks as if the data has "shifted" northwards. See the example below:
[images: before, after, both]

As you can see, there is an exceedingly similar-looking displacement output at slightly different locations. To answer your other question as to why I am attempting to convert the NetCDF to GeoTIFF within the code instead of after the fact: I believe there was a bug in the version of xarray outside of the Docker environment, and I had problems utilizing GDAL or other geospatial packages to resolve the problem at the time. The conversion of a 3D data stack to GeoTIFF seemed to work perfectly with lower-resolution imagery and seems to still work when utilizing a high-resolution 2D array. Is there any particular reason why I should not be doing it this way? Do you have any potential suggestions as to why I am experiencing a shift? Many thanks in advance!
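A rough sketch of the per-date export I mean (assuming rioxarray and a geocoded stack with 'lat'/'lon' dimensions; both the variable name and the dimension names are assumptions):

import rioxarray  # registers the .rio accessor on xarray objects

# export each date of the geocoded 3D stack as a separate GeoTIFF;
# 'sbas_disps_total_ll' and its lat/lon dimension names are placeholders
for date in sbas_disps_total_ll.date.values:
    da2d = sbas_disps_total_ll.sel(date=date).compute()
    da2d = da2d.rio.set_spatial_dims(x_dim='lon', y_dim='lat')
    da2d = da2d.rio.write_crs('EPSG:4326')
    datestr = str(date)[:10].replace('-', '')
    da2d.rio.to_raster(f'displacement.{datestr}.tif')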


AlexeyPechnikov commented on June 3, 2024

You might try the example notebook https://colab.research.google.com/drive/1xhVedrIvNS66jGKgS30Dgqy0S31uJ8gm?usp=sharing

and export disp_total.nc. As it uses the OSM mask, the results should fit well with the OSM map in QGIS:

[image: exported result over the OSM map in QGIS]


benwright151 commented on June 3, 2024

I had a look at the example notebook you mentioned above and couldn't see any differences in the way you're geocoding, so I attempted to replicate my theory on the exact same small batch of data (3 full SLC images). Both output methodologies produced the exact same data grid, which fits nicely with the OpenStreetMap layer available in QGIS.

What I am therefore confused about is how changing the total number of dates processed, and the master image of the InSAR dataset, can cause a geographic shift in my output result. To save on space I removed 6 months of data, so I would have ended up with a different master image (at the moment I am not specifically setting one).

Is this something you have experienced before, or do you think it may well be a mistake on my part? I know that the critical baseline is an important concept to understand in InSAR theory, but I didn't believe it could lead to an actual shift in the resulting displacement raster.

Many thanks in advance for your response!


AlexeyPechnikov commented on June 3, 2024

Perhaps you are using the same geocode matrix for the different master images? For a complete reprocessing, I cannot imagine how the shift would be possible.


benwright151 commented on June 3, 2024

No, everything is calculated independently for each stack of imagery. I'll run some further tests this week and let you know if I can replicate the problem.

