
antsimi / py-eddy-tracker


Eddy identification and tracking

Home Page: https://py-eddy-tracker.readthedocs.io/en/latest/

License: GNU General Public License v3.0

Languages: Python 67.06%, Jupyter Notebook 32.93%, Shell 0.01%
Topics: mesoscale-eddies, ocean, mesoscale, eddies, eddy, tracking, identification, sea-level, tourbillon, eddy-tracker

py-eddy-tracker's Introduction


README

How to cite this code?

Zenodo provides a DOI for each tagged version; all DOIs are available here.

Method

The method is described in:

Pegliasco, C., Delepoulle, A., Morrow, R., Faugère, Y., and Dibarboure, G.: META3.1exp : A new Global Mesoscale Eddy Trajectories Atlas derived from altimetry, Earth Syst. Sci. Data Discuss.

Mason, E., A. Pascual, and J. C. McWilliams, 2014: A new sea surface height–based code for oceanic mesoscale eddy tracking.

Use case

The method is used in:

Mason, E., A. Pascual, P. Gaube, S. Ruiz, J. Pelegrí, A. Delepoulle, 2017: Subregional characterization of mesoscale eddies across the Brazil-Malvinas Confluence

How do I get set up?

Short story

pip install pyeddytracker

Long story

To avoid installation problems, using a virtualenv Python virtual environment is recommended.

Then use pip to install all dependencies (numpy, scipy, matplotlib, netCDF4, ...), e.g.:

pip install numpy scipy netCDF4 matplotlib opencv-python pyyaml pint polygon3

Clone the repository:

git clone https://github.com/AntSimi/py-eddy-tracker

Then run the following to install the eddy tracker:

python setup.py install

Tools gallery

Several examples based on the py-eddy-tracker module are available here.

Quick use

EddyId share/nrt_global_allsat_phy_l4_20190223_20190226.nc 20190223 adt ugos vgos longitude latitude ./ -v INFO

for identification, followed by:

EddyTracking tracking.yaml

for tracking (edit the corresponding yaml file, then run the command).
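
A rough Python equivalent of the EddyId call above, pieced together from the eddy_identification examples quoted in the issues further down this page; the step, pixel_limit and shape_error values are illustrative, not mandatory:

from datetime import datetime

from py_eddy_tracker.dataset.grid import RegularGridDataset

g = RegularGridDataset(
    "share/nrt_global_allsat_phy_l4_20190223_20190226.nc", "longitude", "latitude"
)
a, c = g.eddy_identification(
    "adt", "ugos", "vgos",   # height and current variables, as in the EddyId call
    datetime(2019, 2, 23),   # date of identification
    0.002,                   # step between two isolines of detection (m)
    pixel_limit=(5, 2000),   # min and max pixel count for a valid contour
    shape_error=55,          # max error (%) between the fitted circle and the contour
)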

py-eddy-tracker's People

Contributors

acapet, antsimi, coripegliasco, ctroupin, evanmason, koldunovn, ludwigvonkoopa


py-eddy-tracker's Issues

When multiple dates of altimetry data are combined in a single file

Hi AntSimi,

We have a case where altimetry data are gathered in a single netCDF file, with a "time" dimension covering each particular date.
Is there a way to load a particular date within the file with RegularGridDataset?
Or should we rather split our netCDF file into separate files for each date?

Thanks !
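
A minimal sketch of one possible approach, assuming RegularGridDataset accepts the same indexs= keyword that is passed to UnRegularGridDataset in the ROMS example further down this page (the file name is hypothetical):

from py_eddy_tracker.dataset.grid import RegularGridDataset

# select the 6th record of the "time" dimension from a multi-date file
g = RegularGridDataset("combined_altimetry.nc", "longitude", "latitude",
                       indexs=dict(time=5))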

How to use observation.to_netcdf

I have trouble using the to_netcdf method because I do not know what to put for the handler. After some online research, I think I would need to create a class for the handler (am I looking in the right direction?)

For example, in #7 if I would like to save i_aviso, i_giops, and cost_mat.

    def to_netcdf(self, handler):
        eddy_size = len(self)
        logger.debug('Create Dimensions "obs" : %d', eddy_size)
        self.netcdf_create_dimensions(handler, "obs", eddy_size)
        handler.track_extra_variables = ",".join(self.track_extra_variables)
        if self.track_array_variables != 0:
            self.netcdf_create_dimensions(
                handler, "NbSample", self.track_array_variables
            )
            handler.track_array_variables = self.track_array_variables
            handler.array_variables = ",".join(self.array_variables)
        # Iter on variables to create:
        fields = [field[0] for field in self.observations.dtype.descr]
        fields_ = array(
            [VAR_DESCR[field[0]]["nc_name"] for field in self.observations.dtype.descr]
        )
        i = fields_.argsort()
        for ori_name in array(fields)[i]:
            # Patch for a transition
            name = ori_name
            #
            logger.debug("Create Variable %s", VAR_DESCR[name]["nc_name"])
            self.create_variable(
                handler,
                dict(
                    varname=VAR_DESCR[name]["nc_name"],
                    datatype=VAR_DESCR[name]["output_type"],
                    dimensions=VAR_DESCR[name]["nc_dims"],
                ),
                VAR_DESCR[name]["nc_attr"],
                self.observations[ori_name],
                scale_factor=VAR_DESCR[name].get("scale_factor", None),
                add_offset=VAR_DESCR[name].get("add_offset", None),
            )
        self.set_global_attr_netcdf(handler)
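
A sketch of how the handler is used elsewhere on this page (see the ROMS identification example below): it is an open netCDF4.Dataset in write mode, into which to_netcdf creates the dimensions and variables. The output file name and the eddies variable are placeholders:

from netCDF4 import Dataset

with Dataset("Anticyclonic_with_extras.nc", "w") as handler:
    eddies.to_netcdf(handler)   # `eddies` is your (Track)EddiesObservations object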

Contours

Hi Antoine,

I am confused about how the contours are selected in the algorithm and later stored as detected anticyclones and cyclones...
So far I have understood how the collection of contours is selected for the observation.
It follows:
collections --> (save) contour_paths --> (test) --> (1) shape_error --> (2) pixel --> (3) amplitude --> (store) obs

I have another question:
I am using OW (Okubo-Weiss) for the detection instead of SSH, and it confuses me to see different numbers of detections with different ranges of OW. For example, the detection of anticyclones and cyclones is different with range (a) -4e-09 : -4e-12 versus (b) -4e-10 : -4e-12.

Do you have any clue why the number of detections differs with the OW range?

TrackEddiesObservations and d.to_netcdf 'handler'

I used the loess filter on my radius tracks, but when I tried to save the filtered radii with d.to_netcdf I got this error:

In [5]: %run make_BlackSea_loess_radii.py
--- filtering ...
--- writing to nc ...
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
/home/users/e/m/emason/BS_code/make_BlackSea_loess_radii.py in <module>
     51     d.loess_filter(half_window=half_window, xfield='time', yfield='radius_s')
     52     print('--- writing to nc ...')
---> 53     d.to_netcdf(filename=filename)

TypeError: to_netcdf() missing 1 required positional argument: 'handler'

I am not sure what the correct argument should be.

I tried also with just d.to_netcdf(filename) but that gave a different error. The previous saving method that worked for me was d.write_netcdf(filename=filename), but that appears to have been changed to to_netcdf. What am I doing wrong?

EddyId

How can the eddy identification be done on a regional scale, either with Python code or a .yaml file?

wrong labeling in example/02 and pet_sla_and_adt.py

It's just a minor correction for consistency: when you plot c and a for _adt and/or _sla, the kwargs call is mixed up:
a_adt.display(ax, **kwargs_a_adt), c_adt.display(ax, **kwargs_a_sla)
I think it should be:
a_adt.display(ax, **kwargs_a_adt), c_adt.display(ax, **kwargs_c_adt)

and so on in the rest of the file...

Thanks for sharing really cool stuff!

Cheers
Ivica

Quiver plots of eddies' center translation speed

Hello,

I'm trying to use the PET modules to obtain a quiver plot of the translation velocity for eddies' centers.

First, I defined dx_to_next and dy_to_next functions, modified from TrackEddiesObservations.distance_to_next:

import numpy as np
from py_eddy_tracker.generic import distance  # assumed import path for the distance helper

def dx_to_next(s):
    d = distance(
        s.longitude[:-1],
        s.latitude[:-1],
        s.longitude[1:],
        s.latitude[:-1],  # keep latitude fixed for the zonal component (was latitude[-1:], likely a typo)
        )
    d[s.index_from_track[1:] - 1] = 0
    d_ = np.empty(d.shape[0] + 1, dtype=d.dtype)
    d_[:-1] = d
    d_[-1] = 0
    return d_

def dy_to_next(s):
    d = distance(
        s.longitude[:-1],
        s.latitude[:-1],
        s.longitude[:-1],
        s.latitude[1:],
        )
    d[s.index_from_track[1:] - 1] = 0
    d_ = np.empty(d.shape[0] + 1, dtype=d.dtype)
    d_[:-1] = d
    d_[-1] = 0
    return d_

I thus obtain

a = TrackEddiesObservations.load_file("../out_"+set1+"/Tracks_Overlap/Anticyclonic.nc")
dx=dx_to_next(a)
dy=dy_to_next(a)

Now I'd like to add dx and dy as additional fields, alongside the other properties such as 'amplitude', etc., in order to use grid_stat or other methods afterwards.

But it seems I can't find out how to use add_fields properly to do so.

Any help, or alternative suggestions (I'd still like to know how to use add_fields)?

Thanks
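
For the quiver itself (independent of add_fields), a minimal matplotlib sketch; note that dx and dy here are distances in the units returned by distance (metres), so the arrow lengths are only relative, not geographic displacements:

import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(12, 5))
# one arrow per observation, anchored at the eddy centre; lengths are autoscaled
ax.quiver(a.longitude, a.latitude, dx, dy, angles="xy")
plt.show()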

Time Series Plots

How can the time series analysis be done? It would be great if a sample could be provided, as neither Anticyclonic.nc nor Cyclonic.nc gives an idea about the time/date.
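
A sketch of one way to get dates out of the output files, assuming the NetCDF time variable in Anticyclonic.nc/Cyclonic.nc carries a CF-style units attribute (check with ncdump -h); the variable name "time" is an assumption based on the time field used elsewhere on this page:

from netCDF4 import Dataset, num2date

with Dataset("Anticyclonic.nc") as nc:
    t = nc.variables["time"]          # assumed variable name
    dates = num2date(t[:], t.units)   # convert numeric times to datetimes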

bug in updated Tracking.py (?)

I have updated my py-eddy-tracker to have all the new examples (thank you a lot for that, it is very useful).
But now, when I run python Tracking.py tracking.yaml, I get the following error message:

YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  config = yaml_load(stream)
Traceback (most recent call last):
  File "/fs/homeu1/eccc/mrd/rpnenv/asf000/data/ords/py-eddy-tracker/build/scripts-3.6/EddyTracking", line 176, in <module>
    CORRESPONDANCES.track()
  File "/home/ords/mrd/rpnenv/asf000/miniconda3/envs/virtenv_eddy/lib/python3.6/site-packages/pyEddyTracker-3.0.0-py3.6.egg/py_eddy_tracker/tracking.py", line 326, in track
    self.swap_dataset(self.datasets[first_dataset - 1])
  File "/home/ords/mrd/rpnenv/asf000/miniconda3/envs/virtenv_eddy/lib/python3.6/site-packages/pyEddyTracker-3.0.0-py3.6.egg/py_eddy_tracker/tracking.py", line 155, in swap_dataset
    self.current_obs = self.class_method.load_file(dataset, raw_data=raw_data)
  File "/home/ords/mrd/rpnenv/asf000/miniconda3/envs/virtenv_eddy/lib/python3.6/site-packages/pyEddyTracker-3.0.0-py3.6.egg/py_eddy_tracker/observations/observation.py", line 419, in load_file
    if filename.endswith(".zarr"):
TypeError: endswith first arg must be bytes or a tuple of bytes, not str

Here is my tracking.yaml file:

DIAGNOSTIC_TYPE: "ADT"

PATHS:
  # Files produces with EddyIdentification
  FILES_PATTERN: /home/asf000/data/ords/Eddy/CONTROLE_kch_latlon0.25_preprocessed/Anticyclonic_*.nc
  # FILES_PATTERN: /home/asf000/data/ords/Eddy/CONTROLE_kch_latlon0.25_preprocessed/Cyclonic_*.nc

  # Path for saving of outputs
  SAVE_DIR: "/home/asf000/data/ords/tracking/minlife28days/CONTROLE_kch_latlon0.25_preprocessed/"

# Minimum number of observations to store eddy
TRACK_DURATION_MIN: 28
VIRTUAL_LENGTH_MAX: 0

Collocate In-situ observations

Hello,

I have a data frame (df) for in-situ profiles with date, lon, lat, and other information.

My current objective is to add columns to this data frame, on the basis of a set of eddy tracks.

  • Is the float within an eddy contour? YES/NO
  • Lon, Lat for the eddy center,
  • Radius and Age (lifetime) of this eddy.

I'm not entirely sure how to address that.

My first approach was to loop on days, subset both the in situ data frame and the eddies collection, and then use the inside function. With the following code, I can determine if a given profile is within any eddy identified for that day.

ta1 = TrackEddiesObservations.load_file("../out_"+set1+"/Tracks_Overlap/Anticyclonic.nc")
dmin=ta1.time.min()
dmax=ta1.time.max()

for d in range(dmin,dmax):
    a1=ta1.extract_with_period((d,d))
    dfsub = df[df["time"]==d]
    for dl in dfsub.iterrows():
        isin = a1.inside(x=np.array([dl[1].longitude], dtype=np.float64), y=np.array([dl[1].latitude], dtype=np.float64))
        if isin:
                ....

However, I have difficulties getting the index of the eddy for which the condition is true, and then accessing the relevant info.

Should I loop over all eddies in `a1`?
How can this be done?

I guess I just need some general guidance, in particular regarding indexing into TrackEddiesObservations objects.

PS: I guess looping on days isn't strictly necessary here, but it was also done for visualization reasons.

Thanks !
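
A minimal sketch of one way to get the matching eddy indices, under two assumptions: that observations can be indexed by field name (as the __getitem__ shown in the KeyError traceback further down this page suggests), and that the per-eddy polygons are stored in the effective_contour_longitude / effective_contour_latitude array variables written by to_netcdf. Longitude conventions (0-360 vs -180-180) may need to be reconciled first:

import numpy as np
from matplotlib.path import Path

def eddies_containing(eddies, lon, lat):
    """Return indices of eddies whose effective contour contains the point (lon, lat)."""
    xs = eddies["effective_contour_longitude"]
    ys = eddies["effective_contour_latitude"]
    hits = []
    for i, (cx, cy) in enumerate(zip(xs, ys)):
        if Path(np.column_stack((cx, cy))).contains_point((lon, lat)):
            hits.append(i)
    return hits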

longitudes < 0 and > 360 in TrackEddiesObservations

Hi @AntSimi

I have now successfully run one year of eddy tracking and was able to get the Anticyclonic.nc and Cyclonic.nc files. I load them with:

a = TrackEddiesObservations.load_file("./Anticyclonic.nc")
c = TrackEddiesObservations.load_file("./Cyclonic.nc")

And the result of

print(a.longitude.min())
print(a.longitude.max())

is:

 -14.2
373.8

Do I understand it right that it's just a peculiarity of the algorithm that when a track crosses 0 or 360 it just continues?

Maybe somehow related to this, the plot for global data generates something like this:

fig = plt.figure(figsize=(12, 5))
ax = fig.add_axes((0.05, 0.1, 0.9, 0.9))
ax.set_aspect("equal")
ax.set_xlim(-30, 390), ax.set_ylim(-80, 80)
# a.plot(ax, ref=-10, label="Anticyclonic", color="r", lw=0.1)
c.plot(ax, ref=0, label="Cyclonic", color="b", lw=1.1)
ax.legend()
ax.grid()

[figure omitted]

If I use ref=None everything is fine:

fig = plt.figure(figsize=(12, 5))
ax = fig.add_axes((0.05, 0.1, 0.9, 0.9))
ax.set_aspect("equal")
ax.set_xlim(-30, 390), ax.set_ylim(-80, 80)
# a.plot(ax, ref=-10, label="Anticyclonic", color="r", lw=0.1)
c.plot(ax, ref=None, label="Cyclonic", color="b", lw=1.1)
ax.legend()
ax.grid()

[figure omitted]

Area tracker

Hello,
I have a few questions related to the area_tracker tracking method:
What are the different tracking criteria?
Is there a criterion on the distance between two observations of the same eddy?
Is there a criterion on the radius or the area?
If there is a criterion on the distance, how does the tracker react if there is a missing observation?

Are those criteria dependent on the time step used for the tracking?

I searched for documentation on the Area_tracker but couldn't find any.

Thank you.

Best Regards,
Louise

KeyError: 'n unknown' and how to make yearly a and c

fig = plt.figure()
ax_lifetime = fig.add_axes([0.05, 0.55, 0.4, 0.4])
ax_cum_lifetime = fig.add_axes([0.55, 0.55, 0.4, 0.4])
ax_ratio_lifetime = fig.add_axes([0.05, 0.05, 0.4, 0.4])
ax_ratio_cum_lifetime = fig.add_axes([0.55, 0.05, 0.4, 0.4])

cum_a, bins, _ = ax_cum_lifetime.hist(
    a["n"], histtype="step", bins=arange(0, 800, 1), label="Anticyclonic", color="r"
)
cum_c, bins, _ = ax_cum_lifetime.hist(
    c["n"], histtype="step", bins=arange(0, 800, 1), label="Cyclonic", color="b"
)

a = 6 observations and c = 7 observations.

KeyError                                  Traceback (most recent call last)

<ipython-input-83-47cbb47e05d0> in <module>
      6 
      7 cum_a, bins, _ = ax_cum_lifetime.hist(
----> 8     a["n"], histtype="step", bins=arange(0, 800, 1), label="Anticyclonic", color="r"
      9 )
     10 cum_c, bins, _ = ax_cum_lifetime.hist(

~\personal\python_projects\sea_level_gridded\scripts\py-eddy-tracker-master\src\py_eddy_tracker\observations\observation.py in __getitem__(self, attr)
    204         if attr in self.elements:
    205             return self.observations[attr]
--> 206         raise KeyError("%s unknown" % attr)
    207 
    208     @classmethod

KeyError: 'n unknown'

Untracked and Track_too_short

Hi Antoine,

Could you please help me in understanding the difference between:
the 'untracked' and 'track_too_short' files?

About program testing

Hi, I tried to test this program (pet-run-a-tracking.py) with my dataset, but this error occurred:
[screenshot of the error omitted]

Tracks on land and longitude interval

Hello,

I'm using your eddy tracker in the Mediterranean area over a period of 27 years, and in the North Atlantic area
over a period of 10 years.
I think I've run the detection and tracking steps correctly (or I hope so), but I have two questions:

  1. Why do I find tracks passing over land? How can I avoid it?

[figure: Tracks_carto]

[figure: tracks_longer20weeks]

  2. In the input netCDF file, longitudes are in the interval [-180°, 180°], but the detection process generates Anticyclonic and Cyclonic netCDF files with longitudes in [0°, 360°]. How can I display the contours and the tracks in the original [-180°, 180°] longitude system? I'm pretty new to Python and maybe I'm missing something easy...
    For clarity I would like to show the eddies in this pic:

[figure: Eddies_not_centered_in_zero]

in the area here:

[figure: Correct_area]

For the first figure (not centered), I'm using the following code:

from matplotlib import pyplot as plt

from py_eddy_tracker import data
from py_eddy_tracker.observations.observation import EddiesObservations
from py_eddy_tracker.dataset.grid import RegularGridDataset

# Load detection files

a = EddiesObservations.load_file("out_directory_NA/Anticyclonic_20100101.nc")
c = EddiesObservations.load_file("out_directory_NA/Cyclonic_20100101.nc")

# Plot the speed and effective (dashed) contours

fig = plt.figure(figsize=(15, 8))
ax = fig.add_axes((0.05, 0.05, 0.9, 0.9))
ax.set_aspect("equal")
ax.set_xlim(0,360)
ax.set_ylim(20, 90)

g = RegularGridDataset(
    "CMEMS_DATA_NA_2010_2020_dailyzied/20100101.nc", "lon", "lat"
)
m = g.display(ax, "adt", vmin=-0.15, vmax=0.15, cmap="RdBu_r")
a.display(ax, label="Anticyclonic contour", color= "k", lw=1.5)
c.display(ax, label="Cyclonic contour",color= "k", ls = "dashed", lw=1.5)

ax.legend(loc="upper right")

plt.show()

Thank you very much!
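
For the second question, a generic (not py-eddy-tracker-specific) way to wrap longitudes from [0°, 360°) back into [-180°, 180°) before plotting, e.g. applied to contour arrays or to the grid axis:

import numpy as np

def wrap_longitude(lon):
    """Map longitudes from [0, 360) to [-180, 180)."""
    return (np.asarray(lon) + 180.0) % 360.0 - 180.0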

How to do an identification on an irregular ROMS grid

We want to do an identification on an irregular ROMS grid, as ncdump shows:

ncdump -h gigatl6_1h_horizontal_sec02000.nc
netcdf gigatl6_1h_horizontal_section.02000 {
dimensions:
	time = UNLIMITED ; // (30 currently)
	xi_rho = 1502 ;
	eta_rho = 1002 ;
	s_w = 4 ;
	xi_u = 1501 ;
	eta_v = 1001 ;
	s_rho = 3 ;
	s_w_old = 2 ;
variables:
	float u(time, s_rho, eta_rho, xi_u) ;
		u:long_name = "u-momentum component" ;
		u:units = "meter second-1" ;
	float ocean_time(time) ;
	float v(time, s_rho, eta_v, xi_rho) ;
		v:long_name = "v-momentum component" ;
		v:units = "meter second-1" ;
	float temp(time, s_rho, eta_rho, xi_rho) ;
		temp:long_name = "potential temperature" ;
		temp:units = "Celsius" ;
	float salt(time, s_rho, eta_rho, xi_rho) ;
		salt:long_name = "salinity" ;
		salt:units = "PSU" ;
	float rho(time, s_rho, eta_rho, xi_rho) ;
		rho:long_name = "density anomaly" ;
		rho:units = "kilogram meter-3" ;
	float zeta(time, eta_rho, xi_rho) ;
		zeta:long_name = "free-surface" ;
		zeta:units = "meter" ;
	float ow(time, s_rho, eta_rho, xi_rho) ;
		ow:long_name = "unknown" ;
		ow:units = "unknown" ;
	float lon(time, eta_rho, xi_rho) ;
	float lat(time, eta_rho, xi_rho) ;
}

This grid can't be processed directly by py-eddy-tracker, for two reasons:

  • Speed units can't be understood by pint: "meter second-1" must be "meter second^-1"
  • u and v don't share the same dimensions as the coordinate variables (eta_rho and xi_rho)

To be able to process this grid, we will create a specific class which inherits from UnRegularGridDataset and explains how to build the speed norm:

from datetime import datetime
from py_eddy_tracker import start_logger
from py_eddy_tracker.dataset.grid import UnRegularGridDataset
from numpy import zeros
from netCDF4 import Dataset

class RomsDataset(UnRegularGridDataset):
    
    __slots__ = list()
    
    def init_speed_coef(self, uname, vname):
        # xi_u and eta_v must be specified because these dimensions are not used in lon/lat
        u = self.grid(uname, indexs=dict(xi_u=slice(None)))
        v = self.grid(vname, indexs=dict(eta_v=slice(None)))
        u = self.rho_2d(u.T).T
        v = self.rho_2d(v)
        self._speed_norm = (v ** 2 + u ** 2) ** .5

    @staticmethod
    def rho_2d(x):
        """Transformation to have u or v on same grid than h
        """
        M, Lp = x.shape
        new_x = zeros((M + 1, Lp))
        new_x[1:-1] = .5 * (x[:-1] + x[1:])
        new_x[0] = new_x[1]
        new_x[-1] = new_x[-2]
        return new_x


if __name__ == '__main__':
    start_logger().setLevel('DEBUG')
    grid_name = 'gigatl6_1h_horizontal_section.02000.nc'
    lon_name, lat_name = 'lon', 'lat'
    h = RomsDataset(grid_name, lon_name, lat_name, indexs=dict(time=0))
    # Must be set with time of grid
    date = datetime(2019, 2, 23)
    
    # Identification every 2 mm
    a, c = h.eddy_identification('zeta', 'u', 'v', date, 0.02, pixel_limit=(5, 2000), shape_error=55)
    
    with Dataset(date.strftime('Anticyclonic_%Y%m%d.nc'), 'w') as h:
        a.to_netcdf(h)
    with Dataset(date.strftime('Cyclonic_%Y%m%d.nc'), 'w') as h:
        c.to_netcdf(h)

Identification on the zeta grid:
[figure: zeta]
will produce this identification:
[figure: identification]

To go further:

  • A grid with masked land is better
  • For a more accurate detection, it will be important to remove large scales

Using multiple files or records

Hi,
I would like to use the eddy tracker on multiple daily time records (AVISO or a ROMS model). I tried to modify the yaml file but got stuck.
To start, I downloaded daily values from the Mercator/Copernicus subset of AVISO L4 (sla, adt and all other variables) for 1 month and would like to use the file as input for eddy identification and tracking. I can do that for a single date (as an argument to eddy_identification).
Is there a way to specify an argument for multiple dates, or do I have to use a yaml file? Do you have an example with multiple time records?

I tried with a modified example in ./share/eddy_identification.yaml but no luck. It looks for a FILE_PATERN key, which should be generated using DATASET and FILES_MODEL with DATE_REGEXP?

(I even split those daily records into separate daily files and it still doesn't work.)

Thanks in advance,
Ivica

llvmlite distribution not found

When running python3.8 setup.py install --user I get this error:

Traceback (most recent call last):
  File "/home/acad/ulg-mast/emason/.local/bin/EddyId", line 6, in <module>
    from pkg_resources import load_entry_point
  File "/softs/python/3.8.0/lib/python3.8/site-packages/pkg_resources/__init__.py", line 3251, in <module>
    def _initialize_master_working_set():
  File "/softs/python/3.8.0/lib/python3.8/site-packages/pkg_resources/__init__.py", line 3234, in _call_aside
    f(*args, **kwargs)
  File "/softs/python/3.8.0/lib/python3.8/site-packages/pkg_resources/__init__.py", line 3263, in _initialize_master_working_set
    working_set = WorkingSet._build_master()
  File "/softs/python/3.8.0/lib/python3.8/site-packages/pkg_resources/__init__.py", line 583, in _build_master
    ws.require(__requires__)
  File "/softs/python/3.8.0/lib/python3.8/site-packages/pkg_resources/__init__.py", line 900, in require
    needed = self.resolve(parse_requirements(requirements))
  File "/softs/python/3.8.0/lib/python3.8/site-packages/pkg_resources/__init__.py", line 786, in resolve
    raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'llvmlite<0.36,>=0.35.0rc3' distribution was not found and is required by numba

Looking here https://pypi.org/simple/llvmlite/ indeed 0.36 is not listed. How should we proceed?

Filtering out first time step

Hello Antoine,
Could you please suggest the best way to discard (filter out) eddies/tracks that already exist at the first time step of the analysis period?
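
Not a complete recipe, only a sketch built from methods already quoted on this page (first_obs() in the grid_count issue below, the time field); tracks stands for a loaded TrackEddiesObservations object, and how to then drop the flagged tracks is left open:

first = tracks.first_obs()          # one observation per track
t0 = tracks.time.min()              # first date of the analysis period
born_at_start = first.time == t0    # True for tracks already present at the first time step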

Applying Low-Pass Filter through bash EddyID command

Hi !

Using the --cut-wavelength and --order options allows applying a high-pass Bessel filter to the altimetry fields before identification when using the bash command EddyId.

Alternatively, through Python, low-pass filters are also available.

I'm using experimental altimetry datasets which are quite noisy at the moment, and the noise leads to a significant decrease (~30%) in the number of identified eddies. To solve this, I'd like to attempt applying a low-pass filter with a small spatial scale (~5 km?).

Is it possible to do so through the bash command EddyId, or do I have to use the Python approach?

Thanks !

beginner issue

Hi, I just started to study ocean eddies. I want to know what the two circles of the eddies represent, respectively.
[figure omitted]

Match -> ZeroDivisionError

Bug report

Bug summary

I obtained a ZeroDivisionError while applying match after the identification + tracking steps were done with the bash commands.
The code was updated (pull + install) this morning.

Code for reproduction

tcC = TrackEddiesObservations.load_file("../out_"+set1+"/Tracks/Cyclonic.nc")
taC = TrackEddiesObservations.load_file("../out_"+set1+"/Tracks/Anticyclonic.nc")

tcE = TrackEddiesObservations.load_file("../out_"+set2+"/Tracks/Cyclonic.nc")
taE = TrackEddiesObservations.load_file("../out_"+set2+"/Tracks/Anticyclonic.nc")

# Get indexs of close observation (based on overlap contour)
ia, ja, costa = taE.match(taC, cmin=0.01)
ic, jc, costc = tcE.match(tcC, cmin=0.01)

Actual outcome

---------------------------------------------------------------------------
ZeroDivisionError                         Traceback (most recent call last)
<ipython-input-80-6939f16149e0> in <module>
      6 
      7 # Get indexs of close observation (based on overlap contour)
----> 8 ia, ja, costa = taE.match(taC, cmin=0.01)
      9 ic, jc, costc = tcE.match(tcC, cmin=0.01)

~/.pyenv/versions/3.8.3/lib/python3.8/site-packages/pyEddyTracker-3.2.0+43.g64ad6c0.dirty-py3.8.egg/py_eddy_tracker/observations/observation.py in match(self, other, method, intern, cmin, **kwargs)
    914                 self[x_name], self[y_name], other[x_name], other[y_name]
    915             )
--> 916             c = vertice_overlap(
    917                 self[x_name][i],
    918                 self[y_name][i],

~/.pyenv/versions/3.8.3/lib/python3.8/site-packages/pyEddyTracker-3.2.0+43.g64ad6c0.dirty-py3.8.egg/py_eddy_tracker/poly.py in vertice_overlap(x0, y0, x1, y1, minimal_area)
    328         # we divide intersection with polygon merging result from 0 to 1
    329         else:
--> 330             cost[i] = intersection / (p0 + p1).area()
    331     return cost
    332 

ZeroDivisionError: float division by zero

Expected outcome

I report it as a bug because this script was previously running fine.
The same track files are used for other processing (e.g. maps of properties, comparisons) without problems.

`TypingError` when running `eddy_identification`

This might be a bug in numba; I'm writing it here in case others get the same error.

Bug report

Running eddy_identification stops with a TypingError

Code for reproduction

date = datetime(2015, 1, 1, 0)
a, c = h.eddy_identification(
    'XE', 'U', 'V', # Variables used for identification
    date, # Date of identification
    0.002, # step between two isolines of detection (m)
    pixel_limit=(5, 2000), # Min and max pixel count for valid contour
    shape_error=55, # Error max (%) between ratio of circle fit and contour
    )

Logging:

	DEBUG 2021-02-15 10:26:04,577 	grid.	grid :
						Load U from /home/ctroupin/out4.nc
	DEBUG 2021-02-15 10:26:04,645 	grid.	grid :
						Load V from /home/ctroupin/out4.nc
INFO 2021-02-15 10:26:04,846 grid.eddy_identification :
					We will apply on step a factor to be coherent with grid : 1.000000
INFO 2021-02-15 10:26:04,855 eddy_feature.__init__ :
					Start computing iso lines
	DEBUG 2021-02-15 10:26:04,881 	eddy_feature.	__init__ :
						X shape : (273,)
	DEBUG 2021-02-15 10:26:04,881 	eddy_feature.	__init__ :
						Y shape : (372,)
	DEBUG 2021-02-15 10:26:04,882 	eddy_feature.	__init__ :
						Z shape : (273, 372)
INFO 2021-02-15 10:26:04,885 eddy_feature.__init__ :
					Start computing iso lines with 48 levels from -0.064000 to 0.030000 ...
INFO 2021-02-15 10:26:04,992 eddy_feature.__init__ :
					Finish computing iso lines
INFO 2021-02-15 10:26:05,016 eddy_feature.__init__ :
					Repair 106 closed contours and 36 almost closed contours / 816 contours
	DEBUG 2021-02-15 10:26:05,028 	grid.	eddy_identification :
						doing collection 2, contour value -0.0600, 5 paths

Stack trace:

---------------------------------------------------------------------------
TypingError                               Traceback (most recent call last)
<ipython-input-4-e1bf53f7deeb> in <module>
      1 date = datetime(2015, 1, 1, 0)
----> 2 a, c = h.eddy_identification(
      3     'XE', 'U', 'V', # Variables used for identification
      4     date, # Date of identification
      5     0.002, # step between two isolines of detection (m)

~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/pyEddyTracker-3.1.0+46.g0b64a4d-py3.8.egg/py_eddy_tracker/dataset/grid.py in eddy_identification(self, grid_height, uname, vname, date, step, shape_error, sampling, pixel_limit, precision, force_height_unit, force_speed_unit)
    686                     # Get indices of centroid
    687                     # Give only 1D array of lon and lat not 2D data
--> 688                     i_x, i_y = self.nearest_grd_indice(centlon_e, centlat_e)
    689                     i_x = self.normalize_x_indice(i_x)
    690 

~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/pyEddyTracker-3.1.0+46.g0b64a4d-py3.8.egg/py_eddy_tracker/dataset/grid.py in nearest_grd_indice(self, x, y)
   1167 
   1168     def nearest_grd_indice(self, x, y):
-> 1169         return _nearest_grd_indice(
   1170             x, y, self.x_bounds, self.y_bounds, self.xstep, self.ystep
   1171         )

~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/numba-0.51.1-py3.8-linux-x86_64.egg/numba/core/dispatcher.py in _compile_for_args(self, *args, **kws)
    413                 e.patch_message(msg)
    414 
--> 415             error_rewrite(e, 'typing')
    416         except errors.UnsupportedError as e:
    417             # Something unsupported is present in the user code, add help info

~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/numba-0.51.1-py3.8-linux-x86_64.egg/numba/core/dispatcher.py in error_rewrite(e, issue_type)
    356                 raise e
    357             else:
--> 358                 reraise(type(e), e, None)
    359 
    360         argtypes = []

~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/numba-0.51.1-py3.8-linux-x86_64.egg/numba/core/utils.py in reraise(tp, value, tb)
     78         value = tp()
     79     if value.__traceback__ is not tb:
---> 80         raise value.with_traceback(tb)
     81     raise value
     82 

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
No implementation of function Function(<built-in function round>) found for signature:
 
 >>> round(array(float64, 1d, C))
 
There are 2 candidate implementations:
   - Of which 2 did not match due to:
   Type Restricted Function in function 'round': File: unknown: Line unknown.
     With argument(s): '(array(float64, 1d, C))':
    No match for registered cases:
     * (float32,) -> int64
     * (float64,) -> int64
     * (float32, int64) -> float32
     * (float64, int64) -> float64

During: resolving callee type: Function(<built-in function round>)
During: typing of call at /home/ctroupin/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/pyEddyTracker-3.1.0+46.g0b64a4d-py3.8.egg/py_eddy_tracker/dataset/grid.py (1842)


File "../../../../Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/pyEddyTracker-3.1.0+46.g0b64a4d-py3.8.egg/py_eddy_tracker/dataset/grid.py", line 1842:
def _nearest_grd_indice(x, y, x0, y0, xstep, ystep):
    <source elided>
    return (
        numba_types.int32(round(((x - x0[0]) % 360.0) / xstep)),

Unit error block EddyId

When EddyId runs, it raises an error when the currents are loaded:

Traceback (most recent call last):
  File "/home/adelepoulle/.conda/envs/simi/bin/EddyId", line 4, in <module>
    __import__('pkg_resources').run_script('pyEddyTracker==3.0.0', 'EddyId')
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/pkg_resources/__init__.py", line 666, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/pkg_resources/__init__.py", line 1462, in run_script
    exec(code, namespace, namespace)
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/pyEddyTracker-3.0.0-py3.7.egg/EGG-INFO/scripts/EddyId", line 45, in <module>
    shape_error=args.fir_errmax)
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/pyEddyTracker-3.0.0-py3.7.egg/py_eddy_tracker/dataset/grid.py", line 656, in eddy_identification
    in_u_units = units.parse_expression(self.units(uname))
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/registry.py", line 865, in parse_expression
    return build_eval_tree(gen).evaluate(lambda x: self._eval_token(x,
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/pint_eval.py", line 85, in evaluate
    return bin_op[op_text](left, self.right.evaluate(define_op, bin_op, un_op))
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/quantity.py", line 765, in __sub__
    return self._add_sub(other, operator.sub)
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/quantity.py", line 75, in wrapped
    result = f(self, *args, **kwargs)
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/quantity.py", line 665, in _add_sub
    raise DimensionalityError(self._units, 'dimensionless')
pint.errors.DimensionalityError: Cannot convert from 'meter * second' to 'dimensionless'

It seems to come from the units definition:

short uo(time, depth, latitude, longitude) ;                                                                                                                                                                                                                           
    uo:long_name = "Eastward velocity" ;                                                                                                                                                                                                                           
    uo:standard_name = "eastward_sea_water_velocity" ;                                                                                                                                                                                                             
    uo:units = "m s-1" ;                                                                                                                                                                                                                                           
    uo:unit_long = "Meters per second" ;                                                                                                                                                                                                                           
    uo:_FillValue = -32767s ;                                                                                                                                                                                                                                      
    uo:add_offset = 0. ;                                                                                                                                                                                                                                           
    uo:scale_factor = 0.000610370188951492 ;                                                                                                                                                                                                                       
    uo:cell_methods = "area: mean" ;                                                                                                                                                                                                                               
    uo:_ChunkSizes = 1, 7, 341, 720 ; 

pint doesn't understand these units:

 from pint import UnitRegistry
 units = UnitRegistry()
 units.parse_expression('m/s')
<Quantity(1.0, 'meter / second')>
 units.parse_expression('m s-1')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/registry.py", line 865, in parse_expression
    return build_eval_tree(gen).evaluate(lambda x: self._eval_token(x,
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/pint_eval.py", line 85, in evaluate
    return bin_op[op_text](left, self.right.evaluate(define_op, bin_op, un_op))
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/quantity.py", line 765, in __sub__
    return self._add_sub(other, operator.sub)
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/quantity.py", line 75, in wrapped
    result = f(self, *args, **kwargs)
  File "/home/adelepoulle/.conda/envs/simi/lib/python3.7/site-packages/Pint-0.9-py3.7.egg/pint/quantity.py", line 665, in _add_sub
    raise DimensionalityError(self._units, 'dimensionless')
pint.errors.DimensionalityError: Cannot convert from 'meter * second' to 'dimensionless'
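
A possible workaround (a sketch, not part of py-eddy-tracker): rewrite UDUNITS-style strings such as "m s-1" into a form pint accepts, by inserting an exponent operator before bare integer exponents, as the ROMS issue above notes ("meter second-1" must be "meter second^-1"). The regex is deliberately simple and will not handle every unit string:

import re
from pint import UnitRegistry

units = UnitRegistry()

def parse_udunits(unit_string):
    """Turn e.g. 'm s-1' into 'm s^-1' before handing it to pint."""
    return units.parse_expression(re.sub(r"([a-zA-Z])(-?\d+)", r"\1^\2", unit_string))

parse_udunits("m s-1")   # -> <Quantity(1.0, 'meter / second')>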

Radial speed profile

Hello Antoine,

Could you please explain the definition of 'uavg_profile'?

Eddy tracking

Dear Antoine,

I have a few questions related to eddy tracking:
What is the difference between the different tracking criteria: the default tracker and the area tracker?

And as we have an irregular dataset, do you think it makes much difference which of the two tracking schemes we use?

I have attached our tracking result using the default tracker; our file contains 5 days of data with a minimum track duration of 3 days.
In the attached figure, the filled contours are the contours of vorticity (vrt_2) in the background and the contour lines are the detections using py-eddy-tracker.
Could you please tell us why we see (in the attached figure) the eddy in box 1 in both the 'untracked' and 'tracked' files? The same applies to box 2 and box 3.

Thank you.

Best Regards,
Ashwita

[figure: tracking_at_1500mDepth]

UnRegularGridDataset error: 'AttributeError: 'ModulePassManager' object has no attribute 'add_loop_rotate_pass''

The file make_BlackSea_eddies.py iterates over a time series of SSH, U, V data and saves the eddy properties. When I run this script inside an IPython session it works fine, but when run at the terminal it produces the error below:

(VENV) [[email protected] BS_code]$ python make_BlackSea_eddies.py
Traceback (most recent call last):
  File "make_BlackSea_eddies.py", line 10, in <module>
    from py_eddy_tracker.dataset.grid import UnRegularGridDataset
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/py_eddy_tracker/dataset/grid.py", line 11, in <module>
    from numba import njit
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/__init__.py", line 212, in <module>
    import numba.typed
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/typed/__init__.py", line 1, in <module>
    from .typeddict import Dict
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/typed/typeddict.py", line 22, in <module>
    @njit
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/decorators.py", line 260, in njit
    return jit(*args, **kws)
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/decorators.py", line 183, in jit
    return wrapper(pyfunc)
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/decorators.py", line 212, in wrapper
    **dispatcher_args)
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/dispatcher.py", line 746, in __init__
    self.targetctx = self.targetdescr.target_context
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/registry.py", line 47, in target_context
    return self._toplevel_target_context
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/utils.py", line 277, in __get__
    res = instance.__dict__[self.name] = self.func(instance)
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/registry.py", line 31, in _toplevel_target_context
    return cpu.CPUContext(self.typing_context)
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/base.py", line 259, in __init__
    self.init()
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/compiler_lock.py", line 32, in _acquire_compile_lock
    return func(*args, **kwargs)
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/cpu.py", line 47, in init
    self._internal_codegen = codegen.JITCPUCodegen("numba.exec")
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/codegen.py", line 1086, in __init__
    self._init(self._llvm_module)
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/codegen.py", line 1108, in _init
    cost="cheap")
  File "/home/ulg/mast/emason/VENV/lib/python3.7/site-packages/numba-0.52.0-py3.7-linux-x86_64.egg/numba/core/codegen.py", line 1147, in _module_pass_manager
    pm.add_loop_rotate_pass()
AttributeError: 'ModulePassManager' object has no attribute 'add_loop_rotate_pass'
(VENV) [[email protected] BS_code]$

The error is somewhere inside numba. Any suggestions what causes this problem?

Dataset for tracking

Now I have the daily Anticyclonic_*.nc and Cyclonic_*.nc files; how should I make the dataset for tracking?
Or can I use a long time series of data to track eddies?

ValueError when applying filtering and eddy_identification

Hello,
I'm trying to apply the eddy tracking to numerical model outputs, but the start is not so easy.
Based on the example here https://py-eddy-tracker.readthedocs.io/en/stable/grid_identification.html#python-code, I created the grid:

h = RegularGridDataset(grid_name, lon_name, lat_name)

(with the needed substitutions).
Issue 1: Then I tried the filtering:

h.bessel_high_filter('XE', 500, order=3)

where XE is the name of the variable storing the sea_surface_height_above_sea_level. I get this error:

ValueError                                Traceback (most recent call last)
<ipython-input-28-abe544c8a678> in <module>
----> 1 h.bessel_high_filter('XE', 500, order=3)

~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/pyEddyTracker-3.1.0+46.g0b64a4d-py3.8.egg/py_eddy_tracker/dataset/grid.py in bessel_high_filter(self, grid_name, wave_length, order, lat_max, **kwargs)
   1422             dict(wave_length=wave_length, order=order),
   1423         )
-> 1424         data_out = self.convolve_filter_with_dynamic_kernel(
   1425             grid_name,
   1426             self.kernel_bessel,

~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/pyEddyTracker-3.1.0+46.g0b64a4d-py3.8.egg/py_eddy_tracker/dataset/grid.py in convolve_filter_with_dynamic_kernel(self, grid, kernel_func, lat_max, extend, **kwargs_func)
   1314 
   1315         for i, lat in enumerate(self.y_c):
-> 1316             if abs(lat) > lat_max or data[:, i].mask.all():
   1317                 data_out.mask[:, i] = True
   1318                 continue

ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

which is not so clear to me. According to the method help/docstring:

Signature: h.bessel_high_filter(grid_name, wave_length, order=1, lat_max=85, **kwargs)
Docstring: <no docstring>
File:      ~/Software/PythonEnvs/EddyMitchell/lib/python3.8/site-packages/pyEddyTracker-3.1.0+46.g0b64a4d-py3.8.egg/py_eddy_tracker/dataset/grid.py
Type:      method

so I would understand that the 1st argument is the name of the grid (grid_name), but that doesn't seem to correspond to the documentation.

Issue 2: let's say I don't need the filtering, I want to proceed directly to the identification, following the same example as before.

date = datetime(2015, 1, 1, 12, 0, 0)
a, c = h.eddy_identification(
    'XE', 'U', 'V', # Variables used for identification
    date, # Date of identification
    0.002, # step between two isolines of detection (m)
    pixel_limit=(5, 2000), # Min and max pixel count for valid contour
    shape_error=55, # Error max (%) between ratio of circle fit and contour
    )

this also returns a ValueError:

ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

So maybe my issue is how to specify which time step I want to process in the netCDF file. I thought that using

date = datetime(2015, 1, 1, 12, 0, 0)

as the argument in the call to eddy_identification would be enough. I cannot provide a full file (6 GB per file) but can provide the result of ncdump -h if that helps.

Thanks

Collocated External Data.

Hi !

I'm trying to load external data (SST) into the same data structure for collocation (e.g. to assess mean SST anomalies within contours).

I think I'm almost there; I'm just missing the interpolation step.

datest = '20160721'
SLAfilename = "../l4_"+set1+"/dt_blacksea_allsat_phy_l4_"+datest+"_"+dat1+".nc"
SSTfilename = "../SST/"+datest+"000000-GOS-L4_GHRSST-SSTfnd-OISST_HR_REP-BLK-v02.0-fv01.0.nc4" 

g = RegularGridDataset(SLAfilename, x_name= 'longitude',y_name= 'latitude') 
t = RegularGridDataset(filename=SSTfilename, x_name="lon", y_name="lat") 
t.grid('analysed_sst')

So far, so good. Then I'd like to use:

g.add_grid('sst',ti)

with

ti=t.interp('analysed_sst',g.grid(lon_name),g.grid(lat_name))

but something is wrong.

Among what I noticed:

  • As above, ti is an array, not a masked array. Could that be an issue for add_grid?
  • ti does not have the right shape: ti.shape=(120,) instead of g.grid('sla').shape=(120,56). Should I use a meshgrid of g.grid(lon_name), g.grid(lat_name) before calling interp (see the sketch after this list)?
  • Anyway, I wonder if I'm using the right tool to interpolate, or if there is another one more specific for interpolating from one regular grid to another.
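
A minimal sketch of the meshgrid idea from the second bullet, assuming t.interp expects 1-D arrays of target positions and that the regular-grid axes are available as x_c / y_c (attribute names taken from tracebacks elsewhere on this page):

import numpy as np

lon2d, lat2d = np.meshgrid(g.x_c, g.y_c, indexing="ij")
ti = t.interp("analysed_sst", lon2d.ravel(), lat2d.ravel()).reshape(lon2d.shape)
g.add_grid("sst", ti)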

Thanks for insights !!

Art

Numba/nopython error when using grid.count() function

Hi, I get this error when using the grid_count() function on a TrackEddiesObservations object. I wanted to reproduce the Birth and Death example with my dataset. It looks like every other method works fine, but grid_count() gives this error.

Unfortunately I cannot attach the input file (netcdf format is not supported).
Here is the code:

from py_eddy_tracker import start_logger
from py_eddy_tracker.dataset.grid import UnRegularGridDataset
from py_eddy_tracker.observations.tracking import TrackEddiesObservations

cyc = TrackEddiesObservations.load_file("Cyclonic.nc")
t0, t1 = cyc.period
step = 0.5
bins = ((0, 360, step), (-75, -60, step))
kwargs = dict(cmap="terrain_r", factor=100 / (t1 - t0), name="count", vmin=0, vmax=1)
g_c_first = cyc.first_obs().grid_count(bins, intern=True)

I get this error:

> ---------------------------------------------------------------------------
> TypingError                               Traceback (most recent call last)
> <ipython-input-2-6b0b02b82e27> in <module>()
>       4 kwargs = dict(cmap="terrain_r", factor=100 / (t1 - t0), name="count", vmin=0, vmax=1)
>       5 
> ----> 6 g_c_first = cyc.first_obs().grid_count(bins, intern=True)
> 
> ~/.conda/envs/my_root/lib/python3.6/site-packages/pyEddyTracker-3.3.0+18.g40ef190-py3.6.egg/py_eddy_tracker/observations/observation.py in grid_count(self, bins, intern, center, filter)
>    1971                 regular_grid.x_size,
>    1972                 regular_grid.x_c,
> -> 1973                 regular_grid.y_c,
>    1974             )
>    1975             grid.mask = grid == 0
> 
> ~/.conda/envs/my_root/lib/python3.6/site-packages/numba/dispatcher.py in _compile_for_args(self, *args, **kws)
>     399                 e.patch_message(msg)
>     400 
> --> 401             error_rewrite(e, 'typing')
>     402         except errors.UnsupportedError as e:
>     403             # Something unsupported is present in the user code, add help info
> 
> ~/.conda/envs/my_root/lib/python3.6/site-packages/numba/dispatcher.py in error_rewrite(e, issue_type)
>     342                 raise e
>     343             else:
> --> 344                 reraise(type(e), e, None)
>     345 
>     346         argtypes = []
> 
> ~/.conda/envs/my_root/lib/python3.6/site-packages/numba/six.py in reraise(tp, value, tb)
>     666             value = tp()
>     667         if value.__traceback__ is not tb:
> --> 668             raise value.with_traceback(tb)
>     669         raise value
>     670 
> 
> TypingError: Failed in nopython mode pipeline (step: nopython frontend)
> Cannot unify array(float32, 1d, A) and array(float64, 1d, C) for 'x_', defined at /usr/home/mauger/.conda/envs/my_root/lib/python3.6/site-packages/pyEddyTracker-3.3.0+18.g40ef190-py3.6.egg/py_eddy_tracker/observations/observation.py (2166)
> 
> File "../../../../../../../../usr/home/mauger/.conda/envs/my_root/lib/python3.6/site-packages/pyEddyTracker-3.3.0+18.g40ef190-py3.6.egg/py_eddy_tracker/observations/observation.py", line 2166:
> def grid_count_pixel_in(
>     <source elided>
>     for i_ in range(nb):
>         x_, y_, x_ref_ = x[i_], y[i_], x_ref[i_]
>         ^
> 
> [1] During: typing of assignment at /usr/home/mauger/.conda/envs/my_root/lib/python3.6/site-packages/pyEddyTracker-3.3.0+18.g40ef190-py3.6.egg/py_eddy_tracker/observations/observation.py (2167)
> 
> File "../../../../../../../../usr/home/mauger/.conda/envs/my_root/lib/python3.6/site-packages/pyEddyTracker-3.3.0+18.g40ef190-py3.6.egg/py_eddy_tracker/observations/observation.py", line 2167:
> def grid_count_pixel_in(
>     <source elided>
>         x_, y_, x_ref_ = x[i_], y[i_], x_ref[i_]
>         x_ = (x_ - x_ref_) % 360 + x_ref_
>         ^
> 
> This is not usually a problem with Numba itself but instead often caused by
> the use of unsupported features or an issue in resolving types.
> 
> To see Python/NumPy features supported by the latest release of Numba visit:
> http://numba.pydata.org/numba-doc/latest/reference/pysupported.html
> and
> http://numba.pydata.org/numba-doc/latest/reference/numpysupported.html
> 
> For more information about typing errors and how to debug them visit:
> http://numba.pydata.org/numba-doc/latest/user/troubleshoot.html#my-code-doesn-t-compile
> 
> If you think your code should work with Numba, please report the error message
> and traceback, along with a minimal reproducer at:
> https://github.com/numba/numba/issues/new

Thanks !

How to loop on time in a file?

Hello,

I am working on netCDF files that have different times inside the file, so the situation is slightly different from the AVISO file (one time per file). How can we specify which time index we want to process?

The SSH variable is as follows:

short XE(time, nj, ni) ;
		XE:long_name = "sea surface height" ;
		XE:standard_name = "sea_surface_height_above_sea_level" ;
		XE:units = "m" ;
		XE:coordinates = "latitude longitude" ;
		XE:scale_factor = 0.001525925f ;
		XE:add_offset = 0.f ;
		XE:valid_min = -32767s ;
		XE:valid_max = 32767s ;
		XE:_FillValue = -32768s ;

and the time:

	double time(time) ;
		time:long_name = "time in seconds (UT)" ;
		time:standard_name = "time" ;
		time:units = "seconds since 1900-01-01T00:00:00Z" ;
		time:axis = "T" ;
		time:time_origin = "01-JAN-1900 00:00:00" ;
		time:conventions = "relative number of seconds with no decimal part" ;

To perform an identification, I use:

date = datetime(2015, 1, 1, 0)
a, c = h.eddy_identification('XE', 'Uvel', 'Vvel', date, 0.002, pixel_limit=(5, 2000), shape_error=55)

Probably you already have an example but I cannot find it.
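
Not an official example, just a sketch: reuse the indexs= mechanism from the ROMS issue above to select one time index, and let netCDF4 decode the time values into dates. The grid file name and the lon/lat variable names are placeholders, and whether RegularGridDataset is the right class depends on whether latitude/longitude are 1-D or 2-D here:

from datetime import datetime

from netCDF4 import Dataset, num2date
from py_eddy_tracker.dataset.grid import RegularGridDataset

grid_file = "model_output.nc"   # placeholder

with Dataset(grid_file) as nc:
    t = nc.variables["time"]
    dates = num2date(t[:], t.units)

for i, d in enumerate(dates):
    h = RegularGridDataset(grid_file, "longitude", "latitude", indexs=dict(time=i))
    a, c = h.eddy_identification(
        "XE", "Uvel", "Vvel",
        datetime(d.year, d.month, d.day, d.hour),
        0.002, pixel_limit=(5, 2000), shape_error=55,
    )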

No matching distribution found for opencv

Hello,

I'm trying to install the eddy tracker and I'm using a virtual environment.
It seems

pip install opencv

should be instead

 pip install opencv-python

Can you check? Thanks.

eddy tracking

I just ran the command EddyTracking tracking.yaml, and an error occurred.
[screenshot of the error omitted]

How to do an identification on ORCA025 irregular grid

The file attached contains the sea level anomalies ('sla') and the geostrophic velocities ('u','v'). The coordinates lat/lon have two dimensions. Note also that the grid is tripolar.

Because of the tri-polarity of the grid, we add the following line of code:
if centlat_e > 60: continue
to the eddy_identification function (on line 582) so that it skips identifying eddies too close to the north poles.

However, we get an out-of-bounds index error message on line 80 of eddy_feature, i.e. in the init of the Amplitude class: IndexError: index 1442 is out of bounds for axis 1 with size 1442

Note that for a small domain (where fewer than 1442 eddies are identified) the code successfully identifies eddies.

[figures omitted]

GIOPS_NativeGrid_20200424.zip

EddiesObservations cannot be imported from py_eddy_tracker.observations

The Wiki needs to be updated.

The section for creating a tracking recipe presently fails as follows:

In [7]: from py_eddy_tracker.observations import EddiesObservations as Model                                                                                                                                      
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-7-77527e386807> in <module>
----> 1 from py_eddy_tracker.observations import EddiesObservations as Model

ImportError: cannot import name 'EddiesObservations' from 'py_eddy_tracker.observations' (/home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-3.0.0-py3.7.egg/py_eddy_tracker/observations/__init__.py)

This seems to work:

In [2]: from py_eddy_tracker.observations.observation import EddiesObservations as Model

but I then get the following when trying to follow the Wiki:

(VENVP3) [emason@marula global4eddytracking]$ EddyTracking my_tracking.py my_tracking.yaml
usage: Tool to use identification step to compute tracking [-h]
                                                           [-v LOGGING_LEVEL]
                                                           [--correspondance_in CORRESPONDANCE_IN]
                                                           [--correspondance_out CORRESPONDANCE_OUT]
                                                           [--save_correspondance_and_stop]
                                                           [--blank_period BLANK_PERIOD]
                                                           yaml_file
Tool to use identification step to compute tracking: error: unrecognized arguments: my_tracking.yaml
(VENVP3) [emason@marula global4eddytracking]$

Install without root access

Hi,

Is there a specific argument to provide with the command

python setup.py install

in the case where we do not have root access?

EddyID error

I run this:
EddyId -v DEBUG /home/sample/SSH_UV_surface_20140428.nc 20140428 sossheig vozocrtx vomecrty lon lat .

but get the error:

DEBUG 2020-11-04 13:12:51,135 	grid.	eddy_identification :
						doing collection 1015, contour value 0.0080, 5 paths
Traceback (most recent call last):
  File "/home/emason/VENVP3/bin/EddyId", line 11, in <module>
    load_entry_point('pyEddyTracker==0+untagged.442.g31abfe8', 'console_scripts', 'EddyId')()
  File "/home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/appli/grid.py", line 92, in eddy_id
    **kwargs
  File "/home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/appli/grid.py", line 119, in identification
    return grid.eddy_identification(h, u, v, date, **kwargs)
  File "/home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/dataset/grid.py", line 797, in eddy_identification
    pixel_min=pixel_limit[0],
  File "/home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/dataset/grid.py", line 927, in get_uavg
    max_average_speed = self.speed_coef_mean(original_contour)
  File "/home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/dataset/grid.py", line 1789, in speed_coef_mean
    nan_remove=True,
  File "/home/emason/VENVP3/lib/python3.7/site-packages/numba/core/dispatcher.py", line 415, in _compile_for_args
    error_rewrite(e, 'typing')
  File "/home/emason/VENVP3/lib/python3.7/site-packages/numba/core/dispatcher.py", line 358, in error_rewrite
    reraise(type(e), e, None)
  File "/home/emason/VENVP3/lib/python3.7/site-packages/numba/core/utils.py", line 80, in reraise
    raise value.with_traceback(tb)
numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
Failed in nopython mode pipeline (step: nopython frontend)
No implementation of function Function(<built-in function getitem>) found for signature:

 >>> getitem(bool, UniTuple(int64 x 2))

There are 22 candidate implementations:
     - Of which 22 did not match due to:
     Overload of function 'getitem': File: <numerous>: Line N/A.
       With argument(s): '(bool, UniTuple(int64 x 2))':
      No match.

During: typing of intrinsic-call at /home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/generic.py (222)

File "../../../VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/generic.py", line 222:
def interp2d_geo(x_g, y_g, z_g, m_g, x, y):
    <source elided>
        z11 = z_g[i1, j1]
        if m_g[i0, j0] or m_g[i0, j1] or m_g[i1, j0] or m_g[i1, j1]:
        ^

During: resolving callee type: type(CPUDispatcher(<function interp2d_geo at 0x7f2757404b90>))
During: typing of call at /home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/dataset/grid.py (125)

During: resolving callee type: type(CPUDispatcher(<function interp2d_geo at 0x7f2757404b90>))
During: typing of call at /home/emason/VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/dataset/grid.py (125)


File "../../../VENVP3/lib/python3.7/site-packages/pyEddyTracker-0+untagged.442.g31abfe8-py3.7.egg/py_eddy_tracker/dataset/grid.py", line 125:
def mean_on_regular_contour(
    <source elided>
    x_new, y_new = uniform_resample(x_val, y_val, num_fac, fixed_size)
    values = interp2d_geo(x_g, y_g, z_g, m_g, x_new[1:], y_new[1:])
    ^

Any ideas?
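
The failing signature getitem(bool, UniTuple(int64 x 2)) suggests that the mask m_g reaching interp2d_geo is a scalar bool rather than a 2-D boolean array; one plausible (unconfirmed) cause is an input variable without a _FillValue, so that its mask collapses to a scalar. A quick check with netCDF4/numpy, using the file and variable names copied from the command above, could look like:

import numpy as np
from netCDF4 import Dataset

# File and variable names taken from the EddyId command above.
with Dataset("/home/sample/SSH_UV_surface_20140428.nc") as nc:
    ssh = nc.variables["sossheig"][:]

print(type(ssh))                      # plain ndarray if no _FillValue/missing_value is set
print(type(np.ma.getmask(ssh)))       # numpy.bool_ here would match the scalar-bool signature in the error
print(np.ma.getmaskarray(ssh).shape)  # getmaskarray always expands to a full boolean array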

Issue with tracking

I have some issues with tracking the eddies using the command line EddyTracking tracking.yaml -v DEBUG

                                                Create Variable effective_contour_latitude
        DEBUG 2020-09-30 15:54:45,315   observation.    to_netcdf :
                                                Create Variable effective_contour_longitude
        DEBUG 2020-09-30 15:54:45,323   observation.    to_netcdf :
                                                Create Variable effective_contour_shape_error
        DEBUG 2020-09-30 15:54:45,325   observation.    to_netcdf :
                                                Create Variable effective_radius
        DEBUG 2020-09-30 15:54:45,327   observation.    to_netcdf :
                                                Create Variable inner_contour_height
        DEBUG 2020-09-30 15:54:45,329   observation.    to_netcdf :
                                                Create Variable latitude
        DEBUG 2020-09-30 15:54:45,330   observation.    to_netcdf :
                                                Create Variable latitude_max
        DEBUG 2020-09-30 15:54:45,332   observation.    to_netcdf :
                                                Create Variable longitude
        DEBUG 2020-09-30 15:54:45,333   observation.    to_netcdf :
                                                Create Variable longitude_max
        DEBUG 2020-09-30 15:54:45,335   observation.    to_netcdf :
                                                Create Variable num_contours
        DEBUG 2020-09-30 15:54:45,336   observation.    to_netcdf :
                                                Create Variable num_point_e
        DEBUG 2020-09-30 15:54:45,338   observation.    to_netcdf :
                                                Create Variable num_point_s
        DEBUG 2020-09-30 15:54:45,339   observation.    to_netcdf :
                                                Create Variable speed_average
        DEBUG 2020-09-30 15:54:45,341   observation.    to_netcdf :
                                                Create Variable speed_contour_height
        DEBUG 2020-09-30 15:54:45,343   observation.    to_netcdf :
                                                Create Variable speed_contour_latitude
        DEBUG 2020-09-30 15:54:45,350   observation.    to_netcdf :
                                                Create Variable speed_contour_longitude
        DEBUG 2020-09-30 15:54:45,358   observation.    to_netcdf :
                                                Create Variable speed_contour_shape_error
        DEBUG 2020-09-30 15:54:45,360   observation.    to_netcdf :
                                                Create Variable speed_radius
        DEBUG 2020-09-30 15:54:45,362   observation.    to_netcdf :
                                                Create Variable time
        DEBUG 2020-09-30 15:54:45,364   observation.    to_netcdf :
                                                Create Variable uavg_profile
INFO 2020-09-30 15:54:45,374 tracking.prepare_merging :
                                        9657 tracks identified
INFO 2020-09-30 15:54:45,374 tracking.prepare_merging :
                                        102672 observations will be join
        DEBUG 2020-09-30 15:54:45,374   tracking.       _copy :
                                                Copy done
Traceback (most recent call last):
  File "/usr/bin/EddyTracking", line 4, in <module>
    __import__('pkg_resources').run_script('pyEddyTracker==3.1.0+36.g2e73304', 'EddyTracking')
  File "/usr/lib/python3.8/site-packages/pkg_resources/__init__.py", line 665, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/lib/python3.8/site-packages/pkg_resources/__init__.py", line 1463, in run_script
    exec(code, namespace, namespace)
  File "/usr/lib/python3.8/site-packages/pyEddyTracker-3.1.0+36.g2e73304-py3.8.egg/EGG-INFO/scripts/EddyTracking", line 237, in <module>
    SHORT_CORRESPONDANCES.shorter_than(size_max=NB_OBS_MIN)
  File "/usr/lib/python3.8/site-packages/pyEddyTracker-3.1.0+36.g2e73304-py3.8.egg/py_eddy_tracker/tracking.py", line 510, in shorter_than
    translate = empty(i_keep_track.max() + 1, dtype='u4')
  File "/usr/lib/python3.8/site-packages/numpy/core/_methods.py", line 39, in _amax
    return umr_maximum(a, axis, None, out, keepdims, initial, where)
ValueError: zero-size array to reduction operation maximum which has no identity

This is how the script crashes. Can you please assist me in solving this issue?

Using EddyTracking to compare observations and model outputs

I would like to compare eddies from observations and models by using EddyTracking, since it gives useful information such as the cost_values for the paired eddies.

However, I noticed that, in the tracking.yaml file, TRACK_DURATION_MIN must be at least 3 in order to obtain the Anticyclonic.nc (Cyclonic.nc) and Anticyclonic_track_too_short.nc (Cyclonic_track_too_short.nc) files. If it is less than 3, I obtain only the correspondances and the untracked files. But I would like to compare only 2 files at a time.

Do you think it would be feasible to compare only two files and get the cost_values of the paired eddies, or should I develop my own tracking for that?

I don't know whether it is related, but it seems you had a similar issue: #3
Also, as mentioned in that issue, in the wiki (https://github.com/AntSimi/py-eddy-tracker/wiki#how-do-i-create-my-own-tracking-recipe), from py_eddy_tracker.observations import EddiesObservations as Model should be from py_eddy_tracker.observations.observation import EddiesObservations as Model
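
For reference, a minimal tracking.yaml of the kind discussed here could look as follows; TRACK_DURATION_MIN is the field mentioned above, while the remaining keys and paths are assumed to follow the sample configuration shipped with the package (compare with share/tracking.yaml in your installation):

PATHS:
  # identification files produced by EddyId (adjust pattern and paths to your setup)
  FILES_PATTERN: MY_IDENTIFICATION_PATH/Anticyclonic_*.nc
  SAVE_DIR: MY_OUTPUT_PATH

# number of consecutive timesteps without detection allowed (virtual observations)
VIRTUAL_LENGTH_MAX: 0
# minimal number of timesteps for a trajectory to be written to Anticyclonic.nc / Cyclonic.nc
TRACK_DURATION_MIN: 3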

py eddy tracker multifile analysis

Dear Sir

I wish to use py-eddy-tracker for the Arabian Sea. I would like to know how to apply it to multiple files for a time-series analysis. It would be great if you could comment on this with a sample code.
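
As one possible starting point, here is a minimal sketch that loops the identification step over several daily files with the Python API. The file pattern, variable names (adt, ugos, vgos, longitude, latitude) and the smoothing wavelength are placeholders to adapt to the Arabian Sea dataset; the calls are meant to follow the package's documented Python API, so check the names against your installed version.

from datetime import datetime
from glob import glob
from netCDF4 import Dataset
from py_eddy_tracker.dataset.grid import RegularGridDataset

for path in sorted(glob("arabian_sea/dt_global_allsat_phy_l4_????????.nc")):
    date = datetime.strptime(path[-11:-3], "%Y%m%d")  # date parsed from the file name
    g = RegularGridDataset(path, "longitude", "latitude")
    g.bessel_high_filter("adt", 500)  # remove the large-scale signal before identification
    a, c = g.eddy_identification("adt", "ugos", "vgos", date, 0.002)
    with Dataset(date.strftime("Anticyclonic_%Y%m%d.nc"), "w") as h:
        a.to_netcdf(h)
    with Dataset(date.strftime("Cyclonic_%Y%m%d.nc"), "w") as h:
        c.to_netcdf(h)

The resulting per-day identification files can then be passed to the tracking step (EddyTracking) through its yaml configuration.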

zero-size array in i_keep_track

Hi @AntSimi

First of all many thanks for making the software available!

I have the following error message when running the tracking:

Traceback (most recent call last):
  File "/mnt/lustre01/work/ab0995/a270088/miniconda3/envs/eddy/bin/EddyTracking", line 7, in <module>
    exec(compile(f.read(), __file__, 'exec'))
  File "/mnt/lustre01/pf/a/a270088/PYTHON/EDDY/py-eddy-tracker/src/scripts/EddyTracking", line 198, in <module>
    SHORT_CORRESPONDANCES.shorter_than(size_max=NB_OBS_MIN)
  File "/mnt/lustre01/pf/a/a270088/PYTHON/EDDY/py-eddy-tracker/src/py_eddy_tracker/tracking.py", line 515, in shorter_than
    translate = empty(i_keep_track.max() + 1, dtype='u4')
  File "/mnt/lustre01/work/ab0995/a270088/miniconda3/envs/eddy/lib/python3.6/site-packages/numpy/core/_methods.py", line 30, in _amax
    return umr_maximum(a, axis, None, out, keepdims, initial, where)
ValueError: zero-size array to reduction operation maximum which has no identity

It looks like the reason is that I get an empty array as a result of:

i_keep_track = where(self.nb_obs_by_tracks < size_max)[0]

My time series is quite small, only 30 days, so this might be the reason.
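
A minimal numpy sketch of that failure mode, with made-up track lengths: when the where() selection is empty, calling .max() on the resulting zero-size array raises exactly this ValueError.

import numpy as np

nb_obs_by_tracks = np.array([5, 8, 12])           # hypothetical: no track satisfies the condition
i_keep_track = np.where(nb_obs_by_tracks < 3)[0]  # empty selection, as in shorter_than()
print(i_keep_track.size)                          # 0
i_keep_track.max()  # ValueError: zero-size array to reduction operation maximum which has no identity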
