aburrell / ocbpy
Convert between magnetic and adaptive, polar boundary coordinates
License: BSD 3-Clause "New" or "Revised" License
Describe the bug
In the pysat Madrigal example using the EAB, 'Instrument' is misspelled.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The examples should work when copied and pasted.
Additional context
Also, the EAB example should focus on the sub-auroral region.
I used to use asserts as input checks instead of raising errors. Go through the code and replace these asserts with raised exceptions, following best practices.
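As a sketch of the pattern (the function and its checks are hypothetical, not taken from the ocbpy code), an assert-based input check can be converted to an explicit exception like this:

```python
import numpy as np


def set_hemisphere(lat, hemisphere):
    """Hypothetical input check illustrating assert vs. raised exception."""
    # Old style (discouraged): asserts are stripped when Python runs with -O,
    # so they are unreliable as input checks
    # assert hemisphere in (1, -1), "hemisphere must be +/-1"

    # Preferred style: always runs and gives callers a specific exception
    if hemisphere not in (1, -1):
        raise ValueError("hemisphere must be 1 (north) or -1 (south)")
    if np.any(np.abs(lat) > 90.0):
        raise ValueError("latitude out of range")
    return hemisphere * np.asarray(lat)
```

Unlike an assert, the ValueError survives optimized execution and can be caught specifically by the caller.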
Is your feature request related to a problem? Please describe.
Once ssj_auroral_boundaries is no longer supported, we won't need to install CDF and can use the pyproject.toml to handle the optional installations.
Describe the solution you'd like
Remove requirements.extra once ssj_auroral_boundaries is completely deprecated.
Describe alternatives you've considered
Adjust the file to move the pysat install elsewhere now.
Is your feature request related to a problem? Please describe.
I would like to add flake8 testing to the CI, to catch PEP8 issues as they occur.
Describe the solution you'd like
Something along the lines of the solution presented here: https://simpleisbetterthancomplex.com/packages/2016/08/05/flake8.html
Describe alternatives you've considered
I have considered doing nothing (current process), or finding a way of only running the flake8 tests once.
Additional context
The latest release used flake8 locally to improve PEP8 compliance, so changes to the files shouldn't be onerous.
Describe the bug
The pysat custom function uses for loops instead of pandas/xarray functionality and this makes processing some datasets (like the Madrigal VTEC) too slow to be useful.
To Reproduce
Steps to reproduce the behavior:
import datetime as dt
import numpy as np
import os
import aacgmv2
import ocbpy
import pysat
import pysatMadrigal as pymad
stime = dt.datetime(2001, 1, 21)
etime = dt.datetime(2002, 1, 1)
inst_id = 'bounds'
hemisphere = 1
username = "Your Name"
password = "your@email"
def add_mag_coords(inst, lat='gdlat', lon='glon', alt='gdalt'):
"""Add AACGMV2 magnetic coordinates.
Parameters
----------
inst : pysat.Instrument
Data object
lat : str
Geodetic latitude key (default='gdlat')
lon : str
Geographic longitude key (default='glon')
alt : str
Geodetic altitude key (default='gdalt')
"""
# Initialize the data arrays
mlat = np.full(shape=tuple(inst.data.dims[kk]
for kk in ['time', lat, lon]),
fill_value=np.nan)
mlt = np.full(shape=mlat.shape, fill_value=np.nan)
# Cycle through all times, calculating magnetic locations
for i, dtime in enumerate(inst.index):
for j, gdlat in enumerate(inst[lat].values):
if alt in inst.variables:
height = inst[alt][i, j].values
else:
height = 350.0 # Default altitude is 350 km
if not np.isnan(height).all():
mlat[i, j], mlon, r = aacgmv2.convert_latlon_arr(
gdlat, inst[lon].values, height, dtime)
mlt[i, j] = aacgmv2.convert_mlt(mlon, dtime)
# Assign the magnetic data to the input Instrument
inst.data = inst.data.assign({"mlat": (("time", lat, lon), mlat),
"mlt": (("time", lat, lon), mlt)})
print("Added magnetic coordinates")
return
def add_ocb_coords(inst, ocb, ocb_suffix='ocb', mlat_name='mlat',
mlt_name='mlt', max_sdiff=60):
"""Add OCB coordinates with a custom suffix to a pysat Instrument.
Parameters
----------
inst : pysat.Instrument
Pysat Instrument object
ocb : ocbpy.OCBoundary, ocbpy.EABoundary, or ocbpy.DualBoundary object
ocbpy Boundary object
ocb_suffix : str
Desired suffix for OCB output parameters (default='ocb')
mlat_name : str
Variable name for magnetic latitude (default='mlat')
mlt_name : str
Variable name for magnetic local time (default='mlt')
max_sdiff : int
Maximum time difference in seconds (default=60)
"""
# Add the OCB coordinates with standard labels
ocbpy.instruments.pysat_instruments.add_ocb_to_data(inst, ocb=ocb,
mlat_name=mlat_name,
mlt_name=mlt_name,
max_sdiff=max_sdiff)
# Rename the output data
if ocb_suffix != 'ocb':
ocb_vars = [var for var in inst.variables if var.find('ocb') > 0]
new_vars = {var: var.replace('ocb', ocb_suffix) for var in ocb_vars}
inst.rename(new_vars)
print("Added {:} coordinates".format(ocb_suffix))
return
tec = pysat.Instrument(inst_module=pymad.instruments.gnss_tec, tag='vtec',
user=username, password=password)
dual = ocbpy.DualBoundary(stime=stime, etime=etime, hemisphere=hemisphere)
tec.custom_attach(add_mag_coords)
tec.custom_attach(add_ocb_coords, kwargs={'ocb': dual, 'ocb_suffix': 'dual'})
dual.rec_ind = 0
tec.download(start=stime)
tec.load(date=stime) # This will take a VERY long time
Expected behavior
The function should work on this data set in a way that is useful
Additional context
The problem is the for-loops in ocbpy.instruments.pysat_instrument.add_ocb_to_data. The function is working correctly, it's just too slow to be used.
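To illustrate the kind of fix this needs, here is a sketch that replaces nested Python loops over a time/lat/lon grid with one broadcast call per time step. The conversion function is a hypothetical stand-in for the real aacgmv2/ocbpy calls:

```python
import numpy as np


def convert_grid(glat, glon, height):
    """Hypothetical stand-in for a conversion that accepts full arrays."""
    # The real fix would pass arrays to, e.g., aacgmv2.convert_latlon_arr
    # once per time step instead of once per grid point
    return glat + 0.1 * np.cos(np.radians(glon)), glon - 70.0


ntime, nlat, nlon = 4, 45, 90
glat = np.linspace(50.0, 88.0, nlat)
glon = np.linspace(-180.0, 176.0, nlon)

# Slow pattern: one scalar conversion per grid point
mlat_loop = np.full((ntime, nlat, nlon), np.nan)
for i in range(ntime):
    for j in range(nlat):
        for k in range(nlon):
            mlat_loop[i, j, k] = convert_grid(glat[j], glon[k], 350.0)[0]

# Fast pattern: broadcast the 2D grid, one call per time step
glat2d, glon2d = np.meshgrid(glat, glon, indexing='ij')
mlat_vec = np.stack([convert_grid(glat2d, glon2d, 350.0)[0]
                     for i in range(ntime)])
```

The two arrays are identical, but the broadcast version makes thousands fewer Python-level calls.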
Describe the bug
The IMAGE acceptance criteria need to be updated using Gareth's new standards.
Expected behavior
Always suggest the best data to users.
Missing code of conduct, issue template, and pull request template
Is your feature request related to a problem? Please describe.
It can be difficult to support older versions for a long time. pysat deprecates a lot of things in v3.2.0, so that would be a good updated minimum.
Describe the solution you'd like
In a future version, raise the minimum supported pysat version to 3.2.0.
Describe alternatives you've considered
Maintaining all current versions and adding more CI testing
Additional context
Unit tests have many version checks and this makes things complicated to maintain.
Describe the bug
In the ex_convert example, MLT is converted by dividing MLon by 15 deg/h. This isn't appropriate; aacgmv2.convert_mlt should be used instead. The other examples should also be examined for this mistake.
Running the scale_vector() function with the following parameters:
Vector data location:
mlt = 0.67159322
mlat = 51.5
Oval parameters:
oval_r = 18.20044441
center_r = 5.38516481
center_mlt = 1.4534273
Vector components:
east = -71.57035515
north = 28.
Scaling function:
Same as eq 3 in Chisham 2017, with Rb = 75.34
I get the vsign[1] variable at line 564 (the sign of the northward component of the vector represented in OCB coordinates) to be 0. This means that the sign has not been set in the calc_ocb_vec_sign() function. I think the reason is a bug in a logic operator, specifically at line 700:
I think the present code:
quads[1][2] and self.aacgm_naz > minus_pole
should be replaced with
quads[1][2] and self.aacgm_naz < minus_pole
I hope this information will make it possible to fix the bug.
When the Annales Geophysicae AMPERE OCB paper is accepted, update the references to it from the temporary references in place at the moment, that only point to the discussion paper. Also check and ensure that references to Steve's AMPERE R1/R2 FAC boundaries are included in all of the right places.
Describe the bug
Coveralls dependencies for Travis no longer support Python 2.7
To Reproduce
https://travis-ci.org/github/aburrell/ocbpy/jobs/723453645
Expected behavior
Success or allowed failure
Fix
Add an allowed failure to the Travis YAML as a patch on main.
Adding instruments can be a bit time consuming, even with the general data functions supplied. Adding pysat support would incorporate support for a large number of instruments with minimal coding.
Is your feature request related to a problem? Please describe.
When releasing a new version, updating zenodo can be time consuming.
Describe the solution you'd like
Add a zenodo.json file to tell Zenodo what to do
Describe alternatives you've considered
Manually updating zenodo after releases
Vector scaling is not straightforward, and should be added to the tutorial
The following routines are not completely covered by the current unit tests and should be:
Describe the bug
In the future, pip will want a pyproject.toml file.
To Reproduce
When uploading a release to pip saw:
'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change.
Expected behavior
No deprecation warnings
Additional context
https://pip.pypa.io/en/stable/reference/build-system/pyproject-toml/
I am not sure if this turns out to be a bug, but I just wanted to show some plots I just made that intuitively seem a bit suspicious to me. I have attached a CSV data file of some SuperDARN gridded line-of-sight velocity data that can be read with the script below to reproduce the plot, which is also attached here as a PDF.
This apparently weird behavior might be due to incorrect use of the library, so I would be very grateful for any guidance if that is the case. My plot shows on the left the gridded SuperDARN vectors plotted on the MLT/MLAT grid. Overlaid is the AMPERE circle fit, where the statistical correction to the OCB has been applied (Burrell et al. 2020). In this case I think the blue line matches pretty well with structures also seen in the convection. However, representing this in the OCB frame (right panel) seems to disorganize things, which is sort of the opposite of the intention of using this in the first place. Vectors that appear inside the OCB on the left are outside the OCB on the right, and vice versa. In addition, some vectors show a very large rotation between the two frames, which seems unrealistic.
I think I am doing something wrong here, but cannot find the source of this discrepancy. My intention is to make statistics of the convection, normalized to the OCB, but I cannot proceed before I sort this out. I will be grateful for any advice!
I am running the master branch of ocbpy, installed using pip install ocbpy on September 10, 2020, on Ubuntu 20 using Anaconda 3 (Python 3.7.6).
Sample code to reproduce plot:
import ocbpy
import pandas as pd
import numpy as np
import datetime as dt
import matplotlib.pyplot as plt
def offset(mlt):  # Empirical formula from Burrell et al 2020 Anngeo
    o = 4.01 * (1. - 0.55**2) / (1 + 0.55 * np.cos((2. * np.pi - np.radians(mlt * 15.)) + 0.92))
    return o
#Global parameters
ocboffset = 3.34 #mean difference from Ampere radius to OCB
boundary = 74.34 # Based on typical size of oval for selected data interval
time = dt.datetime(2011,11,27,3,54)
#Load SuperDARN data
data = pd.read_csv('ocbpy_test.csv')
#OCBpy stuff
ocb = ocbpy.ocboundary.OCBoundary(instrument='ampere', stime=time,
etime=time + dt.timedelta(seconds=60), boundary_lat = boundary)
ocb.get_next_good_ocb_ind()
o_lat, o_mlt, r_corr = ocb.normal_coord(data.mlat, data.mlt)
dat_ind = np.arange(0, len(data.mlat))
pgr_vect = ocbpy.ocb_scaling.VectorData(dat_ind, ocb.rec_ind,
data.mlat, data.mlt,
aacgm_n=data.vlos_north, aacgm_e=data.vlos_east,
aacgm_mag=np.sqrt(data.vlos_north**2 + data.vlos_east**2),
dat_name='LoS Velocity',
dat_units='m s$^{-1}$',
scale_func=ocbpy.ocb_scaling.normal_evar)
pgr_vect.set_ocb(ocb)
#Plotting
fig = plt.figure()
ax = fig.add_subplot(121, polar=True) #Left panel just the gridded SD data and the AMPERE circle fit
theta = np.radians(np.array(data.mlt)*15)
r = np.array(90-data.mlat)
dr = np.array(data.vlos_east)
dn = np.array(data.vlos_north)  # Renamed to avoid shadowing the datetime alias, dt
ax.quiver(theta, r, dr * np.cos(theta) - dn * np.sin(theta), dr * np.sin(theta) + dn * np.cos(theta))
ax.set_theta_zero_location("S") # This makes midnight the bottom, dawn the right
# Plot the AMPERE circle fit, adding the statistical correction from Burrell
x0 = ocb.x[0]
y0 = ocb.y[0]
r = ocb.r[0]
x = np.linspace(-25,25,1000)
#obey the equation (x-x0)**2 + (y-y0)**2 = r**2
y1 = (2*y0 + np.sqrt(4*r**2 - 4*(x-x0)**2))/2
y2 = (2*y0 - np.sqrt(4*r**2 - 4*(x-x0)**2))/2
lat1 = 90 - np.sqrt(x**2 + y1**2)
lat2 = 90 - np.sqrt(x**2 + y2**2)
lat = np.concatenate([lat1,lat2])
mlt1 = np.arctan2(y1, x)*12/np.pi + 6
mlt2 = np.arctan2(y2, x)*12/np.pi + 6
mlt = np.concatenate([mlt1,mlt2])
ax.plot(np.radians(mlt*15), (90-(lat+offset(mlt))))
ax.scatter(np.arctan2(y0,x0)+np.pi/2, np.sqrt(x0**2 + y0**2), s=100)
#Right panel the converted measurements and a circle representing boundary_lat
ax2 = fig.add_subplot(122, polar=True)
theta = np.radians(np.array(o_mlt)*15)
r = np.array(90-o_lat)
dr = np.array(pgr_vect.ocb_e)
dn = np.array(pgr_vect.ocb_n)  # Renamed to avoid shadowing the datetime alias, dt
ax2.quiver(theta, r, dr * np.cos(theta) - dn * np.sin(theta), dr * np.sin(theta) + dn * np.cos(theta))
ax2.set_theta_zero_location("S")
ax2.plot(np.radians(np.linspace(0,24,100)*15), np.ones(100)*(90-boundary))
Is your feature request related to a problem? Please describe.
__repr__ functions for a class should be short and provide the information needed to reproduce the call. See https://www.tutorialspoint.com/str-vs-repr-in-python for an example. Currently, __repr__ and __str__ are the same in ocbpy classes.
Describe the solution you'd like
Keep __str__ and re-write __repr__ to be shorter and more informative. Update the tutorials and README examples appropriately.
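A minimal sketch of the split (the class name and attributes here are hypothetical, not ocbpy's actual API):

```python
class Boundary:
    """Hypothetical class sketching the __repr__/__str__ split."""

    def __init__(self, instrument, hemisphere=1):
        self.instrument = instrument
        self.hemisphere = hemisphere

    def __repr__(self):
        # Short and eval-able: just enough to reproduce the call
        return "Boundary({!r}, hemisphere={:d})".format(self.instrument,
                                                        self.hemisphere)

    def __str__(self):
        # Longer, human-readable summary for printing
        return "Boundary data from {:s} in the {:s} hemisphere".format(
            self.instrument,
            "northern" if self.hemisphere == 1 else "southern")


bnd = Boundary("image")
print(repr(bnd))  # Boundary('image', hemisphere=1)
```

Because the repr is eval-able, `eval(repr(bnd))` reconstructs an equivalent object, which is the convention the linked article describes.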
Is your feature request related to a problem? Please describe.
Coveralls shows that the special Windows file comparison tests for SuperMAG and vorticity data are not being run, but nothing is failing in the Windows CI tests.
Describe the solution you'd like
Only have necessary lines of code in the unit tests.
Describe alternatives you've considered
Leaving things as they are.
Additional context
In the past, the AppVeyor system setup for earlier versions of Python didn't support filecmp. However, things improve and it might be possible to use it now.
Describe the bug
The new individual IMAGE "ocb" files are actually the poleward edge of the auroral oval without the DMSP correction.
Expected behavior
A clear explanation of what they are.
To Fix
Unit testing coverage can be improved on the main library routines.
Setuptools is deprecating the test command in favour of tox. From the Travis-CI logs:
$ coverage run --source=ocbpy setup.py test
running test
WARNING: Testing via this command is deprecated and will be removed in a future version. Users looking for a generic test entry point independent of test runner are encouraged to use tox.
Information on tox: https://tox.readthedocs.io/en/latest/
Incorporating AACGMV2 would provide support for converting data from geographic coordinates to OCB coordinates.
Is your feature request related to a problem? Please describe.
Support for the ssj_auroral_boundary repository will be ending shortly.
Describe the solution you'd like
Boundary files for DMSP are available here: https://zenodo.org/record/3373812#.YjoIZTfMLBs
Describe alternatives you've considered
Attempting to maintain the repository here.
Additional context
Scientific packages are not well funded and their creators frequently need to leave the discipline in order to have a happy life. While this does limit the years in which DMSP boundaries are available, future repositories could also be added if more processing eventually happens. The files may also be available on Madrigal?
Is your feature request related to a problem? Please describe.
The GUVI model data is available for specific times in various files on CDAWeb (GUVI and SSUSI).
Describe the solution you'd like
Write a module to extract the necessary model data from these files using pysatNASA (pysat/pysatNASA#148) and use it as OCBs and/or EABs.
Describe alternatives you've considered
Additionally, it would be useful to create fits for the data in the GUVI and SSUSI files.
Additional context
This is not as elegant as running the GUVI model, but I don't have access to it.
Is your feature request related to a problem? Please describe.
As described in aburrell/apexpy#123 (review), there are better ways to do file handling than what I currently have implemented.
Describe the solution you'd like
Use the current best python practices
Describe alternatives you've considered
Leaving things as they are until it causes problems.
Describe the bug
The eccentricity has the wrong sign in ocbpy.ocb_correction.elliptical
To Fix
Change the sign of the coeff key "e" to be positive. The equation itself is coded correctly.
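To show why the sign matters, here is a sketch of an elliptical correction with the same functional form, r = a(1 - e²)/(1 + e·cos(θ - φ)); the coefficient values and key names are hypothetical, not the ones in ocbpy.ocb_correction:

```python
import numpy as np


def elliptical_offset(mlt, coeff):
    """Elliptical correction r = a * (1 - e^2) / (1 + e * cos(theta - phi))."""
    theta = 2.0 * np.pi - np.radians(np.asarray(mlt) * 15.0)
    return (coeff['a'] * (1.0 - coeff['e'] ** 2)
            / (1.0 + coeff['e'] * np.cos(theta - coeff['phi'])))


good = {'a': 4.01, 'e': 0.55, 'phi': -0.92}
flipped = dict(good, e=-0.55)

mlt = np.arange(0.0, 24.0, 0.5)
# Flipping the sign of 'e' moves the maximum offset by 12 hours of MLT, so
# the correction is applied on the wrong side of the oval
shift = 0.5 * abs(np.argmax(elliptical_offset(mlt, good))
                  - np.argmax(elliptical_offset(mlt, flipped)))
```

With these hypothetical coefficients, shift evaluates to 12 hours: a sign error in "e" mirrors the correction across the oval rather than scaling it slightly.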
Allow DMSP boundaries to be used instead of a fully specified boundary circle.
Incorporate greater instrument support by adding ability to work with pysat instrument objects
Logbook functionality isn't being used; replace logbook with logging.
Remove support for python 2.7. Conditions that must be met before this can be done:
Things to do to remove 2.7 support:
I need to find a windows computer to test this on
Code allows use of AMPERE boundaries, but this use is not consistent with future plans for adding equatorward boundaries. This could be countered by normalising the AMPERE boundaries using DMSP particle boundaries, as was done with IMAGE FUV.
Describe the issue
There are several classes and functions in version 0.3.0 that are scheduled to be removed in the 0.3.1+ release.
To Resolve
Remove the deprecated functions, classes, and submodules
ocbpy.ocboundary.OCBoundary
ocbpy.ocboundary.retrieve_all_good_indices
ocbpy.ocboundary.match_data_ocb
ocbpy.ocboundary
Remove the deprecated kwargs in new/existing classes and functions
ocbpy.instruments.pysat_instruments.add_ocb_to_data
ocbpy.instruments.vort.vort2ascii_ocb
ocbpy.instruments.supermag.supermag2ascii_ocb
ocbpy.cycle_boundary.match_data_ocb
ocbpy.cycle_boundary.retrieve_all_good_indices
ocbpy._boundary.OCBoundary.get_next_good_ocb_ind
Remove the tests for the deprecation warnings and deprecated functions and classes
test_ocboundary.py
test_pysat.py
test_supermag.py
test_vort.py
test_boundary_ocb.py
test_cycle_boundary.py
Is your feature request related to a problem? Please describe.
I'm always frustrated when there's a time period where boundary observations have been made, but OCBpy doesn't include them.
Describe the solution you'd like
Add particle-precipitation corrected boundaries for the Polar satellite observations to OCBpy
Describe alternatives you've considered
Leave it alone.
Additional context
I have a dataset of Polar boundary identifications and am working on pairing them with Newell boundaries.
Describe the bug
In the README installation example, the instructions for running the unit tests are now wrong.
Solution
Provide an example using unittest discover
The AMPERE boundaries I have are not yet openly available. Once the DMSP adjustment is made, work with S. Milan to make these open.
Due to machine precision, the variable "hav_pole_angle" can get a small but negative value. This results in NaN output on line 800 of ocb_scaling.py. I think a reasonable fix could be to take abs(hav_pole_angle) before passing it to the archav() function.
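A minimal sketch of the proposed fix, assuming archav is the inverse haversine, archav(h) = 2·arcsin(√h) (the real function lives in ocbpy):

```python
import numpy as np


def archav(hav):
    """Inverse haversine: archav(h) = 2 * arcsin(sqrt(h))."""
    return 2.0 * np.arcsin(np.sqrt(hav))


# Machine precision can make a quantity that is mathematically >= 0 come
# out slightly negative, and the square root of a negative number is NaN
hav_pole_angle = -1.0e-17

bad = archav(hav_pole_angle)        # NaN
good = archav(abs(hav_pole_angle))  # Proposed fix: zero to within precision
```

An alternative with the same effect is np.clip(hav_pole_angle, 0.0, 1.0), which also guards against rounding the argument slightly above one.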
Is your feature request related to a problem? Please describe.
The new IMAGE data will have new references. These need to be updated in the documentation and in the docstrings. However, this is waiting on the publication of a manuscript currently in preparation.
Describe the solution you'd like
Up-to-date references for the boundary data.
Describe alternatives you've considered
Leaving things out of date.
Additional context
The new release candidate should be created when the manuscripts are submitted, but not released until the manuscripts are published and the data is available for the docs.
Is your feature request related to a problem? Please describe.
A release candidate for pysat 3.0.0 is available, with a change in syntax. pysat/pysat#676
Describe the solution you'd like
The pysat instrument support and unit tests need to be updated.
Describe alternatives you've considered
Alternatively, we could require that pysat 2.2-2.3 be used.
The unit tests that rely on the pysat and DMSP SSJ modules aren't running on the CI platforms.
Is your feature request related to a problem? Please describe.
The normal_coord method allows geographic inputs, but the pysat_instrument functions do not allow the user to access this functionality.
Describe the solution you'd like
Add a kwarg to allow this access.
Describe alternatives you've considered
Adding a function to calculate magnetic coordinates using AACGMV2.
Additional context
It's better to use geographic coordinates, since their epoch won't be updated over time.
Is your feature request related to a problem? Please describe.
Requirements can be hard to keep track of. Requires.io monitors the requirements of your project and notifies you whenever a dependency is outdated. It is also a single place to view the changelogs of the different projects.
Describe the solution you'd like
Add a hook and badge for Requires.io: https://requires.io/
Describe alternatives you've considered
Keeping things as they are, to reduce the number of hooks that need to be maintained.
Additional context
Although this does add another hook, it's a maintenance one that seems unlikely to become vaporware.
Is your feature request related to a problem? Please describe.
The EAB is best used alone for data at lower latitudes
Describe the solution you'd like
Change the pysat/EAB example to plot the data outside the EAB
Describe alternatives you've considered
Add a new example
Additional context
It's best to not provide an example of the worst way to use a boundary.
Appveyor and Travis CI testing don't work with numpy versions above 1.15.4. This issue has been brought up to numpy: numpy/numpy#14012
Add issue and pull request templates