tsherwen / ac_tools
Module for working with global/regional Chemical Transport Model (CTM) output and observations
License: MIT License
Problem created in funcs4plotting.get_human_readable_gradiations.
Currently the AC_tools analysis code is split into two modules: one with the core functions and another with basic worked examples. There is duplication, and since MChem_tools embeds AC_tools as a submodule, this creates redundancy that should be removed.
Temporarily use the existing landmap files. Replace these with smaller files as soon as possible.
error message:
"
{'debug': False, 'res': '4x5', 'wd': None, 'time': None}
Folder does not exist
/LANDMAP_LWI_ctm
ERROR:root:Could not get the surface area!
Traceback (most recent call last):
...
raise ValueError("Could not find the surface area")
"
Running the test suite now raises errors. This has been the case since updating the module structure.
e.g. running py.test yields the following output:
========================================================== test session starts ===========================================================
platform linux -- Python 3.7.2, pytest-4.2.1, py-1.7.0, pluggy-0.8.1
rootdir: /mnt/lustre/users/ts551/labbook/Python_progs/AC_tools, inifile:
plugins: remotedata-0.3.1, openfiles-0.3.1, doctestplus-0.1.3, arraydiff-0.3
collected 35 items

test_AC_time.py .. [ 5%]
test_GEOSChem_bpch.py .s...FF. [ 28%]
test_bpch2netCDF.py sF [ 34%]
test_core.py .......... [ 62%]
test_funcs4pf.py . [ 65%]
test_generic.py . [ 68%]
test_plotting.py Essssssss [ 94%]
test_variables.py .. [100%]

================================================================= ERRORS =================================================================
________________________________________________ ERROR at setup of test_map_plot_default _________________________________________________

    @pytest.fixture()
    def test_data():
        from ..GEOSChem_bpch import get_GC_output
test_data = get_GC_output(wd, species='O3')
test_plotting.py:23:
wd = '../data', vars = None, species = 'O3', category = 'IJ-AVG-$', r_cubes = False, r_res = False, restore_zero_scaling = True
r_list = False, trop_limit = False, dtype = <class 'numpy.float32'>, use_NetCDF = True, verbose = False, debug = False

    def get_GC_output(wd, vars=None, species=None, category=None, r_cubes=False,
                      r_res=False, restore_zero_scaling=True, r_list=False,
                      trop_limit=False, dtype=np.float32, use_NetCDF=True,
                      verbose=False, debug=False):
        """
        Return data from a directory containing NetCDF/ctm.bpch files via PyGChem (>= 0.3.0)

        Parameters
        ----------
        vars (list): variables to extract (in NetCDF form, e.g. ['IJ_AVG_S__CO'])
            (alternately provide a single species and category through named input variables)
        species (str): species/tracer/variable (gamap) name
        category (str): diagnostic category (gamap) name
        r_cubes (bool): to return Iris Cubes, set to True
        r_res (bool): to return the resolution of the model NetCDF, set to True
        r_list (bool): to return data as a list of arrays rather than a single array
        restore_zero_scaling (bool): restores scale to ctm.bpch standard (e.g. v/v not pptv)
        trop_limit (bool): limit to "chemical troposphere" (level 38 of model)
        dtype (type): type of variable to be returned
        use_NetCDF (bool): set to True to use NetCDF rather than an Iris cube of output
        verbose (bool): legacy debug option, replaced by python logging
        debug (bool): legacy debug option, replaced by python logging

        Returns
        -------
        (np.array) of requested output or object of data (Iris cube)

        Notes
        -----
        - Core functionality of function: ctm.bpch files are extracted for a given
          directory (to a NetCDF in the directory), with only the specific category
          and species returned. This can either be as an Iris Cube (retaining
          metadata) or as a numpy array.
        - Credit for PyGChem: Ben Bovy - https://github.com/benbovy/PyGChem
        - Examples and use of pygchem are discussed on Ben Bovy's GitHub
          ( https://github.com/benbovy/PyGChem_examples/blob/master/Tutorials/Datasets.ipynb )
        - This replaces the now defunct AC_tools functions: open_ctm_bpch and get_gc_data_np
        - For simplicity, use variable names (species, category) as used in the Iris
          cube (e.g. IJ_AVG_S__CO). If a variable is unknown, just print the full
          dataset extracted to screen to see active diagnostics.
        - Species and category variables are maintained (and translated) to allow for
          backwards compatibility with functions written for pygchem version 0.2.0
        """
        # bjn
        # This function is not completely clear to me, and could do with a re-write.
        # The try command would probably be useful here for large parts.
        # logging would also be good for replacing debug.
        logging.info("Called get_GC_output")
        logging.debug("get_GC_output inputs:")
        logging.debug(locals())
        if not isinstance(vars, type(None)):
            logging.info('Opening >{}<, for var: >{}<'.format(wd, ','.join(vars)) +
                         '(extra) gamap variables provided: >{}< + >{}<'.format(
                             category, species))
        # Also option to use gamap names (species + category) and func converts these
        # to iris cube names
        if any([(not isinstance(i, type(None))) for i in (species, category)]):
            # convert to Iris Cube name
            if (category == None) and (vars == None):
                category = "IJ-AVG-$"
            if (species == None) and (vars == None):
                species = 'O3'
            # remove scaling for 'IJ-AVG-$' - KLUDGE - needed for all?
            if category == 'IJ-AVG-$':
category = diagnosticname_gamap2iris(category)
E NameError: name 'diagnosticname_gamap2iris' is not defined
../GEOSChem_bpch.py:622: NameError
----------------------------------------------------------- Captured log setup -----------------------------------------------------------
GEOSChem_bpch.py 602 INFO Called get_GC_output
GEOSChem_bpch.py 603 DEBUG get_GC_output inputs:
GEOSChem_bpch.py 604 DEBUG {'wd': '../data', 'vars': None, 'species': 'O3', 'category': None, 'r_cubes': False, 'r_res': False, 'restore_zero_scaling': True, 'r_list': False, 'trop_limit': False, 'dtype': <class 'numpy.float32'>, 'use_NetCDF': True, 'verbose': False, 'debug': False}
================================================================ FAILURES ================================================================
___________________________________________________________ test_get_GC_output ___________________________________________________________

    def test_get_GC_output():
        arr = get_GC_output(wd=wd, species='O3', category='IJ_AVG_S')
        assert isinstance(arr, np.ndarray), 'GC output is not a numpy array'
assert round(arr.sum(), 6) == round(
0.14242639, 6), "The ozone budget doesnt seem correct({bud})".format(bud=arr.sum())
E AssertionError: The ozone budget doesnt seem correct(0.14242638647556305)
E assert 0.142426 == 0.142426
E + where 0.142426 = round(0.14242639, 6)
E + where 0.14242639 = <built-in method sum of numpy.ndarray object at 0x7f0bc868abc0>()
E + where <built-in method sum of numpy.ndarray object at 0x7f0bc868abc0> = array([[[[2.4482466e-08],\n [2.4782828e-08],\n [2.5001304e-08],\n ...,\n [1.0980792e-06],\n...877e-08],\n ...,\n [1.0854923e-06],\n [2.3876410e-07],\n [6.5473643e-08]]]], dtype=float32).sum
E + and 0.142426 = round(0.14242639, 6)

test_GEOSChem_bpch.py:70: AssertionError
----------------------------------------------------------- Captured log call ------------------------------------------------------------
GEOSChem_bpch.py 602 INFO Called get_GC_output
GEOSChem_bpch.py 603 DEBUG get_GC_output inputs:
GEOSChem_bpch.py 604 DEBUG {'wd': '../../data', 'vars': None, 'species': 'O3', 'category': 'IJ_AVG_S', 'r_cubes': False, 'r_res': False, 'restore_zero_scaling': True, 'r_list': False, 'trop_limit': False, 'dtype': <class 'numpy.float32'>, 'use_NetCDF': True, 'verbose': False, 'debug': False}
GEOSChem_bpch.py 644 DEBUG Opening netCDF file ../../data/ctm.nc
GEOSChem_bpch.py 650 DEBUG opening variable IJ_AVG_S__O3
variables.py 44 DEBUG Getting unit scaling for ppbv
______________________________________________________ test_get_HEMCO_output_for_WD ______________________________________________________

    def test_get_HEMCO_output_for_WD():
arr = get_HEMCO_output(wd=wd, vars='ALD2_TOTAL')
test_GEOSChem_bpch.py:76:
../GEOSChem_bpch.py:512: in get_HEMCO_output
hemco_to_netCDF(wd, hemco_file_list=files)
folder = '../../data', hemco_file_list = None, remake = False
    def hemco_to_netCDF(folder, hemco_file_list=None, remake=False):
        """
        Combine HEMCO diagnostic output files to a single NetCDF file.

        Parameters
        ----------
        remake (bool): overwrite existing NetCDF file
        """
        if __package__ is None:
            from .bpch2netCDF import get_folder
        else:
            from .bpch2netCDF import get_folder
        folder = get_folder(folder)
        output_file = os.path.join(folder, 'hemco.nc')
        # If the hemco netCDF file already exists then quit unless remake=True
        if not remake:
            if os.path.exists(output_file):
                logging.warning(output_file + ' already exists, not remaking')
                return
        logging.info("Combining hemco diagnostic files")
        # By default look for any files that look like hemco diagnostic files:
        # Look for all hemco netcdf files then remove the restart files.
        if hemco_file_list == None:
            hemco_files = glob.glob(folder + '/*HEMCO*.nc')
            for filename in hemco_files:
                if "restart" in filename:
                    hemco_files.remove(filename)
        else:
            file_list = []
            for hemco_file in hemco_file_list:
                full_path = os.path.join(folder, hemco_file)
                if not os.path.exists(full_path):
                    logging.error(full_path + " could not be found")
                    raise IOError(
                        "{path} could not be found".format(path=full_path))
                file_list.append(full_path)
            hemco_files = file_list
        if len(hemco_files) == 0:
            logging.warning("No hemco diagnostic files found in {_dir}"
                            .format(_dir=folder))
        else:
            logging.debug("The following hemco files were found:")
            logging.debug(str(hemco_files))
        # Use iris cubes to combine the data into an output file
hemco_data = iris.load(hemco_files)
E NameError: name 'iris' is not defined
../bpch2netCDF.py:126: NameError
----------------------------------------------------------- Captured log call ------------------------------------------------------------
GEOSChem_bpch.py 497 INFO Called get hemco output.
bpch2netCDF.py 96 INFO Combining hemco diagnostic files
bpch2netCDF.py 121 DEBUG The following hemco files were found:
bpch2netCDF.py 122 DEBUG ['../../data/HEMCO_Diagnostics.nc']
____________________________________________________________ test_get_folder _____________________________________________________________

    def test_get_folder():
        logging.info("beginning test")
folder = get_folder(test_file_dir)
test_bpch2netCDF.py:83:
folder = '../data'
    def get_folder(folder):
        """
        Get name of folder that contains ctm.bpch data from command line
        """
        if isinstance(folder, type(None)):
            # getting the folder location from system argument
            if len(sys.argv) <= 1:
                logging.warning("No folder location specified for the data")
                folder = os.getcwd()
            else:
                folder = str(sys.argv[1])
        # Check folder exists
        if not os.path.exists(folder):
            print("Folder does not exist")
            print(folder)
            sys.exit()
E SystemExit
../bpch2netCDF.py:284: SystemExit
---------------------------------------------------------- Captured stdout call ----------------------------------------------------------
Folder does not exist
../data
----------------------------------------------------------- Captured log call ------------------------------------------------------------
test_bpch2netCDF.py 82 INFO beginning test
============================================================ warnings summary ============================================================
test_GEOSChem_bpch.py:8
  /mnt/lustre/users/ts551/labbook/Python_progs/AC_tools/AC_tools/Tests/test_GEOSChem_bpch.py:8: PytestDeprecationWarning: the pytest.config global is deprecated. Please use request.config or pytest_configure (if you're a pytest plugin) instead.
    not pytest.config.getoption("--slow"),

test_bpch2netCDF.py:17
  /mnt/lustre/users/ts551/labbook/Python_progs/AC_tools/AC_tools/Tests/test_bpch2netCDF.py:17: PytestDeprecationWarning: the pytest.config global is deprecated. Please use request.config or pytest_configure (if you're a pytest plugin) instead.
    not pytest.config.getoption("--slow"),

test_plotting.py:12
  /mnt/lustre/users/ts551/labbook/Python_progs/AC_tools/AC_tools/Tests/test_plotting.py:12: PytestDeprecationWarning: the pytest.config global is deprecated. Please use request.config or pytest_configure (if you're a pytest plugin) instead.
    not pytest.config.getoption("--slow"),

-- Docs: https://docs.pytest.org/en/latest/warnings.html
================================== 3 failed, 21 passed, 10 skipped, 3 warnings, 1 error in 4.67 seconds ==================================
Barron Henderson (EPA) has updated xarray to use PseudoNetCDF as a backend to open ctm.bpch files. PyGChem previously used an iris backend to read ctm.bpch; however, iris is/was plagued with compatibility issues and PyGChem is no longer maintained. AC_tools should now use xarray's PseudoNetCDF backend.
Allow multiple types of input, such as a single bpch file, a single planeflight file, or a folder.
funcs4GEOSC.get_O3_burden expects a 4D array to be returned from get_GC_output.
In the test suite it is failing because the test netCDF file covers only 1 month and so has no time dimension.
Is this a problem with get_GC_output or with get_O3_burden?
Should ctm data with a single time step have a null dimension inserted, or should get_GC_output be able to accept 3-dimensional data?
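If the first option is chosen, inserting the null (singleton) time dimension is a one-liner with numpy; a minimal sketch, assuming a (lon, lat, lev[, time]) axis ordering (the helper name is made up):

```python
import numpy as np

def ensure_time_dim(arr):
    """Add a singleton time axis to 3-D (lon, lat, lev) output so that
    downstream code can always assume 4-D (lon, lat, lev, time) data."""
    if arr.ndim == 3:
        return arr[..., np.newaxis]  # same data, shape (..., 1)
    return arr
```

This way get_O3_burden can stay unchanged and single-month files behave like multi-month ones.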
The ctm.bpch conversion test function (test_convert_to_netCDF) is currently failing with the error below. The "test_files" directory is not being generated when "py.test --slow" is run from fresh; is this the issue?
========================================================================== FAILURES ==========================================================================
___________________________________________________________________ test_convert_to_netCDF ___________________________________________________________________
@slow
def test_convert_to_netCDF():
logging.info("beginning test")
# Recreate a ctm.nc file and confirm it is the same
logging.debug("Creating the temp netCDF file")
convert_to_netCDF(folder=test_file_dir, bpch_file_list=['test.bpch'], remake=True, filename='test.nc')
datafile = os.path.join(test_file_dir, 'ctm.nc')
testfile = os.path.join(test_file_dir, 'test.nc')
logging.debug("Comparing the temp netCDF file to the origional")
assert file_comparison(datafile, testfile), \
'bpch converter failed to replicate the original file.'
E AssertionError: bpch converter failed to replicate the original file.
E assert file_comparison('../data/ctm.nc', '../data/test.nc')
test_bpch2netCDF.py:61: AssertionError
-------------------------------------------------------------------- Captured stdout call --------------------------------------------------------------------
Creating a netCDF from 1 file(s). This can take some time...
============================================================ 1 failed, 33 passed in 92.23 seconds ============================================================
The 74, bendev, and dev.red branches were combined into a single dev branch, dev.main. Conflicts were resolved for map_plot whilst retaining backwards compatibility; however, new updates to map_plot could not be included. A plot_map function has therefore been created. Updates to ticks/colorbars/defaults/general plotting tidiness should be applied to this, and eventually map_plot will be retired.
In a plot that shows the difference between two things, 0 is a shade of red as opposed to white.
Hi,
I installed AC_tools by running
python setup.py install
However, I get this information when I try to import AC_tools:
ephem package not installed
failed to import geopandas
failed to import rasterio
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/glade/u/home/lixujin/AC_tools/AC_tools/__init__.py", line 12, in <module>
from . mask import *
File "/glade/u/home/lixujin/AC_tools/AC_tools/mask.py", line 33, in <module>
from affine import Affine
ModuleNotFoundError: No module named 'affine'
So I installed affine, ephem, geopandas and rasterio in the environment, expecting this bug to be fixed.
But I still get the error information below. I wonder if you have any suggestions on this kind of installation error.
>>> import AC_tools
failed to import geopandas
failed to import rasterio
The packages in my environment are listed below. I hope it helps.
# Name Version Build Channel
_libgcc_mutex 0.1 main
ac-tools 0.0.1 pypi_0 pypi
affine 2.3.0 py_0 anaconda
antlr-python-runtime 4.7.2 py37_1000 conda-forge
asn1crypto 0.24.0 py37_1003 conda-forge
atomicwrites 1.3.0 pypi_0 pypi
attrs 19.2.0 pypi_0 pypi
basemap 1.2.0 py37h673bf1a_2 conda-forge
basemap-data-hires 1.2.0 0
bokeh 1.3.4 py37_0 conda-forge
boost-cpp 1.70.0 ha2d47e9_1 conda-forge
boto3 1.9.241 py_0 conda-forge
botocore 1.12.241 py_0 conda-forge
bzip2 1.0.8 h516909a_1 conda-forge
ca-certificates 2019.9.11 hecc5488_0 conda-forge
cairo 1.14.12 h8948797_3
cartopy 0.17.0 py37h0aa2c8f_1004 conda-forge
certifi 2019.9.11 py37_0 conda-forge
cf-units 2.1.3 py37hc1659b7_0 conda-forge
cffi 1.12.3 py37h8022711_0 conda-forge
cftime 1.0.3.4 py37hd352d35_1001 conda-forge
chardet 3.0.4 py37_1003 conda-forge
click 7.0 py_0 conda-forge
click-plugins 1.1.1 py_0 conda-forge
cligj 0.5.0 py_0 conda-forge
cloudpickle 1.2.2 py_0 conda-forge
cryptography 2.7 py37h72c5cf5_0 conda-forge
curl 7.65.3 hf8cf82a_0 conda-forge
cycler 0.10.0 py_1 conda-forge
cytoolz 0.10.0 py37h516909a_0 conda-forge
dask 2.5.0 py_0 conda-forge
dask-core 2.5.0 py_0 conda-forge
dbus 1.13.2 h714fa37_1
distributed 2.5.1 py_0 conda-forge
docutils 0.15.2 py37_0 conda-forge
ephem 3.7.7.0 py37h7b6447c_0 anaconda
expat 2.2.6 he6710b0_0
fiona 1.8.4 py37heb36068_1001 conda-forge
fontconfig 2.13.0 h9420a91_0
freetype 2.10.0 he983fc9_1 conda-forge
freexl 1.0.5 h14c3975_1002 conda-forge
fsspec 0.5.1 py_0 conda-forge
gdal 2.3.3 py37hbb2a789_0
geopandas 0.6.0 py_0 conda-forge
geos 3.7.1 hf484d3e_1000 conda-forge
gettext 0.19.8.1 hc5be6a0_1002 conda-forge
giflib 5.1.9 h516909a_0 conda-forge
glib 2.56.2 had28632_1001 conda-forge
gst-plugins-base 1.14.0 hbbd80ab_1
gstreamer 1.14.0 hb453b48_1
hdf4 4.2.13 h9a582f1_1002 conda-forge
hdf5 1.10.4 nompi_h3c11f04_1106 conda-forge
heapdict 1.0.1 py_0 conda-forge
icu 58.2 hf484d3e_1000 conda-forge
idna 2.8 py37_1000 conda-forge
importlib-metadata 0.23 pypi_0 pypi
iris 2.2.1 py37_0 conda-forge
jinja2 2.10.1 py_0 conda-forge
jmespath 0.9.4 py_0 conda-forge
jpeg 9c h14c3975_1001 conda-forge
json-c 0.13.1 h14c3975_1001 conda-forge
kealib 1.4.10 h1978553_1003 conda-forge
kiwisolver 1.1.0 py37hc9558a2_0 conda-forge
krb5 1.16.3 h05b26f9_1001 conda-forge
libblas 3.8.0 12_openblas conda-forge
libcblas 3.8.0 12_openblas conda-forge
libcurl 7.65.3 hda55be3_0 conda-forge
libdap4 3.19.1 h6ec2957_0
libedit 3.1.20181209 hc058e9b_0 anaconda
libffi 3.2.1 he1b5a44_1006 conda-forge
libgcc-ng 9.1.0 hdf63c60_0
libgdal 2.3.3 h2e7e64b_0
libgfortran 3.0.0 1 conda-forge
libgfortran-ng 7.3.0 hdf63c60_0
libiconv 1.15 h516909a_1005 conda-forge
libkml 1.3.0 h4fcabce_1010 conda-forge
liblapack 3.8.0 12_openblas conda-forge
libnetcdf 4.6.2 hbdf4f91_1001 conda-forge
libopenblas 0.3.7 h6e990d7_1 conda-forge
libpng 1.6.37 hed695b0_0 conda-forge
libpq 11.4 h4e4e079_0 conda-forge
libspatialindex 1.9.0 he1b5a44_1 conda-forge
libspatialite 4.3.0a hb5ec416_1026 conda-forge
libssh2 1.8.2 h22169c7_2 conda-forge
libstdcxx-ng 9.1.0 hdf63c60_0
libtiff 4.0.10 h57b8799_1003 conda-forge
libuuid 1.0.3 h1bed415_2
libxcb 1.13 h14c3975_1002 conda-forge
libxml2 2.9.9 h13577e0_2 conda-forge
locket 0.2.0 py_2 conda-forge
lz4-c 1.8.3 he1b5a44_1001 conda-forge
markupsafe 1.1.1 py37h14c3975_0 conda-forge
matplotlib 2.2.3 py37hb69df0a_0
more-itertools 7.2.0 pypi_0 pypi
msgpack-python 0.6.2 py37hc9558a2_0 conda-forge
munch 2.3.2 py_0 conda-forge
ncurses 6.1 hf484d3e_1002 conda-forge
netcdf4 1.5.1.2 py37had58050_0 conda-forge
numpy 1.17.2 py37h95a1406_0 conda-forge
olefile 0.46 py_0 conda-forge
openjpeg 2.3.1 h21c5421_1 conda-forge
openssl 1.1.1c h516909a_0 conda-forge
owslib 0.18.0 py_0 conda-forge
packaging 19.2 py_0 conda-forge
pandas 0.25.1 py37hb3f55d8_0 conda-forge
partd 1.0.0 py_0 conda-forge
pcre 8.43 he6710b0_0
pillow 6.2.0 py37h6b7be26_0 conda-forge
pip 19.2.3 py37_0 conda-forge
pixman 0.38.0 h516909a_1003 conda-forge
pluggy 0.13.0 pypi_0 pypi
poppler 0.72.0 h2fc8fa2_1000 conda-forge
poppler-data 0.4.9 1 conda-forge
proj4 5.2.0 he1b5a44_1006 conda-forge
psutil 5.6.3 py37h516909a_0 conda-forge
pthread-stubs 0.4 h14c3975_1001 conda-forge
py 1.8.0 pypi_0 pypi
pycparser 2.19 py37_1 conda-forge
pyepsg 0.4.0 py_0 conda-forge
pyhdf 0.10.1 py37h3a4e923_1 conda-forge
pykdtree 1.3.1 py37h3010b51_1002 conda-forge
pyke 1.1.1 py37_1000 conda-forge
pyopenssl 19.0.0 py37_0 conda-forge
pyparsing 2.4.2 py_0 conda-forge
pyproj 1.9.6 py37h516909a_1002 conda-forge
pyqt 5.9.2 py37h05f1152_2
pyshp 2.1.0 py_0 conda-forge
pysocks 1.7.1 py37_0 conda-forge
pytest 5.2.0 pypi_0 pypi
python 3.7.3 h5b0a415_0 conda-forge
python-dateutil 2.8.0 py_0 conda-forge
pytz 2019.2 py_0 conda-forge
pyyaml 5.1.2 py37h516909a_0 conda-forge
qt 5.9.7 h5867ecd_1
rasterio 1.0.21 py37hc38cc03_0
readline 7.0 hf8c457e_1001 conda-forge
requests 2.22.0 py37_1 conda-forge
rtree 0.8.3 py37h666c49c_1002 conda-forge
s3transfer 0.2.1 py37_0 conda-forge
scipy 1.3.1 py37h921218d_2 conda-forge
setuptools 41.2.0 py37_0 conda-forge
shapely 1.6.4 py37h06cd6f9_1005 conda-forge
sip 4.19.8 py37hf484d3e_1000 conda-forge
six 1.12.0 py37_1000 conda-forge
snuggs 1.4.7 py_0 conda-forge
sortedcontainers 2.1.0 py_0 conda-forge
sqlite 3.28.0 h8b20d00_0 conda-forge
tblib 1.4.0 py_0 conda-forge
tk 8.6.9 hed695b0_1003 conda-forge
toolz 0.10.0 py_0 conda-forge
tornado 6.0.3 py37h516909a_0 conda-forge
udunits2 2.2.27.6 h4e0c4b3_1001 conda-forge
urllib3 1.25.6 py37_0 conda-forge
wcwidth 0.1.7 pypi_0 pypi
wheel 0.33.6 py37_0 conda-forge
xarray 0.13.0 pypi_0 pypi
xerces-c 3.2.2 hea5cb30_1003 conda-forge
xorg-libxau 1.0.9 h14c3975_0 conda-forge
xorg-libxdmcp 1.1.3 h516909a_0 conda-forge
xz 5.2.4 h14c3975_1001 conda-forge
yaml 0.1.7 h14c3975_1001 conda-forge
zict 1.0.0 py_0 conda-forge
zipp 0.6.0 pypi_0 pypi
zlib 1.2.11 h516909a_1006 conda-forge
zstd 1.4.0 h3b9ef0a_0 conda-forge
Lixu
Example:
PAN = Species('PAN')
PAN.rmm = 20
PAN.name = CNMEPSLE (Chemistry...)
PAN.LaTeX = CA$_2$OMNS
PAN.InChI
PAN.carbons = get_carbons(self.InChI)
The CSV format from MChem_tools works well enough to adapt.
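A minimal sketch of how a CSV-backed Species class could work, parsing the table once at import time rather than per instance; the column names and values below are illustrative placeholders, not the actual MChem_tools schema:

```python
import csv
import io

# Hypothetical species table; in practice this would be read from the CSV file
# once when the module is imported, then reused by every Species() call.
_SPECIES_CSV = """name,rmm,carbons
PAN,121.05,2
O3,48.0,0
"""

_SPECIES_TABLE = {row["name"]: row
                  for row in csv.DictReader(io.StringIO(_SPECIES_CSV))}

class Species:
    """Look species properties up in the pre-parsed table (no per-call I/O)."""

    def __init__(self, name):
        row = _SPECIES_TABLE[name]
        self.name = name
        self.rmm = float(row["rmm"])
        self.carbons = int(row["carbons"])
```

Derived properties (e.g. carbons from InChI) could be added as computed attributes on top of the same table.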
Conda no longer appears to be able to build the base environment from the shipped yaml file.
To resolve this, a few modules used only for more complex polygon geospatial masking have been dropped from the default yaml file provided (listed below; #121). The choice of Conda channels was also updated to use 'defaults' instead of 'conda-forge' (commit #120).
These changes mean that Conda can now again quickly build the base environment from the yaml file using the standard approach (below, and detailed on Conda's website):
conda env create -f environment.yml
A long-term solution to allow the polygon masking to be used again still needs to be tested and implemented. In the meantime, users will simply have to install these additional packages themselves if they wish to use the functions that require them.
Various other dependencies could be swapped for core distributions and/or more up-to-date packages (e.g. wget, requests, BeautifulSoup). Other less-used packages (e.g. xesmf) could be encouraged to be installed as required.
Currently, creating a Species class object requires reading a CSV and doing some minor processing. This should be done offline to speed up class creation, and other alternatives should be explored too.
Currently, legacy code means a mix of camel case and lower case names is in use. Decide on an approach (camel case?), then use it consistently.
Also, some legacy functions have unnecessarily protracted names (e.g. below). Update these to allow for cleaner code. Wrapper functions for the core functions can be retained as a stopgap approach (e.g. as done for get_stats4RunDict_as_df versus the older get_general_stats4run_dict_as_df).
AC.rm_fractional_troposphere() could be AC.rm_frac_trop()
This will make many external uses of AC_tools no longer backwards compatible. However, older releases of the code can simply be checked out to use the older code base.
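The stopgap wrapper approach can look like the sketch below; the function body is a placeholder, only the renaming pattern (new short name plus a deprecated thin wrapper) is the point:

```python
import warnings

def rm_frac_trop(arr):
    """New, shorter name; the real logic would live here."""
    return arr  # placeholder body

def rm_fractional_troposphere(arr):
    """Deprecated wrapper retained for back compatibility."""
    warnings.warn(
        "rm_fractional_troposphere is deprecated; use rm_frac_trop",
        DeprecationWarning, stacklevel=2)
    return rm_frac_trop(arr)
```

Emitting DeprecationWarning gives external users a release cycle to migrate before the long names are removed.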
Include a set of core diagnostics for NetCDF GEOS-Chem output, as are present for bpch output.
This will require updates to return xr.Datasets in the same form as in the NetCDFs. Currently, the datasets/arrays created follow Met Office conventions (not the CF/COARDS format).
These functions are present in the separate GEOSChem_nc.py file (linked below), and the existing functions for working with bpch output have been moved to GEOSChem_bpch.py:
https://github.com/tsherwen/AC_tools/blob/master/AC_tools/GEOSChem_nc.py
Include tests for these in the unit testing, plus some example files in the remote.
Add a script to create benchmark comparisons for two 1-month or 1-year runs. This would generate zonal, surface, and column plots for 47- and 72-level GC output for a few key species (OH, HO2, O3, etc.) or a longer list. A change-log text file would also be created that lists area- and mass-weighted changes.
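The change-log statistics could be sketched as below; a minimal numpy example, where the grid shape and the simple weighting are illustrative (a real script would use the model's grid-box areas and air masses):

```python
import numpy as np

def area_weighted_mean(field, area):
    """Area-weighted mean of a 2-D surface field."""
    return float((field * area).sum() / area.sum())

def pct_change(ref, dev, area):
    """Area-weighted percentage change between a reference and a dev run."""
    ref_mean = area_weighted_mean(ref, area)
    dev_mean = area_weighted_mean(dev, area)
    return 100.0 * (dev_mean - ref_mean) / ref_mean
```

Mass-weighted changes follow the same pattern with the air-mass array in place of area.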
Currently convert_to_netCDF will not work on directories that contain a mixture of timestamp files (ts*.bpch), ctm output (ctm.bpch or trac), HEMCO output, etc. It should be updated to process only the file type that convert_to_netCDF is called for, with an option to process all output.
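One way to restrict processing to a single file family is filename patterns; the pattern set below simply mirrors the examples named above and is an assumption, not the definitive list:

```python
import fnmatch

# Illustrative patterns for the output families mentioned above
PATTERNS = {
    "timestamp": "ts*.bpch",
    "ctm": "ctm.bpch*",
    "hemco": "*HEMCO*.nc",
}

def select_files(filenames, kind):
    """Keep only the filenames matching the requested output family."""
    return [f for f in filenames if fnmatch.fnmatch(f, PATTERNS[kind])]
```

convert_to_netCDF could then take a kind argument (or "all" to loop over every family) and only feed the matching files to the converter.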
Add a check to make sure the online data is not different from the actual data.
Options could be an md5 checksum, or just a simple size comparison.
A size comparison is recommended for now.
Hey guys,
I've installed AC_tools in a python 2.7.5 environment with all the required packages installed (iris 1.13.0, PyGChem 0.3.0, etc.). But when I import AC_tools in python, the bug report is as follows:
import AC_tools as AC
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/BaiXiaoxuan/anaconda3/envs/py2_env/lib/python2.7/site-packages/AC_tools-0.1.1-py2.7.egg/AC_tools/__init__.py", line 7, in <module>
    from . core import *
  File "/home/BaiXiaoxuan/anaconda3/envs/py2_env/lib/python2.7/site-packages/AC_tools-0.1.1-py2.7.egg/AC_tools/core.py", line 12, in <module>
    from netCDF4 import Dataset
  File "/home/BaiXiaoxuan/anaconda3/envs/py2_env/lib/python2.7/site-packages/netCDF4/__init__.py", line 3, in <module>
    from ._netCDF4 import *
ImportError: /home/BaiXiaoxuan/anaconda3/envs/py2_env/lib/python2.7/site-packages/netCDF4/../../.././libcurl.so.4: undefined symbol: SSL_COMP_free_compression_methods
Does anybody know what's wrong? Possibly the versions of the packages conflict?
By the way, I have to use AC_tools in python2 because of the bpch2nc function :)
And my conda list:
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
ac-tools 0.1.1 pypi_0 pypi
asn1crypto 1.4.0 pyh9f0ad1d_0 conda-forge
backports_abc 0.5 py_1 conda-forge
biggus 0.15.0 py_1 conda-forge
blas 1.1 openblas conda-forge
bokeh 0.13.0 py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
bottleneck 1.2.1 py27h3010b51_1001 conda-forge
brotlipy 0.7.0 py27h516909a_1000 conda-forge
cairo 1.12.18 6 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
cartopy 0.17.0 py27h0aa2c8f_1004 conda-forge
certifi 2019.11.28 py27h8c360ce_1 conda-forge
cf_units 1.2.0 py27_0 conda-forge
cffi 1.14.0 py27hd463f26_0 conda-forge
cftime 1.0.3.4 py27hd352d35_1001 conda-forge
chardet 3.0.4 py27h8c360ce_1006 conda-forge
cloudpickle 0.4.0 py_1 conda-forge
colorama 0.4.4 pyh9f0ad1d_0 conda-forge
conda 4.8.3 py27h8c360ce_1 conda-forge
conda-package-handling 1.6.0 py27hdf8410d_2 conda-forge
cryptography 2.2.1 py27_0 conda-forge
curl 7.52.1 0 conda-forge
cycler 0.10.0 py27_0 conda-forge
cyordereddict 1.0.0 py27h516909a_1002 conda-forge
dask 0.15.2 py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
dbus 1.13.6 he372182_0 conda-forge
ecmwf_grib 1.23.1 0 conda-forge
enum34 1.1.10 py27h8c360ce_1 conda-forge
ephem 3.7.6.0 py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
expat 2.4.8 h27087fc_0 conda-forge
fontconfig 2.11.1 6 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
freetype 2.5.5 2 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
functools32 3.2.3.2 py_3 conda-forge
futures 3.3.0 py27h8c360ce_1 conda-forge
geos 3.7.1 hf484d3e_1000 conda-forge
gettext 0.19.8.1 hf34092f_1004 conda-forge
glib 2.58.3 py27he9b9f4b_1003 conda-forge
gst-plugins-base 1.14.0 hbbd80ab_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
gstreamer 1.14.0 hb453b48_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
h5netcdf 0.4.0 py27_0 conda-forge
h5py 2.7.0 np111py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
hdf4 4.2.12 0 conda-forge
hdf5 1.8.17 11 conda-forge
icu 54.1 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
idna 2.10 pyh9f0ad1d_0 conda-forge
ipaddress 1.0.23 py_0 conda-forge
iris 1.13.0 py27h24bf2e0_2 conda-forge
iris-grib 0.10.1 py27_0 conda-forge
jasper 1.900.1 1 conda-forge
jbig 2.1 h7f98852_2003 conda-forge
jinja2 2.11.3 pyh44b312d_0 conda-forge
jpeg 9e h166bdaf_1 conda-forge
krb5 1.13.2 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
libblas 3.9.0 15_linux64_openblas conda-forge
libcblas 3.9.0 15_linux64_openblas conda-forge
libffi 3.2.1 he1b5a44_1007 conda-forge
libgcc 7.2.0 h69d50b8_2 conda-forge
libgcc-ng 12.1.0 h8d9b700_16 conda-forge
libgfortran 3.0.0 1 conda-forge
libgfortran-ng 12.1.0 h69a702a_16 conda-forge
libgfortran5 12.1.0 hdcd56e2_16 conda-forge
libgomp 12.1.0 h8d9b700_16 conda-forge
libiconv 1.16 h516909a_0 conda-forge
liblapack 3.9.0 15_linux64_openblas conda-forge
libmo_unpack 3.1.2 hf484d3e_1001 conda-forge
libnetcdf 4.4.1 0 conda-forge
libopenblas 0.3.20 h043d6bf_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libpng 1.6.37 h21135ba_2 conda-forge
libsolv 0.7.22 h6239696_0 conda-forge
libstdcxx-ng 12.1.0 ha89aaad_16 conda-forge
libtiff 4.0.6 7 conda-forge
libxcb 1.13 h7f98852_1004 conda-forge
libxml2 2.9.4 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
libzlib 1.2.12 h166bdaf_0 conda-forge
locket 1.0.0 pyhd8ed1ab_0 conda-forge
mamba 0.1.0 py27hc9558a2_2 conda-forge
markupsafe 1.1.1 py27hdf8410d_1 conda-forge
matplotlib 1.5.1 np111py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
mo_pack 0.2.0 py27h3010b51_1003 conda-forge
nc-time-axis 1.2.0 py_0 conda-forge
ncurses 5.9 10 conda-forge
netcdf4 1.2.4 np111py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
numpy 1.11.3 py27_blas_openblas_202 [blas_openblas] conda-forge
openblas 0.2.19 2 conda-forge
openssl 1.0.1k 1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
owslib 0.18.0 py_0 conda-forge
packaging 16.8 py27_0 conda-forge
pandas 0.22.0 py27_1 conda-forge
partd 0.3.8 py27_0 conda-forge
pcre 8.45 h9c3ff4c_0 conda-forge
pillow 3.2.0 py27_1 conda-forge
pip 20.0.2 py27_1 conda-forge
pixman 0.32.6 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
proj4 5.2.0 he1b5a44_1006 conda-forge
pthread-stubs 0.4 h36c2ea0_1001 conda-forge
py 1.11.0 pyh6c4a22f_0 conda-forge
pycairo 1.10.0 py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
pycosat 0.6.3 py27hdf8410d_1004 conda-forge
pycparser 2.21 pyhd8ed1ab_0 conda-forge
pyepsg 0.4.0 py_0 conda-forge
pygchem 0.3.0.dev0 pypi_0 pypi
pykdtree 1.3.1 py27h3010b51_1002 conda-forge
pyke 1.1.1 py27h8c360ce_1002 conda-forge
pyopenssl 19.0.0 py27_0 conda-forge
pyparsing 2.4.7 pyh9f0ad1d_0 conda-forge
pyproj 1.9.6 py27h516909a_1002 conda-forge
pyqt 4.11.4 py27_3 conda-forge
pyshp 2.1.3 pyh44b312d_0 conda-forge
pysocks 1.7.1 py27h8c360ce_1 conda-forge
pytest 2.8.3 py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/menpo
python 2.7.5 3 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
python-dateutil 2.8.1 py_0 conda-forge
python-ecmwf_grib 1.23.1 py27_0 conda-forge
python_abi 2.7 1_cp27mu conda-forge
pytz 2019.3 py_0 conda-forge
pyyaml 5.3.1 py27hdf8410d_0 conda-forge
qt 4.8.7 2 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
readline 6.2 0 conda-forge
requests 2.25.1 pyhd3deb0d_0 conda-forge
ruamel_yaml 0.15.80 py27hdf8410d_1001 conda-forge
scipy 0.19.1 py27_blas_openblas_202 [blas_openblas] conda-forge
setuptools 36.4.0 py27_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
shapely 1.6.4 py27h2afed24_1004 conda-forge
singledispatch 3.6.1 pyh44b312d_0 conda-forge
sip 4.18 py27_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
six 1.16.0 pyh6c4a22f_0 conda-forge
sqlite 3.13.0 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
subprocess32 3.5.4 py27h516909a_0 conda-forge
system 5.8 2 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
tk 8.5.18 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
toolz 0.8.2 py27_1 conda-forge
tornado 5.1.1 py27h14c3975_1000 conda-forge
tqdm 4.64.0 pyhd8ed1ab_0 conda-forge
udunits2 2.2.28 hc3e0081_0 conda-forge
urllib3 1.26.9 pyhd8ed1ab_0 conda-forge
wheel 0.37.1 pyhd8ed1ab_0 conda-forge
xarray 0.10.0 py27_0 conda-forge
xorg-libxau 1.0.9 h7f98852_0 conda-forge
xorg-libxdmcp 1.1.3 h7f98852_0 conda-forge
xz 5.2.5 h516909a_1 conda-forge
yaml 0.2.5 h7f98852_2 conda-forge
zlib 1.2.12 h166bdaf_0 conda-forge
The install_requires argument to setup in setup.py doesn't appear to correctly capture the minimal dependencies required for the package to be used. In addition to the packages listed as requirements in setup.py, I needed to install the following packages into my Python environment in order to allow the AC_tools package to be successfully imported:
Compounding the issue, installation of Cartopy requires the installation of both GEOS and Proj4, dependencies which aren't satisfied when installing Cartopy from PyPI through a call to pip install. This could potentially be alienating for new users of the package, depending on how straightforward the installation process for these libraries is on their operating system.
Although the above dependencies allow AC_tools to be imported, the user is still presented with warnings about the following packages not being present in the environment:
I assume that, since these aren't causing failures in the importing of AC_tools, these modules aren't critical to package functionality, and provide optional features.
I think a simple solution to the above would be to provide a Conda environment.yaml file at the top level of this repository, which specifies the Python and non-Python dependencies for the project (all available from conda-forge). It would seem that PyPI is not necessarily a sensible target for an eventual home for this package, due to the poor handling of non-Python dependencies there.
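A sketch of what such an environment.yaml might look like. The package list below is illustrative only, drawn from the dependencies discussed in this issue; exact names and version pins would need to be confirmed against what AC_tools actually imports:

```yaml
# environment.yaml -- illustrative sketch, not a tested specification
name: ac_tools
channels:
  - conda-forge
dependencies:
  - python>=3.5
  - numpy
  - scipy
  - pandas
  - xarray
  - dask
  - netcdf4
  - matplotlib
  - cartopy   # pulls in GEOS and Proj from conda-forge, avoiding the pip issue above
  - pytest
```

Creating the environment would then be a single `conda env create -f environment.yaml`, sidestepping the PyPI handling of non-Python dependencies.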
Apologies if I've missed something obvious here!
The current headers in the AC_tools files are inconsistent and carry over information from previous versions. This requires updates to all function modules.
@tsherwen Do you mind if I reduce the default number of lat/lon ticks and colorbar points?
Can we add latitude_bnds back into the netCDF files? It is used in:
funcs4core.get_latlonalt4res()
which is called by funcs4plotting.map_plot.
The tests are failing because of this.
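Until latitude_bnds is restored in the files, one possible fallback is to reconstruct cell bounds from the latitude centres. This is a minimal sketch of that idea; the function name and approach are illustrative, not the actual AC_tools API:

```python
import numpy as np

def guess_lat_bounds(lat_centres):
    """Reconstruct latitude cell bounds from cell centres.

    Edges are placed halfway between neighbouring centres, with the
    outermost edges extrapolated and clipped to the poles (so half-polar
    grids like GEOS-Chem 4x5 stay valid). Returns an (n, 2) array of
    [lower, upper] bounds per cell.
    """
    lat = np.asarray(lat_centres, dtype=float)
    edges = np.concatenate((
        [lat[0] - 0.5 * (lat[1] - lat[0])],   # extrapolate southern edge
        0.5 * (lat[:-1] + lat[1:]),           # midpoints between centres
        [lat[-1] + 0.5 * (lat[-1] - lat[-2])],  # extrapolate northern edge
    ))
    edges = np.clip(edges, -90.0, 90.0)       # clip to the poles
    return np.stack((edges[:-1], edges[1:]), axis=1)
```

For the southernmost 4x5 half-polar cells centred at -89, -86, -82, this gives bounds of [-90, -87.5], [-87.5, -84], [-84, -80].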
Build up a list of links where AC_tools has been used in publications etc. and add it to a README page or wiki. The repository has a citable Zenodo DOI which should be included in publications that use this repository.
AC_tools has been used in almost all post-2015 papers on my Google Scholar profile, as lead or co-author (~30 of ~40).
Some recent examples are:
Carpenter, LJ, Chance, RJ, Sherwen, T, Adams, TJ, Ball, SM, Evans, MJ, Hepach, H, Hollis, LDJ, Hughes, C, Jickells, TD, Mahajan, A, Stevens, DP, Tinel, L & Wadley, MR 2021, 'Marine iodine emissions in a changing world', Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 477, no. 2247, 20200824. https://doi.org/10.1098/rspa.2020.0824
Chance, R.J., Tinel, L., Sherwen, T., Baker, A.R., Bell, T., Brindle, J., Campos, M.L.A., Croot, P., Ducklow, H., Peng, H. and Hopkins, F., 2019. Global sea-surface iodide observations, 1967–2018. Scientific data, 6(1), pp.1-8.
Wang, X., Jacob, D. J., Downs, W., Zhai, S., Zhu, L., Shah, V., Holmes, C. D., Sherwen, T., Alexander, B., Evans, M. J., Eastham, S. D., Neuman, J. A., Veres, P., Koenig, T. K., Volkamer, R., Huey, L. G., Bannan, T. J., Percival, C. J., Lee, B. H., and Thornton, J. A.: Global tropospheric halogen (Cl, Br, I) chemistry and its impact on oxidants, Atmos. Chem. Phys. Discuss. [preprint], https://doi.org/10.5194/acp-2021-441, in review, 2021.
Sherwen, T., Chance, R. J., Tinel, L., Ellis, D., Evans, M. J., and Carpenter, L. J.: A machine-learning-based global sea-surface iodide distribution, Earth Syst. Sci. Data, 11, 1239–1262, https://doi.org/10.5194/essd-11-1239-2019, 2019.
Papers where code or outputs using AC_tools have been included, but without co-authorship:
Hackenberg, S. C., et al. (2017), Potential controls of isoprene in the surface ocean, Global Biogeochem. Cycles, 31, 644–662, doi:10.1002/2016GB005531.
Chance, R, Tinel, LLG, Sarkar, A, Sinha, A, Mahajan, A, Chacko, R, Sabu, P, Roy, R, Jickells, T, Stevens, D, Wadley, M & Carpenter, LJ 2020, 'Surface Inorganic Iodine Speciation in the Indian and Southern Oceans From 12°N to 70°S', Frontiers in Marine Science, pp. 1-16. https://doi.org/10.3389/fmars.2020.00621
A core bottleneck is the scalability of reads of many NetCDF files via xarray/dask.
This has been improved by many updates. However, this remains a core bottleneck for analysis.
Pre-xarray, direct reading and handling of NetCDF files was faster and more scalable. However, the benefits brought by using xarray/dask are very notable.
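For reference, the multi-file read pattern in question is xarray's open_mfdataset. This is a self-contained sketch (the files, variable name O3, and chunk size are fabricated purely for illustration) of how it lazily combines many NetCDF files, which is where the scaling cost arises:

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Fabricate a few tiny NetCDF files so the example is self-contained.
tmpdir = tempfile.mkdtemp()
for i in range(3):
    ds = xr.Dataset(
        {"O3": ("time", np.full(2, float(i)))},
        coords={"time": [2 * i, 2 * i + 1]},
    )
    ds.to_netcdf(os.path.join(tmpdir, f"part_{i}.nc"))

# open_mfdataset lazily concatenates the files along their coordinates;
# with many large files this is the step whose scalability is discussed
# above, and chunking controls how dask parallelises the reads.
combined = xr.open_mfdataset(
    os.path.join(tmpdir, "part_*.nc"), combine="by_coords"
)
```

Tuning the `chunks` argument (and keeping operations lazy until a final `.compute()`) is usually where most of the read performance is won or lost.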
I am trying to convert a binary punch (bpch) file to netCDF using AC_tools, but the "convert_to_netCDF" function gives me the following error:
File "/home/i/irs11/miniconda3/lib/python3.7/site-packages/AC_tools/bpch2netCDF.py", line 209, in bpch_to_netCDF
bpch_data = datasets.load(bpch_files)
NameError: name 'datasets' is not defined
I'm using the following packages: python 3.7, iris 2.2.0, numpy 1.17.2, netcdf4 1.5.3, ...
I don't have the pygchem package because I am not able to install it in Python 3.7. Consequently, I've tried to use an environment with Python 2.7, but in that case AC_tools needs python>=3.5 to be installed. So I got stuck.
Any suggestion?
We are missing a landmap file for the 4x5 resolution.
py.test now fails for AC_tools on "test_get_HEMCO_output_for_WD". I think the data is not included in the download list?
"
ts551@earth:Tests$ py.test
Downloading file. This might take some time.
Download complete.
Downloading file. This might take some time.
Download complete.
Downloading file. This might take some time.
Download complete.
Downloading file. This might take some time.
Download complete.
==================================================== test session starts ====================================================
platform linux2 -- Python 2.7.12 -- py-1.4.27 -- pytest-2.7.1
rootdir: /work/home/ts551/labbook/Python_progs/AC_tools, inifile:
collected 34 items
test_bpch2netCDF.py s.
test_funcs4GEOSC.py .s....F.
test_funcs4core.py .........
test_funcs4generic.py .
test_funcs4pf.py .
test_funcs4plotting.py .ssssssss
test_funcs4time.py ..
test_funcs4vars.py ..
========================================================= FAILURES ==========================================================
_______________________________________________ test_get_HEMCO_output_for_WD ________________________________________________
def test_get_HEMCO_output_for_WD():
arr = get_HEMCO_output(wd=wd, vars='ALD2_TOTAL')
test_funcs4GEOSC.py:75:
../funcs4GEOSC.py:740: in get_HEMCO_output
HEMCO_data = Dataset(fname, 'r')
???
E IOError: No such file or directory
netCDF4/_netCDF4.pyx:1811: IOError
====================================== 1 failed, 23 passed, 10 skipped in 5.59 seconds ======================================
"
Allow a list of variables to be read in as a dictionary, instead of the current approach (e.g. AC.GC_var()).
Currently, variables are stored in an inline list/dict in variables.py, but this would be cleaner if stored in a YAML file.
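A minimal sketch of what that could look like, using PyYAML. The tracer names and metadata fields below are illustrative placeholders, not the actual AC_tools variable definitions:

```python
import yaml  # PyYAML

# Illustrative stand-in for a variables.yml file shipped with the package.
EXAMPLE_YAML = """
O3:
  long_name: Ozone
  units: ppbv
NO2:
  long_name: Nitrogen dioxide
  units: pptv
"""

def load_var_metadata(text=EXAMPLE_YAML):
    """Parse variable metadata from YAML into a dict keyed by tracer name."""
    return yaml.safe_load(text)
```

The calling code (e.g. a replacement for AC.GC_var()) would then just look tracers up in the returned dict, and adding a variable becomes a YAML edit rather than a code change.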