Comments (8)
When updating xarray from 0.15.0 to 0.16.0, the error changes to:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-1-9024338e9664> in <module>
9 url = "abfs://majodata/M040219.1H-01/calibration/left.nc"
10 # url = '/gscratch/home/a-banijh/data/CPH/M040219.1H-01/phase_two/01/left.nc'
---> 11 with fsspec.open(url, **STORAGE_OPTIONS) as f:
12 # x = f.readlines()
13 ds = xr.open_dataset(f)
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/fsspec/core.py in open(urlpath, mode, compression, encoding, errors, protocol, newline, **kwargs)
267 ``OpenFile`` object.
268 """
--> 269 return open_files(
270 [urlpath],
271 mode,
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/fsspec/core.py in open_files(urlpath, mode, compression, encoding, errors, name_function, num, protocol, newline, **kwargs)
197 List of ``OpenFile`` objects.
198 """
--> 199 fs, fs_token, paths = get_fs_token_paths(
200 urlpath,
201 mode,
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/fsspec/core.py in get_fs_token_paths(urlpath, mode, num, name_function, storage_options, protocol)
375 "share the same protocol"
376 )
--> 377 cls = get_filesystem_class(protocol)
378 optionss = list(map(cls._get_kwargs_from_urls, urlpath))
379 paths = [cls._strip_protocol(u) for u in urlpath]
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/fsspec/registry.py in get_filesystem_class(protocol)
99 raise ValueError("Protocol not known: %s" % protocol)
100 bit = known_implementations[protocol]
--> 101 registry[protocol] = _import_class(bit["class"])
102 cls = registry[protocol]
103 if getattr(cls, "protocol", None) in ("abstract", None):
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/fsspec/registry.py in _import_class(cls, minv)
112 minversion = minv.get(mod, None)
113
--> 114 mod = importlib.import_module(mod)
115 if minversion:
116 version = getattr(mod, "__version__", None)
~/miniconda3/envs/majoanalysis/lib/python3.8/importlib/__init__.py in import_module(name, package)
125 break
126 level += 1
--> 127 return _bootstrap._gcd_import(name[level:], package, level)
128
129
~/miniconda3/envs/majoanalysis/lib/python3.8/importlib/_bootstrap.py in _gcd_import(name, package, level)
~/miniconda3/envs/majoanalysis/lib/python3.8/importlib/_bootstrap.py in _find_and_load(name, import_)
~/miniconda3/envs/majoanalysis/lib/python3.8/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)
~/miniconda3/envs/majoanalysis/lib/python3.8/importlib/_bootstrap.py in _load_unlocked(spec)
~/miniconda3/envs/majoanalysis/lib/python3.8/importlib/_bootstrap_external.py in exec_module(self, module)
~/miniconda3/envs/majoanalysis/lib/python3.8/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/adlfs/__init__.py in <module>
----> 1 from .spec import AzureDatalakeFileSystem
2 from .spec import AzureBlobFileSystem, AzureBlobFile
3 from ._version import get_versions
4
5 __all__ = ["AzureBlobFileSystem", "AzureBlobFile", "AzureDatalakeFileSystem"]
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/adlfs/spec.py in <module>
18 from azure.storage.blob._models import BlobBlock, BlobProperties
19 from fsspec import AbstractFileSystem
---> 20 from fsspec.asyn import (
21 sync,
22 AsyncFileSystem,
~/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/fsspec/asyn.py in <module>
6 import threading
7
----> 8 from .utils import other_paths
9 from .spec import AbstractFileSystem
10
ImportError: cannot import name 'other_paths' from 'fsspec.utils' (/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/fsspec/utils.py)
edit: updating to master fsspec brings back the old error message.
edit2: The error does not occur with v0.4.0 of adlfs!
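The ImportError above is a version mismatch: the installed adlfs imports fsspec.asyn, which in turn needs fsspec.utils.other_paths from a newer fsspec than the one in the environment. Before digging further, it can help to print what is actually installed and to fail fast with a clearer message. A stdlib-only sketch (dist_version and require_attr are hypothetical helpers, not part of adlfs or fsspec):

```python
import importlib
from importlib import metadata


def dist_version(name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None


def require_attr(module_name, attr):
    """Import module_name and check it exposes attr, raising a clearer
    hint than the bare ImportError in the traceback above."""
    mod = importlib.import_module(module_name)
    if not hasattr(mod, attr):
        raise ImportError(
            f"{module_name!r} has no {attr!r}; the installed version is "
            f"probably too old for the packages that depend on it"
        )
    return getattr(mod, attr)


# Print what the environment actually has (None means not installed):
for pkg in ("fsspec", "adlfs", "xarray"):
    print(pkg, dist_version(pkg))
```

With that, a guard like require_attr("fsspec.utils", "other_paths") would have pointed straight at the stale fsspec instead of surfacing deep inside the adlfs import chain.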
Are you able to provide a minimal example of what the file is? I see it's a .nc file, but I'm curious how it was created, e.g. share xr.show_versions(). In fact, can you test a simple .nc file? Seeing libraries like h5netcdf in the traceback makes this hard to debug.
I may have found the bug. Can you test it with the branch labeled "readinto_branch"?
This worked for me ok:
$ conda create -n test_env python=3.8
$ conda activate test_env
$ pip install xarray adlfs scipy
$ python
>>> import xarray as xr
>>> import adlfs
>>> import fsspec
>>>
>>> fs = adlfs.AzureBlobFileSystem(account_name='ACCOUNT_NAME', account_key='ACCOUNT_KEY')
>>> da = xr.DataArray(1)
>>> da.to_netcdf('da.nc')
>>> fs.put('da.nc', 'tmp/da.nc')
>>>
>>> url = "abfs://tmp/da.nc"
>>> STORAGE_OPTIONS = {'account_name': 'ACCOUNT_NAME', 'account_key': 'ACCOUNT_KEY'}
>>> with fsspec.open(url, **STORAGE_OPTIONS) as f:
... ds = xr.open_dataset(f)
>>> ds.__xarray_dataarray_variable__.values
array(1)
@hayesgb, yes, I can confirm that the code on the readinto_branch branch works for me too! 🎉
Fixed in release v0.5.1
@hayesgb, thanks for the quick fix! 🎉
@hayesgb, from time to time I still get this error:
WARNING:azure.core.pipeline.policies._distributed_tracing:Unable to start network span: maximum recursion depth exceeded in __instancecheck__
ERROR:aiohttp.internal:Exception in eof callback
Traceback (most recent call last):
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/streams.py", line 164, in on_eof
callback()
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/client_reqrep.py", line 897, in _response_eof
self._connection.release()
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/connector.py", line 177, in release
self._connector._release(
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/connector.py", line 622, in _release
self._release_acquired(key, protocol)
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/connector.py", line 608, in _release_acquired
self._drop_acquired_per_host(key, proto)
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/connector.py", line 365, in _drop_acquired_per_host
if key not in acquired_per_host:
File "<attrs generated hash aiohttp.client_reqrep.ConnectionKey>", line 2, in __hash__
return hash((
RecursionError: maximum recursion depth exceeded while calling a Python object
ERROR:aiohttp.internal:Exception in eof callback
Traceback (most recent call last):
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/streams.py", line 164, in on_eof
callback()
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/client_reqrep.py", line 897, in _response_eof
self._connection.release()
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/connector.py", line 179, in release
should_close=self._protocol.should_close)
File "/gscratch/home/a-banijh/miniconda3/envs/majoanalysis/lib/python3.8/site-packages/aiohttp/client_proto.py", line 48, in should_close
not self._payload.is_eof() or self._upgraded):
RecursionError: maximum recursion depth exceeded
Not sure if it is the same issue, and I cannot easily reproduce it.
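If the RecursionError turns out to be transient (it surfaces from aiohttp's connection-release path at end-of-stream), one pragmatic workaround while it is being investigated is to retry the failing read. A minimal stdlib sketch (with_retries is a hypothetical helper, not an adlfs or fsspec API):

```python
import time


def with_retries(fn, attempts=3, delay=1.0, retry_on=(RecursionError, OSError)):
    """Call fn(); on one of the listed (assumed transient) errors,
    wait and try again, re-raising after the final attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except retry_on:
            if attempt == attempts:
                raise
            time.sleep(delay)
```

Usage might look like ds = with_retries(lambda: xr.open_dataset(f)), wrapping only the call that intermittently fails. If the error is deterministic rather than transient, retrying won't help and the traceback should go upstream to aiohttp.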