pytroll / pytroll-examples

Collection of examples for pytroll satellite data processing

License: GNU General Public License v3.0

Jupyter Notebook 99.26% HTML 0.74% Shell 0.01%

pytroll-examples's Introduction

This is the sandbox area for the pytroll project, an international cooperation 
on a future distributed real-time processing system for Meteorological Satellite
Data.



------------------------------------
December 2010

Lars Ørum Rasmussen, Esben Sigård Nielsen, Kristian Rune Larsen, 
Martin Raspaud, Anna Geidne, Adam Dybbroe.

Danish Meteorological Institute (DMI)
Swedish Meteorological and Hydrological Institute (SMHI)

pytroll-examples's People

Contributors

adybbroe, benblob688, benr0, djhoese, gerritholl, mitkin, mraspaud, pnuu, roquetp, sfinkens, sjoro, zxdawn


pytroll-examples's Issues

Add CI travis jobs using treon

The treon project (https://github.com/ReviewNB/treon) was recently brought to my attention. It would let us run notebooks as "tests" more easily and report how many succeeded and how many failed. If we create Travis jobs that install the newest versions of the pytroll packages (satpy, pyspectral, etc.) and run on a cron schedule (every week?), this would really help show us which examples are failing or out of date.

It would also be useful for testing dependency changes that break satpy usage but may be out of scope for our packages to test directly (e.g. creating cartopy plots from satpy data).
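A minimal sketch of what such a job configuration could look like, assuming the treon CLI and a hand-picked package list (note that cron scheduling is configured in the Travis repository settings, not in this file):

```yaml
language: python
python:
  - "3.8"
install:
  # Pull in the latest released pytroll packages plus treon
  - pip install treon satpy pyspectral pyresample
script:
  # treon executes every notebook it finds under the given path
  # and reports how many passed and how many failed
  - treon .
```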

Add a notebook about different resampling methods

Add a notebook that demonstrates all the different resampling methods available in Pytroll. For each method, show how to use it directly with Pyresample, as well as the corresponding call via Satpy.

Possible example to demonstrate manipulating data

My group and I are learning Satpy, and one of our requirements was to manipulate HRIT data with some limb-correction schemes. It took us several hours to figure out that we needed satpy.get to access the data array. Would it be possible to include an example (for any satellite platform) that demonstrates how to access and change the data array using satpy.get, etc.?
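For what it's worth, in recent Satpy versions `scn['dataset_name']` returns an `xarray.DataArray`, so manipulation goes through the xarray/numpy interface. A minimal sketch with a synthetic array standing in for a loaded channel (the dataset name, values and threshold are all made up):

```python
import numpy as np
import xarray as xr

# Stand-in for something like arr = scn['IR_108'] after scn.load(['IR_108'])
arr = xr.DataArray(np.linspace(200.0, 300.0, 12).reshape(3, 4),
                   dims=('y', 'x'), attrs={'units': 'K'})

# Access the raw numpy values if needed
values = arr.values

# Modify the dataset, e.g. mask everything at or below 210 K
arr = arr.where(arr > 210.0)

# Writing the result back into the Scene would be: scn['IR_108'] = arr
print(np.isnan(arr.values[0, 0]))  # True: the 200.0 K pixel was masked
```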

FY4A_agri_introduction(zh-CN).ipynb needs agri_l1_4000m_geo. How can I fix this?

When I run:
filenames = glob.glob('F:/FY4A/AGRI/20220701/FY4A-_AGRI--_N_DISK_1047E_L1-_FDI-_MULT_NOM_20220701060000_20220701061459_4000M_V0001.HDF')
scn = Scene(filenames, reader='agri_fy4a_l1')
composite = 'true_color'
scn.load([composite])

there are two problems. How can I fix them?
Required file type 'agri_l1_4000m_geo' not found or loaded for 'satellite_zenith_angle'
Required file type 'agri_l1_4000m_geo' not found or loaded for 'solar_zenith_angle'
Required file type 'agri_l1_4000m_geo' not found or loaded for 'solar_azimuth_angle'
Required file type 'agri_l1_4000m_geo' not found or loaded for 'satellite_azimuth_angle'

ProxyError: HTTPSConnectionPool(host='zenodo.org', port=443): Max retries exceeded with url: /record/1288441/files/pyspectral_atm_correction_luts_no_aerosol.tgz (Caused by ProxyError('Your proxy appears to only use HTTP and not HTTPS, try changing your proxy URL to be HTTP. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#https-proxy-error-http-proxy', SSLError(SSLError(1, '[SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1131)'))))

coastline/borders

Hi, I would like some help. I want to add coastlines and/or country borders to my scenes from netCDF data. Could I do it with pycoast or geoviews?
Any help or simple code would be appreciated.
Thanks in advance.

Unable to create a true color image

I tried to generate a true-color RGB composite from Himawari Standard Data (HSD), but I can't. Here is the code.

rgbname = 'true_color'

scn.load([rgbname])

[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_zenith_angle: Unknown dataset solar_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional satellite_azimuth_angle: Unknown dataset satellite_azimuth_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional satellite_zenith_angle: Unknown dataset satellite_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_azimuth_angle: Unknown dataset solar_azimuth_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_zenith_angle: Unknown dataset solar_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_zenith_angle: Unknown dataset solar_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional satellite_azimuth_angle: Unknown dataset satellite_azimuth_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional satellite_zenith_angle: Unknown dataset satellite_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_azimuth_angle: Unknown dataset solar_azimuth_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_zenith_angle: Unknown dataset solar_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_zenith_angle: Unknown dataset solar_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_zenith_angle: Unknown dataset solar_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional satellite_azimuth_angle: Unknown dataset satellite_azimuth_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional satellite_zenith_angle: Unknown dataset satellite_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_azimuth_angle: Unknown dataset solar_azimuth_angle
[DEBUG: 2019-10-06 14:06:34 : satpy.node] Skipping optional solar_zenith_angle: Unknown dataset solar_zenith_angle
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Band number = 2
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Time_interval: 2019-10-02 00:50:20.332751 - 2019-10-02 00:50:53.688037
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Reading time 0:00:00.007624
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Calibration time 0:00:00.002801
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Band number = 2
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Time_interval: 2019-10-02 00:50:53.688037 - 2019-10-02 00:51:40.372615
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Reading time 0:00:00.006510
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Calibration time 0:00:00.004917
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Band number = 2
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Time_interval: 2019-10-02 00:51:40.372615 - 2019-10-02 00:52:37.641198
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Reading time 0:00:00.006763
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Calibration time 0:00:00.002606
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Band number = 2
[DEBUG: 2019-10-06 14:06:34 : ahi_hsd] Time_interval: 2019-10-02 00:52:37.641198 - 2019-10-02 00:54:03.914060
...

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-65-871a5986e45a> in <module>
----> 1 scn.load(['true_color'])

~/anaconda3/lib/python3.7/site-packages/satpy/scene.py in load(self, wishlist, calibration, resolution, polarization, level, generate, unload, **kwargs)
    967         self.read(**kwargs)
    968         if generate:
--> 969             keepables = self.generate_composites()
    970         else:
    971             # don't lose datasets we loaded to try to generate composites

~/anaconda3/lib/python3.7/site-packages/satpy/scene.py in generate_composites(self, nodes)
    882         nodes = set(self.dep_tree.trunk(nodes=required_nodes)) - \
    883             set(self.datasets.keys())
--> 884         return self._read_composites(nodes)
    885 
    886     def _remove_failed_datasets(self, keepables):

~/anaconda3/lib/python3.7/site-packages/satpy/scene.py in _read_composites(self, compositor_nodes)
    856         keepables = set()
    857         for item in compositor_nodes:
--> 858             self._generate_composite(item, keepables)
    859         return keepables
    860 

~/anaconda3/lib/python3.7/site-packages/satpy/scene.py in _generate_composite(self, comp_node, keepables)
    831             composite = compositor(prereq_datasets,
    832                                    optional_datasets=optional_datasets,
--> 833                                    **self.attrs)
    834 
    835         cid = DatasetID.from_dict(composite.attrs)

~/anaconda3/lib/python3.7/site-packages/satpy/composites/__init__.py in __call__(self, projectables, **info)
    402             if self.max_sza is not None:
    403                 coszen = coszen.where(coszen >= self.max_sza_cos)
--> 404             self.coszen[key] = coszen
    405         elif coszen is None:
    406             # we were given the SZA, calculate the cos(SZA)

~/anaconda3/lib/python3.7/weakref.py in __setitem__(self, key, value)
    166         if self._pending_removals:
    167             self._commit_removals()
--> 168         self.data[key] = KeyedRef(value, self._remove, key)
    169 
    170     def copy(self):

~/anaconda3/lib/python3.7/weakref.py in __new__(type, ob, callback, key)
    335 
    336     def __new__(type, ob, callback, key):
--> 337         self = ref.__new__(type, ob, callback)
    338         self.key = key
    339         return self

TypeError: cannot create weak reference to 'DataArray' object

However, setting "rgbname" to "true_color_raw" will create a composite image.

Example suggestion: Remap data to Robinson projection starting with cartopy area

Here is an example of remapping data to the Robinson projection, using the cartopy definition to get the border around the globe correct. A pyresample area definition is created from the cartopy definition for the remapping. There is also a workaround to remove data outside the globe border, which is most likely caused by a bug in proj or maybe pyproj. I am hoping to find time to turn this into a notebook.

import copy

import cartopy.crs as ccrs
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
from pyresample import geometry as prg
from pyresample.kd_tree import resample_nearest

# Define area from cartopy to get globe borders
crs = ccrs.Robinson()
crs.bounds = (crs.x_limits[0], crs.x_limits[1], 
              crs.y_limits[0], crs.y_limits[1])

# Create pyresample area_def object for resampling
area_def = prg.AreaDefinition('robinson',
                              'robinson',
                              'robinson',
                              projection=crs.proj4_params,
                              width=1000, height=500, 
                              area_extent=(crs.x_limits[0],
                                           crs.y_limits[0],
                                           crs.x_limits[1],
                                           crs.y_limits[1]))

# Make test data
xi = np.linspace(-179, 179, 1000)
yi = np.linspace(-90, 90, 500)
lons, lats = np.meshgrid(xi, yi)
data = lons*6.0 + lats*3.0

# Remap to Robinson projection
swath_def = prg.SwathDefinition(lons=lons, lats=lats)
result = resample_nearest(
    swath_def, data, area_def,
    radius_of_influence=500*1000*2.5, fill_value=None)

# Create colormap with nan's not visible
my_cmap = copy.copy(matplotlib.cm.BrBG)
my_cmap.set_bad('1.0', alpha=0)
my_cmap.set_over('red', alpha=1)

# Hack to remove data in corners
lon, lat = area_def.get_lonlats()
yshape, xshape = result.data.shape
result.data[:,0:xshape//2][lon[:,0:xshape//2]>0] = np.nan
result.data[:,xshape//2:][lon[:,xshape//2:]<0] = np.nan

# Plot image with repeated data in corners masked
fig = plt.figure()
ax = fig.subplots(nrows=1, subplot_kw={'projection': crs})
ax.coastlines()
ax.imshow(result, transform=crs, extent=crs.bounds, cmap=my_cmap)
plt.savefig('test_robinson.png',  bbox_inches='tight')
plt.show()

ValueError: No reader(s) named: ['safe_msi']

I am trying to open satellite data from Sentinel-2 with satpy.
I am using satpy version 0.20.0, installed from the conda-forge channel with Anaconda on Windows.

When I use:

files = find_files_and_readers(base_dir=path, reader='safe_msi')

I get the error:

ValueError: No reader(s) named: ['safe_msi']

Is the reader name outdated? Is it the folder? I have tried almost every level of subfolders...

I tried it with data of processing levels 1C and 2A.

outdated sentinel-1 example

I'm very new to satpy, and I'm trying to use Sentinel-1 data.

The sentinel1-false-color example is outdated, and I am not able to reproduce it. I'm using satpy 0.28.1 from conda-forge.

I don't understand what arguments I have to provide to scn.load. Here is my attempt:

from satpy import available_readers, Scene, find_files_and_readers
import datetime

my_files = find_files_and_readers(base_dir='/windows_shared/tmp',
                                  reader='sar-c_safe',
                                  start_time=datetime.datetime(2018, 10, 13, 6, 0, 0),
                                  end_time=datetime.datetime(2018, 10, 13, 7, 30, 0))

scn = Scene(reader='sar-c_safe', filenames=my_files)

scn.load(scn.available_dataset_ids())
AttributeError: 'SAFEXML' object has no attribute 'read_xml_array'

Could not load dataset 'DataID(name='latitude', polarization='hh', resolution=80, modifiers=())': "Could not load DataID(name='latitude', polarization='hh', resolution=80, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 841, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 713, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 698, in _load_dataset
    raise KeyError(
KeyError: "Could not load DataID(name='latitude', polarization='hh', resolution=80, modifiers=()) from any provided files"
Could not load dataset 'DataID(name='latitude', polarization='hv', resolution=80, modifiers=())': "Could not load DataID(name='latitude', polarization='hv', resolution=80, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 841, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 713, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 698, in _load_dataset
    raise KeyError(
KeyError: "Could not load DataID(name='latitude', polarization='hv', resolution=80, modifiers=()) from any provided files"
Could not load dataset 'DataID(name='longitude', polarization='hh', resolution=80, modifiers=())': "Could not load DataID(name='longitude', polarization='hh', resolution=80, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 841, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 713, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 698, in _load_dataset
    raise KeyError(
KeyError: "Could not load DataID(name='longitude', polarization='hh', resolution=80, modifiers=()) from any provided files"
Could not load dataset 'DataID(name='longitude', polarization='hv', resolution=80, modifiers=())': "Could not load DataID(name='longitude', polarization='hv', resolution=80, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 841, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 713, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 698, in _load_dataset
    raise KeyError(
KeyError: "Could not load DataID(name='longitude', polarization='hv', resolution=80, modifiers=()) from any provided files"
Could not load dataset 'DataID(name='gamma_squared', polarization='hh', resolution=80, modifiers=())': "Could not load DataID(name='gamma_squared', polarization='hh', resolution=80, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 841, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 713, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 698, in _load_dataset
    raise KeyError(
KeyError: "Could not load DataID(name='gamma_squared', polarization='hh', resolution=80, modifiers=()) from any provided files"
Could not load dataset 'DataID(name='measurement', polarization='hv', resolution=80, calibration=<calibration.gamma>, quantity=<quantity.dB>, modifiers=())': "Could not load DataID(name='measurement', polarization='hv', resolution=80, calibration=<calibration.gamma>, quantity=<quantity.dB>, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 841, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 713, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/home/oarcher/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py", line 698, in _load_dataset
    raise KeyError(
KeyError: "Could not load DataID(name='measurement', polarization='hv', resolution=80, calibration=<calibration.gamma>, quantity=<quantity.dB>, modifiers=()) from any provided files"
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-7-a6a0d4796f79> in <module>
----> 1 scn.load(scn.available_dataset_ids())

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/scene.py in load(self, wishlist, calibration, resolution, polarization, level, generate, unload, **kwargs)
   1152         self._wishlist |= needed_datasets
   1153 
-> 1154         self._read_datasets_from_storage(**kwargs)
   1155         self.generate_possible_composites(generate, unload)
   1156 

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/scene.py in _read_datasets_from_storage(self, **kwargs)
   1172         """
   1173         nodes = self._dependency_tree.leaves(limit_nodes_to=self.missing_datasets)
-> 1174         return self._read_dataset_nodes_from_storage(nodes, **kwargs)
   1175 
   1176     def _read_dataset_nodes_from_storage(self, reader_nodes, **kwargs):

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/scene.py in _read_dataset_nodes_from_storage(self, reader_nodes, **kwargs)
   1178         # Sort requested datasets by reader
   1179         reader_datasets = self._sort_dataset_nodes_by_reader(reader_nodes)
-> 1180         loaded_datasets = self._load_datasets_by_readers(reader_datasets, **kwargs)
   1181         self._datasets.update(loaded_datasets)
   1182         return loaded_datasets

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/scene.py in _load_datasets_by_readers(self, reader_datasets, **kwargs)
   1203         for reader_name, ds_ids in reader_datasets.items():
   1204             reader_instance = self._readers[reader_name]
-> 1205             new_datasets = reader_instance.load(ds_ids, **kwargs)
   1206             loaded_datasets.update(new_datasets)
   1207         return loaded_datasets

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py in load(self, dataset_keys, previous_datasets, **kwargs)
    943             coords = [all_datasets.get(cid, None)
    944                       for cid in coordinates.get(dsid, [])]
--> 945             ds = self._load_dataset_with_area(dsid, coords, **kwargs)
    946             if ds is not None:
    947                 all_datasets[dsid] = ds

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py in _load_dataset_with_area(self, dsid, coords, **kwargs)
    839 
    840         try:
--> 841             ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
    842         except (KeyError, ValueError) as err:
    843             logger.exception("Could not load dataset '%s': %s", dsid, str(err))

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py in _load_dataset_data(self, file_handlers, dsid, **kwargs)
    711     def _load_dataset_data(self, file_handlers, dsid, **kwargs):
    712         ds_info = self.all_ids[dsid]
--> 713         proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
    714         # FIXME: areas could be concatenated here
    715         # Update the metadata

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/yaml_reader.py in _load_dataset(dsid, ds_info, file_handlers, dim, **kwargs)
    687         for fh in file_handlers:
    688             try:
--> 689                 projectable = fh.get_dataset(dsid, ds_info)
    690                 if projectable is not None:
    691                     slice_list.append(projectable)

~/anaconda3/envs/xsar/lib/python3.9/site-packages/satpy/readers/sar_c_safe.py in get_dataset(self, key, info)
    131             if not data_items:
    132                 continue
--> 133             data, low_res_coords = self.read_xml_array(data_items, xml_tag)
    134 
    135         if key['name'].endswith('squared'):

AttributeError: 'SAFEXML' object has no attribute 'read_xml_array'

On-the-fly decompression of hrit files not working

Hi everyone,

I'm trying to use the on-the-fly decompression of HRIT files with satpy 0.23.0, Python 3.8 and Ubuntu 18.04 LTS.

I built the xRITDecompress utility and set the XRIT_DECOMPRESS_PATH environment variable to its absolute path, but when I run the following code:

satpy.find_files_and_readers(base_dir='../raw_data/hrit',
                             start_time=datetime(2020, 10, 6, 12, 15),
                             end_time=datetime(2020, 10, 6, 12, 15),
                             reader='seviri_l1b_hrit')

I get the error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/venv/lib/python3.8/site-packages/satpy/readers/__init__.py", line 430, in find_files_and_readers
    raise ValueError("No supported files found")
ValueError: No supported files found

HRIT folder listing:
H-000-MSG4__-MSG4________-WV_073___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000014___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-IR_120___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000006___-202010061215-C_
H-000-MSG4__-MSG4________-IR_120___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000010___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000024___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000019___-202010061215-C_
H-000-MSG4__-MSG4________-WV_073___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-IR_039___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_039___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_120___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000017___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-IR_120___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_120___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-WV_073___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000001___-202010061215-C_
H-000-MSG4__-MSG4________-WV_073___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000007___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-WV_073___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-IR_039___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000018___-202010061215-C_
H-000-MSG4__-MSG4________-IR_039___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000013___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000012___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-WV_073___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000016___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000005___-202010061215-C_
H-000-MSG4__-MSG4________-IR_039___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000002___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000002___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000004___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000009___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000011___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-WV_062___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000023___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000021___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000015___-202010061215-C_
H-000-MSG4__-MSG4________-IR_108___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-IR_039___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-_________-PRO______-202010061215-__
H-000-MSG4__-MSG4________-IR_039___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000022___-202010061215-C_
H-000-MSG4__-MSG4________-_________-EPI______-202010061215-__
H-000-MSG4__-MSG4________-IR_120___-000006___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000020___-202010061215-C_
H-000-MSG4__-MSG4________-IR_087___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-IR_097___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-IR_120___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-IR_134___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-IR_039___-000003___-202010061215-C_
H-000-MSG4__-MSG4________-VIS006___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-IR_120___-000007___-202010061215-C_
H-000-MSG4__-MSG4________-VIS008___-000001___-202010061215-C_
H-000-MSG4__-MSG4________-HRV______-000003___-202010061215-C_
H-000-MSG4__-MSG4________-IR_016___-000008___-202010061215-C_
H-000-MSG4__-MSG4________-WV_073___-000005___-202010061215-C_
H-000-MSG4__-MSG4________-WV_073___-000004___-202010061215-C_

Thank you in advance.
Simone

Refactor binder environment using nbgitpuller

See https://discourse.jupyter.org/t/tip-embed-custom-github-content-in-a-binder-link-with-nbgitpuller/922

Basically the pytroll-examples would become the "content" repository and we'd have a separate one that acts as the "environment" repository. The README in pytroll-examples would only need to point to the right URL. The nice thing about this is that any new tutorials or changes to this repository won't require rebuilding the binderhub image; only changes to the environment repository will do that.
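For reference, the link pattern from that post: the Binder URL targets the environment repository and passes the content repository through nbgitpuller's git-pull URL parameter. A sketch of what such a link could look like (the environment repository name here is hypothetical, and the repo value needs to be URL-encoded in practice):

```
https://mybinder.org/v2/gh/pytroll/pytroll-binder-env/main?urlpath=git-pull?repo=https://github.com/pytroll/pytroll-examples
```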

Feature request: add example of scatter plot to Cartopy Plot.ipynb

After some time I worked out how to create a scatter plot on top of a satpy/cartopy plot.

I know I'm not the first person who has come across this, as it has been brought up before (https://pytroll.slack.com/archives/C06GJFRN0/p1588754633277200).

The cartopy plot notebook https://github.com/pytroll/pytroll-examples/blob/master/satpy/Cartopy%20Plot.ipynb could have an extra cell demonstrating the use of crs and ccrs.PlateCarree()

For example here's a cartopy plot with a scatter plot using the hurricane Florence example:

from satpy import Scene, demo
import matplotlib.pyplot as plt
import cartopy.crs as ccrs

filenames = demo.get_hurricane_florence_abi(num_frames=1)
scn = Scene(reader='abi_l1b', filenames=filenames)
scn.load(['C01'])

new_scn = scn.resample(scn['C01'].attrs['area'])
crs = new_scn['C01'].attrs['area'].to_cartopy_crs()

ax = plt.axes(projection=crs)

ax.coastlines()
ax.gridlines(draw_labels=True)
ax.set_global()

plt.imshow(new_scn['C01'], transform=crs, extent=crs.bounds, origin='upper')

ax.scatter(-65, 27, marker='o', transform=ccrs.PlateCarree(), color='red', s=50)

plt.show()


Can't clone repository due to GitHub LFS limits

Clicking on http://binder.pangeo.io/v2/gh/pytroll/pytroll-examples/master

gives

Waiting for build to start...
Picked Git content provider.
Cloning into '/tmp/repo2dockeriatlrs7h'...
Downloading pyspectral/meteosat09_20150420_1000_snow_rgb.png (501 KB)
Error downloading object: pyspectral/meteosat09_20150420_1000_snow_rgb.png (df478b0): Smudge error: Error downloading pyspectral/meteosat09_20150420_1000_snow_rgb.png (df478b0d5729763467b306672d2dc178dafe6bf8a81a7182cfaa629f2e09e179): batch response: Git LFS is disabled for this repository.

Errors logged to /tmp/repo2dockeriatlrs7h/.git/lfs/logs/20200809T011053.553152168.log
Use `git lfs logs last` to view the log.
error: external filter 'git-lfs filter-process' failed
fatal: pyspectral/meteosat09_20150420_1000_snow_rgb.png: smudge filter lfs failed
Error during build: Command '['git', 'reset', '--hard', '832949eadf74e5b392f9c3197c492a87b8205155']' returned non-zero exit status 128.

It looks like something to do with Git LFS.
Not sure if this would help: https://filesystem-spec.readthedocs.io/en/latest/api.html#id0
