wfp-vam / modape

MODIS Assimilation and Processing Engine

Home Page: https://wfp-vam.github.io/modape/

License: MIT License

Topics: remote-sensing, modis, whittaker, smoothing, cloudfree, filter, satellite, modis-data, modis-land-products, time-series

modape's Introduction

MODAPE


The MODIS Assimilation and Processing Engine combines a state-of-the-art Whittaker smoother, implemented as a fast C extension through Cython and including V-curve optimization of the smoothing parameter, with an HDF5-based processing chain optimized for MODIS data.

The sub-module modape.whittaker includes the following variations of the Whittaker smoother with 2nd-order differences (a usage sketch follows the list):

  • ws2d: Whittaker with fixed smoothing parameter (s)
  • ws2dp: Whittaker with fixed smoothing parameter (s) and expectile smoothing using asymmetric weights
  • ws2doptv: Whittaker with V-curve optimization of the smoothing parameter
  • ws2doptvp: Whittaker with V-curve optimization of the smoothing parameter and expectile smoothing using asymmetric weights
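As a quick illustration, here is a minimal smoothing sketch. The exact signature is an assumption based on the description above (data vector, smoothing parameter s, weight vector), not a documented interface:

import numpy as np
from modape.whittaker import ws2d

y = np.random.rand(365)   # raw time series
w = np.ones_like(y)       # equal weights for all observations
z = ws2d(y, 10.0, w)      # assumed signature: ws2d(y, s, w) -> smoothed series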

The MODIS processing chain consists of the following executables, which can be called from the command line (a sketch of the full chain follows the list):

  • modis_download: Query and download raw MODIS products (requires Earthdata credentials)
  • modis_collect: Collect raw MODIS data into daily datacubes stored in an HDF5 file
  • modis_smooth: Smooth, gapfill and interpolate raw MODIS data using the implemented Whittaker smoother
  • modis_window: Extract mosaic(s) of multiple MODIS tiles, or subset(s) of a global/tiled MODIS product, and export them as GeoTIFF rasters in the WGS 1984 coordinate system
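To illustrate how the chain fits together, here is a hedged end-to-end sketch; the '...' are placeholders for the actual options, which are covered in the documentation:

$ modis_download MOD13A2 ...   # query & download raw MODIS HDF files (Earthdata credentials)
$ modis_collect ...            # collect the raw data into an HDF5 datacube
$ modis_smooth ...             # gapfill, interpolate & smooth with the Whittaker smoother
$ modis_window ...             # export mosaics/subsets as GeoTIFF in WGS 1984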

Additional executables:

  • csv_smooth: Smooth time series stored in a CSV file
  • modis_info: Retrieve metadata from created HDF5 files

For more information, please check out the documentation!

Installation

Dependencies:

modape depends on these packages:

  • click
  • gdal
  • h5py
  • numpy
  • pandas
  • python-cmr
  • requests

Some of these packages (e.g. GDAL) can be difficult to build, especially on Windows machines. In that case, it's advisable to download an unofficial binary wheel from Christoph Gohlke's Unofficial Windows Binaries for Python Extension Packages and install it locally with pip install before installing modape.

Installation from GitHub:

$ git clone https://github.com/WFP-VAM/modape
$ cd modape
$ pip3 install .

Installation from PyPI:

$ pip3 install modape

Bugs, typos & feature requests

If you find a bug, spot a typo, run into trouble using the module, or simply want a feature added, please submit an issue!


References:

P. H. C. Eilers, V. Pesendorfer and R. Bonifacio, "Automatic smoothing of remote sensing data," 2017 9th International Workshop on the Analysis of Multitemporal Remote Sensing Images (MultiTemp), Brugge, 2017, pp. 1-3. doi: 10.1109/Multi-Temp.2017.8076705 URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8076705&isnumber=8035194

The core Whittaker function was adapted from the whit2 function of the R package ptw:

Bloemberg, T. G. et al. (2010) "Improved Parametric Time Warping for Proteomics", Chemometrics and Intelligent Laboratory Systems, 104 (1), 65-74

Wehrens, R. et al. (2015) "Fast parametric warping of peak lists", Bioinformatics, in press.


modape's People

Contributors

interob, valpesendorfer


modape's Issues

Handling of URL-list for aria2 download

Functionality to add:

  • make the download directory, or catch the file-not-found error, for MODISfiles.txt
  • generate a process-specific name for the URL list so that multiple processes can write to the same directory

h5py memory leak

As previously noted in #12 there seems to be a memory leak with big arrays in chunked datasets. Unfortunately PR #15 only fixed the issues with numpy, but not h5py.

Here's more info on the h5py leak: h5py/h5py#382

This bug appears when processing longer time series of global files (resulting in bigger arrays) with processMODIS on a 32-bit Python installation.

Processing on a 64-bit machine works without a problem.

Possible solutions:

  • look into changing from chunked to contiguous storage (see the sketch after this list)
  • make the chunk size more flexible
  • add as known issue & wontfix
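For reference, a minimal h5py sketch contrasting the two storage layouts; the dataset names and shapes are made up for illustration:

import h5py

with h5py.File('example.h5', 'w') as h5f:
    # chunked layout: data is stored and cached in fixed-size blocks
    h5f.create_dataset('chunked', shape=(57600, 100), dtype='int16', chunks=(4800, 10))
    # contiguous layout: one flat block on disk, no chunk cache involved
    h5f.create_dataset('contiguous', shape=(57600, 100), dtype='int16')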

Index exception in windowMODIS

Traceback:

Processing file T:/VALENTIN/mosaictest/rgnvim2010j177.tif
Traceback (most recent call last):
  File "T:\VALENTIN\wsmtk_env\Scripts\windowMODIS-script.py", line 11, in <module>
    load_entry_point('wsmtk', 'console_scripts', 'windowMODIS')()
  File "t:\valentin\wsmtk\wsmtk\windowMODIS.py", line 69, in main
    with mosaic.getRaster(args.dataset,ix) as mosaic_ropen:
  File "C:\Python27\ArcGIS10.4\Lib\contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "t:\valentin\wsmtk\wsmtk\modis.py", line 327, in getRaster
    array = self.getArray(dataset,ix)
  File "t:\valentin\wsmtk\wsmtk\modis.py", line 318, in getArray
    arr = h5f_o.get(dataset)[...,ix]
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "t:\valentin\wsmtk_env\lib\site-packages\h5py\_hl\dataset.py", line 476, in __getitem__
    selection = sel.select(self.shape, args, dsid=self.id)
  File "t:\valentin\wsmtk_env\lib\site-packages\h5py\_hl\selections.py", line 94, in select
    sel[args]
  File "t:\valentin\wsmtk_env\lib\site-packages\h5py\_hl\selections.py", line 261, in __getitem__
    start, count, step, scalar = _handle_simple(self.shape,args)
  File "t:\valentin\wsmtk_env\lib\site-packages\h5py\_hl\selections.py", line 451, in _handle_simple
    x,y,z = _translate_int(int(arg), length)
  File "t:\valentin\wsmtk_env\lib\site-packages\h5py\_hl\selections.py", line 471, in _translate_int
    raise ValueError("Index (%s) out of range (0-%s)" % (exp, length-1))
ValueError: Index (24) out of range (0-23)

Error updating raw H5

In the process of ingesting NDVI one year after another, I got this error during processing of the second year:

Error updating ./VIM/MXD13A2.h19v08.006.VIM.h5! File may be corrupt, consider creating the file from scratch, or closer investigation.

Error message:

Traceback (most recent call last):
  File "C:\OSGEO4~1\apps\Python37\lib\site-packages\modape\modis.py", line 472, in update
    arr[..., b1:b1+self.chunks[1]] = dset[b:b+self.chunks[0], b1:b1+self.chunks[1]]
ValueError: could not broadcast input array from shape (48,10) into shape (57600,10)

Error processing product ['MXD13A2'], product code None.

Traceback:

Traceback (most recent call last):
  File "D:\ARC\Documents\2019\VAM NDVI\africa_1km\scripts\modis_collect.py", line 36, in run_process
    rh5.update()
  File "C:\OSGEO4~1\apps\Python37\lib\site-packages\modape\modis.py", line 472, in update
    arr[..., b1:b1+self.chunks[1]] = dset[b:b+self.chunks[0], b1:b1+self.chunks[1]]
ValueError: could not broadcast input array from shape (48,10) into shape (57600,10)

vp bug in ws2doptvp

Running modis_smooth with --optvp and --nworkers > 1 resulted in an error with the following traceback:

Running whittaker smoother asymmetric V-curve optimization ...

multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "/usr/local/lib/python3.6/dist-packages/modape-0.4.0-py3.6-linux-x86_64.egg/modape/utils.py", line 608, in execute_ws2d_vc
    vprec=vp)
UnboundLocalError: local variable 'vp' referenced before assignment
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/bin/modis_smooth", line 11, in <module>
    load_entry_point('modape==0.4.0', 'console_scripts', 'modis_smooth')()
  File "/usr/local/lib/python3.6/dist-packages/modape-0.4.0-py3.6-linux-x86_64.egg/modape/scripts/modis_smooth.py", line 320, in main
    smt_h5.ws2d_vc(args.srange, args.pvalue, constrain=args.constrain)
  File "/usr/local/lib/python3.6/dist-packages/modape-0.4.0-py3.6-linux-x86_64.egg/modape/modis/smooth.py", line 761, in ws2d_vc
    _ = pool.map(execute_ws2d_vc, map_index)
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 266, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 644, in get
    raise self._value
UnboundLocalError: local variable 'vp' referenced before assignment
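The UnboundLocalError is the classic conditional-assignment pitfall; a minimal sketch of the pattern (not the actual modape code):

def execute(p=None):
    if p is not None:
        vp = float(p)   # 'vp' only exists when this branch runs
    return vp           # UnboundLocalError when p is None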

dates & leap years

For temporal interpolation to 10-day and 5-day data, dates are kept constant, which results in a changing DOY in leap years.

A modification is needed to enable the VAM tools to work with dekadal and pentadal data. The example below illustrates the shift.
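A quick illustration of the DOY shift, using only the standard library:

from datetime import date

# the same calendar date maps to a different day-of-year in a leap year
print(date(2019, 3, 1).timetuple().tm_yday)   # 60
print(date(2020, 3, 1).timetuple().tm_yday)   # 61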

dates in modis_window

The handling of dates in modis_window is a bit counter-intuitive and can lead to data not being extracted.

The following changes will be made:

  • Supplying dates to modis_window becomes optional. If they are not supplied, the minimum temporal extent, the maximum temporal extent, or both are used.
  • Dates can be supplied in either YYYYMM or YYYY-MM-DD format. If the end date is specified as YYYYMM, the last day of MM is used (a sketch follows this list).
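A hedged sketch of the proposed end-date handling; the helper name is hypothetical:

from calendar import monthrange
from datetime import date

def parse_end_date(value):                 # hypothetical helper
    if len(value) == 6:                    # YYYYMM -> last day of that month
        year, month = int(value[:4]), int(value[4:])
        return date(year, month, monthrange(year, month)[1])
    return date.fromisoformat(value)       # YYYY-MM-DD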

Memory issues for chunks in global files

Traceback:

Traceback (most recent call last):
  File "C:\Python27\ArcGIS10.4\Lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\ArcGIS10.4\Lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "T:\VALENTIN\py27\wsmtk\Scripts\processMODIS.exe\__main__.py", line 9, in <module>
  File "t:\valentin\py27\wsmtk\lib\site-packages\wsmtk\processMODIS.py", line 82, in main
    h5.update()
  File "t:\valentin\py27\wsmtk\lib\site-packages\wsmtk\modis.py", line 263, in update
    dset[blk[0]:(blk[0]+self.chunks[0]),:,uix:(uix+arr.shape[2])] = arr[...]
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "t:\valentin\py27\wsmtk\lib\site-packages\h5py\_hl\dataset.py", line 632, in __setitem__
    self.id.write(mspace, fspace, val, mtype, dxpl=self._dxpl)
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py\h5d.pyx", line 221, in h5py.h5d.DatasetID.write
  File "h5py\_proxy.pyx", line 132, in h5py._proxy.dset_rw
  File "h5py\_proxy.pyx", line 93, in h5py._proxy.H5PY_H5Dwrite
IOError: Can't prepare for writing data (memory allocation failed for raw data chunk)

set minimum threshold of observations

A minimum threshold of observations for smoothing should be introduced. Currently the threshold is 0, i.e. smoothing is executed if there is at least one observation.

It should be a percentage, as sketched below.
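A hedged sketch of such a check; the helper name and default are hypothetical:

import numpy as np

def enough_observations(weights, min_frac=0.5):    # hypothetical helper
    # smooth only if at least min_frac of the timesteps carry an observation
    return np.count_nonzero(weights) / weights.size >= min_frac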

Fail-check raster dimensions

The test code below results in a TypeError due to a dimension mismatch - check whether the MODIS product was unique.

Testcode

from wsmtk.modis import MODIShdf5
import glob

# collect the first 10 raw HDF files from the test directory
fls = sorted(glob.glob('/data/modistest/rawdata/*.hdf'))
fls_sub = fls[0:10]
print(fls_sub)

# build the HDF5 container and ingest the files
h5 = MODIShdf5(fls_sub)
print(h5)
h5.create()
h5.update()

Error Message

Processing |===                             | 10%
Traceback (most recent call last):
  File "test.py", line 19, in <module>
    h5.update()
  File "/opt/conda/lib/python3.6/site-packages/wsmtk/modis.py", line 214, in update
    dset[...,uix+fix] = arr[...]
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/opt/conda/lib/python3.6/site-packages/h5py/_hl/dataset.py", line 631, in __setitem__
    for fspace in selection.broadcast(mshape):
  File "/opt/conda/lib/python3.6/site-packages/h5py/_hl/selections.py", line 299, in broadcast
    raise TypeError("Can't broadcast %s -> %s" % (target_shape, count))
TypeError: Can't broadcast (1120, 1138) -> (1120, 1137, 1)

MOD17A2HGF missing

Thought I'd mention it here: MOD17A2HGF cannot be used; it results in "Product MOD17A2HGF not recognized!"

Cheers

self.files in ModisQuery

The instance variable self.files is defined but not used in ModisQuery.

In addition, the variable could be filled with incorrect paths.

Will be fixed in a future release.

--nworkers & --parallel-tiles

Received the following traceback:

Traceback (most recent call last):
  File "/opt/conda/bin/modis_smooth", line 11, in <module>
    load_entry_point('modape', 'console_scripts', 'modis_smooth')()
  File "/home/jovyan/modape/modape/scripts/modis_smooth.py", line 243, in main
    pool = Pool(processes=args.parallel_tiles, initializer=initfun, initargs=(processing_dict,))
  File "/opt/conda/lib/python3.6/multiprocessing/pool.py", line 175, in __init__
    self._repopulate_pool()
  File "/opt/conda/lib/python3.6/multiprocessing/pool.py", line 236, in _repopulate_pool
    self._wrap_exception)
  File "/opt/conda/lib/python3.6/multiprocessing/pool.py", line 250, in _repopulate_pool_static
    wrap_exception)
  File "/opt/conda/lib/python3.6/multiprocessing/process.py", line 73, in __init__
    assert group is None, 'group argument must be None for now'

Inconsistent smoothing

When using the optimized s-value from the grid, the basic Whittaker smoothing is performed, which differs from the optimization run in that it does not perform the iterative fitting of the weights.

This leads to an inconsistent time series when updating is performed in this fashion.

Rework MXD interleaving

Currently, MOD/MYD interleaving to MXD depends on the raw HDF data being available for processing, which obviously won't work in an operational update scenario.

--> needs to be decoupled

suspected bug in modis_window

Hi,

There seems to be something wrong with modis_window: I obtain (slightly) different results for the same data set depending on whether the -roi flag is set. It looks like some sort of projection issue; there are (presumably) regularly spaced horizontal bands of shifted pixels and some 'weird warped mesh' shift pattern on top (see screenshot; the 'difference' image is scaled to emphasize the issue, the real differences are obviously quite small). I tried to figure out which one is 'more correct', but didn't succeed...

(screenshot: modis_wind_bug)

Dockerfile for v0.3 fails

Dockerfile for version 0.3 fails on build with a GDAL issue; the missing symbols below suggest that the GDAL 2.4.4 Python bindings are being compiled against newer GDAL 3 headers, from which these functions were removed:

Running GDAL-2.4.4/setup.py -q bdist_egg --dist-dir /tmp/easy_install-k7r9uqf5/GDAL-2.4.4/egg-dist-tmp-2unzqx7n
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
extensions/osr_wrap.cpp: In function ‘PyObject* py_OPTGetProjectionMethods(PyObject*, PyObject*)’:
extensions/osr_wrap.cpp:3417:20: error: ‘OPTGetProjectionMethods’ was not declared in this scope
     papszMethods = OPTGetProjectionMethods();
                    ^~~~~~~~~~~~~~~~~~~~~~~
extensions/osr_wrap.cpp:3417:20: note: suggested alternative: ‘py_OPTGetProjectionMethods’
     papszMethods = OPTGetProjectionMethods();
                    ^~~~~~~~~~~~~~~~~~~~~~~
                    py_OPTGetProjectionMethods
extensions/osr_wrap.cpp:3427:20: error: ‘OPTGetParameterList’ was not declared in this scope
     papszParameters = OPTGetParameterList( papszMethods[iMethod],
                       ^~~~~~~~~~~~~~~~~~~
extensions/osr_wrap.cpp:3427:20: note: suggested alternative: ‘papszParameters’
     papszParameters = OPTGetParameterList( papszMethods[iMethod],
                       ^~~~~~~~~~~~~~~~~~~
                       papszParameters
extensions/osr_wrap.cpp:3442:6: error: ‘OPTGetParameterInfo’ was not declared in this scope
     OPTGetParameterInfo( papszMethods[iMethod],
     ^~~~~~~~~~~~~~~~~~~
extensions/osr_wrap.cpp: In function ‘OGRErr OSRSpatialReferenceShadow_StripCTParms(OSRSpatialReferenceShadow*)’:
extensions/osr_wrap.cpp:4212:12: error: ‘OSRStripCTParms’ was not declared in this scope
     return OSRStripCTParms(self);
            ^~~~~~~~~~~~~~~
extensions/osr_wrap.cpp:4212:12: note: suggested alternative: ‘OSRSetProjParm’
     return OSRStripCTParms(self);
            ^~~~~~~~~~~~~~~
            OSRSetProjParm
extensions/osr_wrap.cpp: In function ‘OGRErr OSRSpatialReferenceShadow_FixupOrdering(OSRSpatialReferenceShadow*)’:
extensions/osr_wrap.cpp:4215:12: error: ‘OSRFixupOrdering’ was not declared in this scope
     return OSRFixupOrdering(self);
            ^~~~~~~~~~~~~~~~
extensions/osr_wrap.cpp: In function ‘OGRErr OSRSpatialReferenceShadow_Fixup(OSRSpatialReferenceShadow*)’:
extensions/osr_wrap.cpp:4218:12: error: ‘OSRFixup’ was not declared in this scope
     return OSRFixup(self);
            ^~~~~~~~
extensions/osr_wrap.cpp:4218:12: note: suggested alternative: ‘OGRField’
     return OSRFixup(self);
            ^~~~~~~~
            OGRField
error: Setup script exited with error: command 'gcc' failed with exit status 1

float in np.linspace

When specifying an srange for smoothing, np.linspace now raises an exception when handed a float num argument:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/modape-0.4.0-py3.6-linux-x86_64.egg/modape/scripts/modis_smooth.py", line 181, in main
    abs((float(args.srange[0])-float(args.srange[1])))/float(args.srange[2]) + 1.0)
  File "<__array_function__ internals>", line 6, in linspace
  File "/usr/local/lib/python3.6/dist-packages/numpy-1.18.1-py3.6-linux-x86_64.egg/numpy/core/function_base.py", line 121, in linspace
    .format(type(num)))
TypeError: object of type <class 'float'> cannot be safely interpreted as an integer.

Fix: change to np.arange? (see the sketch below)
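A hedged sketch of both ways out; the variable names are illustrative:

import numpy as np

smin, smax, sstep = 0.0, 4.0, 0.2
num = int(abs(smin - smax) / sstep + 1.0)          # np.linspace requires an integer num
srange = np.linspace(smin, smax, num)              # cast fixes the TypeError
srange_alt = np.arange(smin, smax + sstep, sstep)  # alternative suggested above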

modis_download should require the target folder to be empty

Add an optional parameter to require that the download target folder is empty of .hdf files. This little tweak would play nicely with modis_collect cleaning up the ingested .hdf files.

Weird errors for MOD13C1 global

Creating file: T:/VALENTIN/MOD_PROCESSED//VIM/MOD13C1.006_VIM.h5 ... done.

Processing MODIS files ...

Processing |                                | 0%
ERROR 4: `T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf' not recognized as a supported file format.
Error reading T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf ... using empty array.
Processing |=                               | 3%
ERROR 4: `T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf' not recognized as a supported file format.
Error reading T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf ... using empty array.
Processing |==                              | 6%
ERROR 4: `T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf' not recognized as a supported file format.
Error reading T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf ... using empty array.
Processing |===                             | 10%
ERROR 4: `T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf' not recognized as a supported file format.
Error reading T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf ... using empty array.
Processing |====                            | 13%
ERROR 4: `T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf' not recognized as a supported file format.
Error reading T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf ... using empty array.
Processing |=====                           | 16%
ERROR 4: `T:/VALENTIN/MODISDATA\MOD13C1.A2012129.006.2015246141216.hdf' not recognized as a supported file format.

Request: Add optional parameters for rounding and capping

Request: introduce parameters for rounding and capping upon grid export (modis/window.py), explained in more detail below:

Rounding: as the value array is retrieved from HDF(s) here...

value_array = self.get_array(dataset, ix, self.dt_gdal[1])

... the next step would be to round the values to a given power of 10 (e.g. 2 => 10^2 = 100; 10466 would be rounded to 10500)

Capping: all values in the retrieved array that are below the provided minimum or above the provided maximum are set to nodata (both operations are sketched below).
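A hedged sketch of both operations; the helper name, argument names, and nodata default are hypothetical:

import numpy as np

def round_and_cap(arr, exponent=2, vmin=None, vmax=None, nodata=-3000):   # hypothetical helper
    base = 10 ** exponent
    out = np.round(arr / base) * base      # e.g. exponent=2: 10466 -> 10500
    if vmin is not None:
        out[out < vmin] = nodata           # cap below the minimum
    if vmax is not None:
        out[out > vmax] = nodata           # cap above the maximum
    return out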

Smoothing fails for the 5th new time step after initialisation

For testing purposes, I have a downloaded + collected + smoothed mini archive covering only December 2018. From there, I am producing new time steps from January 1st, 2019 onwards, one by one, forcing an nsmooth/nupdate of 64/6. All works fine until the 5th time step (DOY 33) is downloaded and ingested; applying the Whittaker filter makes MODAPE crash:
[2019-06-26 00:50:39]: Starting smoothMODIS.py ...

Running whittaker smoother with s value from grid ...

Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\pydevd.py", line 1741, in <module>
    main()
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\pydevd.py", line 1735, in main
    globals = debugger.run(setup['file'], None, None, is_module)
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\pydevd.py", line 1135, in run
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "D:/ARC/Documents/2019/VAM NDVI/africa_1km/scripts/africa_1km.py", line 110, in <module>
    'nsmooth': 64, 'nupdate': 6 })
  File "D:\ARC\Documents\2019\VAM NDVI\africa_1km\scripts\modis_smooth.py", line 403, in modis_smooth
    smt_h5.ws2d_sgrid()
  File "C:\OSGEO4~1\apps\Python37\lib\site-packages\modape\modis.py", line 949, in ws2d_sgrid
    smt_ds[br:br+rawchunks[0], bco:bco+rawchunks[1]] = arr_smooth[:, bc:bc+rawchunks[1]]
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "C:\OSGEO4~1\apps\Python37\lib\site-packages\h5py\_hl\dataset.py", line 707, in __setitem__
    for fspace in selection.broadcast(mshape):
  File "C:\OSGEO4~1\apps\Python37\lib\site-packages\h5py\_hl\selections.py", line 299, in broadcast
    raise TypeError("Can't broadcast %s -> %s" % (target_shape, self.mshape))
TypeError: Can't broadcast (57600, 6) -> (57600, 5)

In order to reproduce the error, please unzip the files and install the modified modis.py
Initialization: python3 scripts/africa_1km_init.py -b 2018-12-01 -e 2018-12-31 .
Production: python3 scripts/africa_1km.py -c 2018-12-31 .
africa_1km.zip

duplicates in collect

Recently, NASA has been publishing duplicate versions of some .hdf files.

If duplicates are fed into the modis_collect step, it raises an AssertionError as planned - but also an UnboundLocalError, which is not intended.

This needs to be investigated further and fixed so that these situations are handled in a better way (a possible pre-filter is sketched after the traceback).

Traceback:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/modape-0.2.1-py3.6-linux-x86_64.egg/modape/scripts/modis_collect.py", line 32, in run_process
    interleave=pdict['interleave'])
  File "/opt/conda/lib/python3.6/site-packages/modape-0.2.1-py3.6-linux-x86_64.egg/modape/modis.py", line 283, in __init__
    assert len(set(self.rawdates)) == self.nfiles, "Number of files not equal to number of derived dates - are there duplicate HDF files?"
AssertionError: Number of files not equal to number of derived dates - are there duplicate HDF files?

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/bin/modis_collect", line 11, in <module>
    load_entry_point('modape==0.2.1', 'console_scripts', 'modis_collect')()
  File "/opt/conda/lib/python3.6/site-packages/modape-0.2.1-py3.6-linux-x86_64.egg/modape/scripts/modis_collect.py", line 136, in main
    run_process(processing_dict[group])
  File "/opt/conda/lib/python3.6/site-packages/modape-0.2.1-py3.6-linux-x86_64.egg/modape/scripts/modis_collect.py", line 38, in run_process
    print('\nError processing product {}, product code {}. \n\n Traceback:\n'.format(rh5.product, vam_product_code))
UnboundLocalError: local variable 'rh5' referenced before assignment
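A hedged sketch of a duplicate pre-filter that keeps the newest processing timestamp per (product, acquisition date, collection); the helper name is hypothetical and the filename layout follows standard MODIS naming (e.g. MOD13C1.A2012129.006.2015246141216.hdf):

from pathlib import Path

def drop_duplicates(files):                            # hypothetical helper
    best = {}
    for f in files:
        product, adate, collection, stamp = Path(f).name.split('.')[:4]
        key = (product, adate, collection)
        if key not in best or stamp > best[key][0]:
            best[key] = (stamp, f)                     # keep the newest duplicate
    return sorted(f for _, f in best.values())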

Require optv or optvp for initial run of modis_smooth

Using modape in the scenario of (a) initial setup and (b) incremental updating (download, collect), without ever explicitly specifying an optimisation method, gives strange results. That is: when updating the dataset, NDVI values seem to receive a negative shift, resulting in smaller, and even negative, values than expected.

Maybe users could be warned when the s-grid is not initialised AND no fixed s (or p?) value is provided (because modis_smooth then does try to go with the s-grid -- which is not initialised).

modis_collect clean up ingested .hdf files

Once modis_collect has successfully ingested an .hdf file, optionally have this CLI tool clean up the downloaded file. This feature would play nicely with an optional requirement for modis_download that the target folder be empty.

It would be best if modis_collect left a trace of the deleted file, e.g. an (empty) .ingested file with otherwise the same name, as sketched below.
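A hedged sketch of the marker-file idea; the helper name is hypothetical:

from pathlib import Path

def mark_ingested(hdf_path):               # hypothetical helper
    p = Path(hdf_path)
    p.with_suffix('.ingested').touch()     # leave an empty marker with the same stem
    p.unlink()                             # remove the ingested .hdf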

Proper progress information

After multiple changes in the architecture, the progress information that is output to the console (e.g. the progress bar) doesn't give much valuable info, if any.

Needs to be fixed or removed.
