nasa / hypercp
License: Other
To bring HyperCP into compliance with the agreed-upon guidelines, the language of the following files needs to be updated or added to the frm4soc2 branch:
README.md
HyperCP-Collaboration_Guidelines.md
Changelog.md
LICENSE.txt
Additional language for the README may be needed from NPL and ACRI for updates to instructions regarding end-to-end uncertainties and TriOS implementation.
Interpreter modules are required to ingest raw TriOS sensor data at RAW and propagate it through Level-2.
Error when running Zhang instead of Mobley.
Happens at L2 processing:
File "/mount/internal/work-st/projects/tartu-1281/FRM4SOCv2/workspace/aderu/nasa/HyperInSPACE/Source/ProcessL2.py", line 1616, in ensemblesReflectance
rhoVector, rhoUNC = RhoCorrections.ZhangCorr(WINDSPEEDXSlice,AODXSlice, CloudXSlice, SZAXSlice, SSTXSlice,
File "/mount/internal/work-st/projects/tartu-1281/FRM4SOCv2/workspace/aderu/nasa/HyperInSPACE/Source/RhoCorrections.py", line 170, in ZhangCorr
rhoDelta = Propagate.zhangWrapper(mean_vals=varlist, uncertainties=ulist,
TypeError: Propagate.zhangWrapper() got an unexpected keyword argument 'mean_vals'
Windows OS only. h5py fails to import
See description here: https://stackoverflow.com/questions/68296054/importerror-dll-load-failed-while-importing-defs-the-specified-procedure-could
I tried the suggested solution, conda install h5py=3.6.0 -c pkgs/main, within the hypercp environment. I can at least now activate hypercp, launch python, and import h5py, but when HyperCP's HDFRoot.py attempts the import (also within the hypercp environment), I still get the error.
Speedup computation of rho with Zhang et al. 2017
+ Improved performance of get_prob by 1.5x
+ Added cache:
+ Keep LUT from Zhang et al. 2017 loaded in memory
+ Rounded environmental parameters and added lru_cache on get_prob
+ Replaced xarray's interpn with the SciPy function for increased speed
+ Added option to run interpn in chunks for datasets exceeding the CPU cache (still room for improvement)
+ Limited use of xarray to reading the MATLAB file only; all arrays are now numpy
+ Replaced OrderedDict with a plain dict, since Python 3.7 guarantees dict insertion order and the code did not require the data to be ordered anyway
+ Updated vector directions to avoid unnecessary 2D arrays and keep matrix shapes closer to the original MATLAB code
=> Committed code is up to 9x faster than the previous code on a 2019 MacBook Pro with an Intel i9 CPU
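The rounding-plus-lru_cache pattern described above can be sketched as follows. This is a minimal illustration only; get_prob and its arguments here are stand-ins, not the actual Zhang et al. 2017 implementation in HyperCP.

```python
from functools import lru_cache

# Stand-in for the expensive probability computation over the Z17 LUT;
# the real get_prob takes different arguments.
@lru_cache(maxsize=256)
def get_prob_cached(windspeed, sza):
    return windspeed * 0.1 + sza * 0.01

def get_prob(windspeed, sza):
    # Round inputs to coarse precision before lookup, so near-identical
    # environmental parameters map to the same (hashable) cache key.
    return get_prob_cached(round(windspeed, 1), round(sza, 1))
```

Because the inputs are rounded before the cache lookup, two calls with near-duplicate environmental parameters share one cache entry instead of recomputing.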
Need to simplify interface by interpreting SeaBird cal files on import in ConfigWindow to recognize
HED/HLD -> Dark Frame
HSE/HSL -> Light Frame
And enable each by default.
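A hedged sketch of the recognition logic described above; the real ConfigWindow code may key off other parts of the SeaBird cal filename, so the helper and mapping here are illustrative.

```python
# Hypothetical mapping from SeaBird cal filename prefixes to frame types.
FRAME_TYPE = {
    'HED': 'Dark', 'HLD': 'Dark',    # dark frames
    'HSE': 'Light', 'HSL': 'Light',  # light frames
}

def frame_type(cal_filename):
    """Return 'Dark' or 'Light' from a SeaBird cal filename prefix, else None."""
    prefix = cal_filename[:3].upper()
    return FRAME_TYPE.get(prefix)
```

Each recognized file could then be enabled by default in the configuration, as proposed.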
ProcessL1B_FRMCal.py throws an error on call (near line 69):
wavelengths, res = Py6S.SixSHelpers.Wavelengths.run_wavelengths(s, 1e-3*wvl)
TypeError: only size-1 arrays can be converted to Python scalars
L1B pathway for class-based characterization and calibration is incomplete for SeaBird. For TriOS, this path is coded to do the same thing as factory calibration (is this intentional?)
MERRA2 AOT to be imported at beginning of L1B or earlier.
Cosine correction (CosCorr) Py6S computation times need to be reduced. Potentially use time-averaged direct/diffuse ratios. (E.g., use L2 ensemble times of ~300s to reduce Lsky and Es from O(2000) spectra to O(20) after dark correction.) The problem remains that calibrations are applied prior to timestamp/waveband matching. Is it possible to move CosCorr to after matching?
Likely two PRs to resolve:
DAA: Move MERRA2 acquisition to begin L1B
AD: Move Py6S CosCorr to end L1B and use ensemble-length averages for direct/diffuse
Need to update how version is passed along to MainConfig
Model Main.Command MainConfig settings after GitLab version that includes a new .config file for command line runs
Clean up run.py so that it can use the sample data provided with the master repo as an example. Rename it to run_sample.py and provide guidance in the README on how it can be adapted (and renamed) by the user.
Error in L2 process:
File "/mount/internal/work-st/projects/tartu-1281/FRM4SOCv2/workspace/aderu/miniconda3/envs/hypercp/lib/python3.11/site-packages/comet_maths/random/generate_sample.py", line 262, in generate_sample_random
sample = sample_pdf * u_param[sli_par] + param[sli_par]
ValueError: operands could not be broadcast together with shapes (100,255) (1,165)
Something to do with a missing transpose...
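For reference, a minimal numpy reproduction of why these shapes cannot broadcast, independent of the underlying cause in comet_maths:

```python
import numpy as np

# Broadcasting compares dimensions pairwise from the right; (100, 255) vs
# (1, 165) fails because 255 and 165 differ and neither is 1. These shapes
# mirror the error above; the data themselves are placeholders.
a = np.zeros((100, 255))
b = np.zeros((1, 165))
raised = False
try:
    a * b
except ValueError:
    raised = True  # operands could not be broadcast together

# Compatible when each dimension pair is equal or one of them is 1:
c = a * np.zeros((1, 255))  # shape (100, 255)
```

Note that a transpose alone would give (165, 1), which still conflicts with (100, 255) in the leading dimension, so the fix likely involves the upstream array lengths as well.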
@aramsay, @AlexisDeru
Based on today's discussion, these uncertainties are known, but have not been rolled in. Hopefully we can incorporate these also by next week for the merge to master.
Occasionally, when running HyperCP in VSCode, this error is triggered in L1AQC->L1B processing. The problem appears to be that -- for whatever reason -- VSCode did not activate the hypercp environment properly (even though it appeared to and every other dependency is found). Shutting down HyperCP and restarting it appears to resolve the issue.
Similar issues were reported with Py6S here:
robintw/Py6S#52
Utilities.plotRadiometry is called by ProcessL1aqc_deglitch.py, and AnomalyDetection.realTimePlot used to be able to project the x-axes in Light and Dark plots onto datetime rather than serial number using matplotlib. Updates to the environment have broken this feature, and it was reverted to serial numbers, which makes cross-referencing Light/Dark difficult.
AnomalyDetection.plotButtonPressed > Utilities.saveDeglitchPlots still exports to PNG with datetime x-axes.
Conflict. Resolution pending. Also involves L1B Interp
MainConfig.settings called in ProcessL1a.py before running loadConfig.
Running SEABIRD_SOLARTRACKER in the Factory Only pathway calculates uncertainties, as evidenced by the very slow use of Z17 despite running for M99 (i.e., to find the difference and add it as an uncertainty contribution), but the L2 files have all zeros for the ir/radiance and reflectance _unc datasets, and no nLw_unc dataset.
Pesky new error message pops up frequently after upgrade to macOS Ventura 13.4
2023-06-09 10:11:30.284 python[10409:3541769] IMKClient Stall detected, please Report your user scenario attaching a spindump (or sysdiagnose) that captures the problem - (imkxpc_bundleIdentifierWithReply:) block performed very slowly (2.74 secs).
Easy to reproduce, but difficult to track to a cause or troubleshoot. Appears harmless, but annoying.
I clearly broke the GUI for TriOS cal files when I automated the SeaBird cal file recognition logic.
Rewrite file ingestion codes to accommodate new file/folder/format from Tartu
Running the Zhang et al. 2017 glint correction is inefficient and slow.
Ideas to pursue:
Reduce the size of Z17 by confining it to protocol geometries (DA).
Build LUTs in lieu of full calculations for given conditions (AR).
Fails on similar broadcast error to SeaBird, but at a different location:
File "/Users/daurin/GitRepos/HyperCP/Source/ProcessL2.py", line 1690, in ensemblesReflectance
xSlice.update(instrument.Default(uncGroup, stats)) # update the xSlice dict with uncertianties and samples
File "/Users/daurin/GitRepos/HyperCP/Source/ProcessInstrumentUncertainties.py", line 137, in Default
ES_unc, LI_unc, LT_unc, ES_rel, LI_rel, LT_rel = PropagateL1B.propagate_Instrument_Uncertainty(mean_values,
File "/Users/daurin/GitRepos/HyperCP/Source/Uncertainty_Analysis.py", line 63, in propagate_Instrument_Uncertainty
sensor = self.instruments(mean_vals)
File "/Users/daurin/GitRepos/HyperCP/Source/Uncertainty_Analysis.py", line 185, in instruments
return np.array((ESLIGHT - ESDARK)*ESCal*ESStab*ESLin*ESStray*EST*ESCos),
ValueError: operands could not be broadcast together with shapes (208,) (10,)
It is reasonable to expect that ddToDm(dmToDd(n)) == n for all n. However, for n = 31 (that is, 0 degrees 31 minutes), this equality does not hold because of the inexact nature of floating point numbers: the value differs from the expected by 3.5e-15 (it may depend on hardware and implementation). Curiously, this behavior does not occur for anything other than 0 degrees 31 minutes.
I recommend that dmToDd accept an optional precision parameter (defaulting to, say, 3 or 6 decimal places) and round the result accordingly.
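A hedged sketch of the proposed fix. The function name follows the issue text, but HyperCP's actual dmToDd signature and packing convention (degrees*100 + minutes is assumed here) may differ.

```python
# Hypothetical version of dmToDd with the proposed precision parameter.
def dmToDd(dm, precision=6):
    """Degrees+minutes packed as degrees*100 + minutes -> decimal degrees,
    rounded so that float noise cannot survive a dmToDd/ddToDm round trip."""
    sign = -1.0 if dm < 0 else 1.0
    dm = abs(dm)
    degrees = int(dm // 100)
    minutes = dm - degrees * 100
    return sign * round(degrees + minutes / 60.0, precision)
```

Rounding at a fixed decimal precision discards the ~1e-15 residue the issue describes while keeping far more angular resolution than the instruments provide.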
Currently, Anomaly Analysis has to open an L1AQC file to evaluate deglitching parameterizations prior to processing to L1AQC. This requires running L1AQC with no deglitching first, then going back and opening the result in the AnomAnal tool.
Instead, the tool should open an L1A file and process it with no deglitching before running. Slower, but less confusing and error-prone.
dev branch
GetAnc_ecmwf.EAC4_download_ensemble returns error "EAC4 atmospheric data could not be retrieved. Check inputs."
Currently called in L1BQC, but soon to be moved to L1B.
Philipp M. M. Groetsch, Peter Gege, Stefan G. H. Simis, Marieke A. Eleveld, and Steef W. M. Peters, "Validation of a spectral correction procedure for sun and sky reflections in above-water reflectance measurements," Opt. Express 25, A742-A761 (2017).
PG to adapt in python to L2 processing in module called by ProcessL2.py>RhoCorrections.py. Uncertainty analysis to follow separately.
It may be more user friendly to use a Docker to contain the code for distribution to avoid conda environment issues.
Solution: simplify the conda environment.yml to include fewer packages and specify build versions where possible.
L1383 is trying to do math with raw_data, but raw_data is an OrderedDict with a numpy array for each waveband, keyed by the waveband name as a string.
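A hedged sketch of the shape involved: to do array math across wavebands, the per-band arrays have to be stacked into one numpy array first. The names and values below are illustrative stand-ins, not the actual HyperCP structures.

```python
import numpy as np
from collections import OrderedDict

# Illustrative stand-in for raw_data: per-waveband numpy arrays keyed by the
# waveband name as a string. Arithmetic on the dict itself raises TypeError;
# stack the values into a 2D (bands x samples) array before doing math.
raw_data = OrderedDict({'412.5': np.array([1.0, 2.0]),
                        '442.5': np.array([3.0, 4.0])})
stacked = np.vstack(list(raw_data.values()))  # shape (2, 2)
scaled = stacked * 2.0  # elementwise math now works
```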
The Morel & Gentili 1996/2002 f/Q BRDF correction implementation needs attention. Determination of gothic R is ambiguous, and no iteration currently implemented. Implications of scipy spline to hyperspectral f/Q are unclear.
When running for SeaBird_noTracker (factory mode), I get several warnings:
Reading : Data/hybrid_reference_spectrum_p1nm_resolution_c2020-09-21_with_unc.nc
/tcenas/proj/ocean/color/conda/miniconda3/envs/hypercp/lib/python3.10/site-packages/comet_maths/linear_algebra/matrix_calculation.py:284: UserWarning: One of the provided covariance matrix is not positive definite. It has been slightly changed (maximum difference of 6.467318590049452e-14 percent) to accomodate our method.
This issue is to demonstrate the process of submitting an Issue, Forking, creating a development Branch, and submitting a Pull Request (PR). The "fix" will involve demonstrating each of these steps, and it will be tested by completing them according to the Guidelines.
ProcessL2.ensembleReflectances:
SeaBird in "Factory" mode should calculate class-based uncertainties using class-based RADCAL uncertainties for SeaBird.
On Ubuntu 20.04.6 LTS (but not on all machines) with conda 23.3.1, using environment.yml fe65804:
(test) daurin@gs616-analysis702:~/GitRepos/HyperInSPACE$ python Main.py
qt.qpa.plugin: Could not load the Qt platform plugin "xcb" in "" even though it was found.
This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem.
Available platform plugins are: eglfs, minimal, minimalegl, offscreen, vnc, webgl, xcb.
L1AQC data propagated to L2 for uncertainties have calibrations applied in error.
Need to confirm propagated L1AQC are filtered the same as other groups in L1BQC and L2 (they should be).
Updates are required to apply proper uncertainty analysis and propagation for FRM branch processing, so that the uncertainties correspond to corrected outputs.
Near lines 116 and 126, it tries to broadcast data from stats (255 elements) against data from Coeff and Cal (10 elements) when building the mean_values and uncertainty lists, resulting in, e.g.:
File "/Users/daurin/GitRepos/HyperCP/Source/ProcessL2.py", line 1690, in ensemblesReflectance
xSlice.update(instrument.Default(uncGroup, stats)) # update the xSlice dict with uncertianties and samples
File "/Users/daurin/GitRepos/HyperCP/Source/ProcessInstrumentUncertainties.py", line 116, in Default
mean_values = [stats['ES']['ave_Light'], ones*stats['ES']['ave_Dark'],
ValueError: operands could not be broadcast together with shapes (10,) (255,)
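The mismatch suggests the 10-element Coeff/Cal data need to be brought onto the 255-band stats grid (or vice versa) before the lists are built. A hedged numpy sketch of that general direction, with all grids and values illustrative rather than taken from HyperCP:

```python
import numpy as np

# Illustrative wavelength grids: stats arrays live on a 255-band grid while
# the Coeff/Cal data have only 10 entries. np.interp maps the coarse
# coefficients onto the fine grid so every list entry shares one shape.
fine_wvl = np.linspace(350.0, 1000.0, 255)
coarse_wvl = np.linspace(350.0, 1000.0, 10)
coarse_cal = np.ones(10) * 1.2  # placeholder calibration coefficients

cal_on_fine = np.interp(fine_wvl, coarse_wvl, coarse_cal)  # shape (255,)
ave_light = np.zeros(255)
mean_values = [ave_light, np.ones(255) * 5.0, cal_on_fine]
```

Whether interpolation is scientifically appropriate for these particular calibration quantities is for the developers to judge; the sketch only shows how to make the shapes agree.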
Instrument characterization files stored in EUMETSAT OCDB can be automatically downloaded and stored locally as needed
Test issue for pull request (PR) testing.
-0.05 degrees is 3 minutes (South or West), but ddToDm(-0.05) returns 3.0, meaning 3 minutes (North or East).
-1.05 degrees is 1 degree 3 minutes (South or West), but ddToDm(-1.05) returns -97.0, meaning 97 minutes (South or West), equivalent to 1 degree 37 minutes.
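A hypothetical sign-aware version of the conversion; the actual HyperCP helper and its packing convention (degrees*100 + minutes is assumed here) may differ in detail.

```python
# Hypothetical sketch: take the absolute value first, convert, then apply the
# sign to the whole packed result, so -1.05 deg -> -103 (1 deg 3 min S/W)
# rather than -97.
def ddToDm(dd):
    sign = -1.0 if dd < 0 else 1.0
    dd = abs(dd)
    degrees = int(dd)
    minutes = (dd - degrees) * 60.0
    return sign * (degrees * 100 + minutes)
```

Splitting sign handling out this way avoids mixing a negative integer degree part with a negative fractional minute part, which is the likely source of the -97 result.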
File "/Users/daurin/GitRepos/HyperCP/Source/ProcessL2.py", line 1718, in ensemblesReflectance
xUNC = instrument.factory(node, uncGroup,
^^^^^^^^^^^^^^^^^^
AttributeError: 'Trios' object has no attribute 'factory'. Did you mean: 'Factory'?
Modules are required to incorporate FRM-style full sensor characterization files (individual and class-based) and to propagate associated uncertainties from RAW through Level-2 including improved uncertainty estimates for glint and other corrections/adjustments to the radiometry.
Minor improvements
+ Updated run as a python package:
+ Added example of how to run multiple instances from the command line (run.py)
+ Added environment variable HYPERINSPACE_CMD
+ Prevent showing the error window when running from the command line (errors still show in standard output and logs)
+ Answered question in Main.py regarding error with pylint and dependencies: major refactoring is required, which would improve the behaviour of the code when used as a python package instead of with the GUI
+ Removed unnecessary dependencies in OCproductsWindow, ProcessL2, Utilities
+ Improved row deletion of HDF files in ProcessL1aqc, ProcessL2, Utilities
Need to confirm that DA's RSS implementation of F0 uncertainties from the Coddington TSIS-1 F0 into nLw uncertainties (ProcessL2 L456) conforms.
350 - 1000 nm. Should eliminate pesky NaNs. Databases should be truncated, not filled with zeros.
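A hedged sketch of truncation rather than zero-fill, using illustrative arrays rather than the actual database format:

```python
import numpy as np

# Keep only the 350-1000 nm range instead of padding out-of-range bands with
# zeros (which later surface downstream as NaNs or spurious zeros).
wvl = np.arange(300.0, 1105.0, 5.0)   # illustrative 300-1100 nm grid
data = np.random.rand(wvl.size)       # illustrative per-band values

keep = (wvl >= 350.0) & (wvl <= 1000.0)
wvl_trunc, data_trunc = wvl[keep], data[keep]
```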
File "/Users/daurin/GitRepos/HyperCP/Source/ProcessInstrumentUncertainties.py", line 209, in Factory
radcal_cal = pd.DataFrame(uncGrp.getDataset(sensor + "_RADCAL_CAL").data)['2']
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'data'
TriOS in Factory pathway fails when trying to plot nLw, because nLw_HYPER_unc has 139 np.arrays (one for each waveband), but each array has 194 values instead of 1 value for the uncertainty in nLw in that band.
Ir/radiances and Rrs all have _unc datasets set to all zeroes. nLw does not output nLw_HYPER_unc.