cmdty / storage
Multi-Factor Least Squares Monte Carlo energy storage valuation model (Python and .NET).
License: MIT License
The dependency on .NET in turn causes a dependency on the Python.NET package for Python-to-.NET interop. However, this gets in the way of running cmdty-storage on Linux, as it requires Mono (and, last I checked, an old version of Mono). Perversely, running the Cmdty.Storage C# package on Linux is easier than running the Python package, as it just requires .NET Core to be installed!
Probably the storage calculations would run too slowly if implemented in CPython, which means the following options should be considered:
Hi! I'm trying to test the model (in python) with the below constraints:
InjectWithdrawByInventoryAndPeriod(date(2020, 8, 28),
    [
        InjectWithdrawByInventory(0.0, -8300, 7900),
        InjectWithdrawByInventory(0.1505 * 1e6, -10300, 7900),
        InjectWithdrawByInventory(0.237 * 1e6, -10300, 6700),
        InjectWithdrawByInventory(0.305 * 1e6, -12300, 6700),
        InjectWithdrawByInventory(0.428 * 1e6, -14300, 6700),
        InjectWithdrawByInventory(0.438 * 1e6, -14300, 5500),
        InjectWithdrawByInventory(0.77 * 1e6, -14300, 4000),
        InjectWithdrawByInventory(0.97 * 1e6, -14300, 2500),
        InjectWithdrawByInventory(1.0 * 1e6, -14300, 2500),
    ])
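Between the listed inventory breakpoints, the inject/withdraw limits are interpolated (linearly by default, or stepwise with RatchetInterp.STEP). The following standalone sketch illustrates the linear case using the breakpoints above; `rates_at` is a hypothetical helper for illustration, not the package's actual code (which internally stores withdrawal rates as negative numbers, shown here as positive magnitudes):

```python
import numpy as np

# Inventory breakpoints and rate limits from the ratchet table above.
# Withdrawal limits are kept as positive magnitudes in this sketch.
inventory_breaks = np.array([0.0, 0.1505e6, 0.237e6, 0.305e6, 0.428e6,
                             0.438e6, 0.77e6, 0.97e6, 1.0e6])
max_withdraw = np.array([8300.0, 10300.0, 10300.0, 12300.0, 14300.0,
                         14300.0, 14300.0, 14300.0, 14300.0])
max_inject = np.array([7900.0, 7900.0, 6700.0, 6700.0, 6700.0,
                       5500.0, 4000.0, 2500.0, 2500.0])

def rates_at(inv: float):
    """Linearly interpolated (max withdrawal, max injection) limits at inventory inv."""
    return (float(np.interp(inv, inventory_breaks, max_withdraw)),
            float(np.interp(inv, inventory_breaks, max_inject)))
```

At the breakpoints themselves the interpolation reproduces the table exactly, e.g. `rates_at(0.0)` gives the first row's limits.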
Yes, the facility I have to value has kind of ridiculous ratchet constraints. That said, upon plugging these constraints into your test code, I got the following stacktrace:
Traceback (most recent call last):
File "value_bistineau.py", line 76, in <module>
interest_rates=interest_rate_curve, num_inventory_grid_points=1000)
File "C:\miniconda\envs\cmdty-storage\lib\site-packages\cmdty_storage\intrinsic.py", line 74, in intrinsic_value
net_val_results = net_cs.IIntrinsicCalculate[time_period_type](intrinsic_calc).Calculate()
System.IndexOutOfRangeException: Index was outside the bounds of the array.
at Cmdty.TimeSeries.TimeSeries`2.get_Item(TIndex index)
at Cmdty.Storage.CmdtyStorageBuilderExtensions.<>c__DisplayClass5_0`1.<AddInjectWithdrawRanges>g__GetMinInventory|1(T period)
at Cmdty.Storage.CmdtyStorage`1.GetInjectWithdrawRange(T date, Double inventory)
at Cmdty.Storage.StorageHelper.CalculateInventorySpace[T](ICmdtyStorage`1 storage, Double startingInventory, T currentPeriod)
at Cmdty.Storage.IntrinsicStorageValuation`1.Calculate(T currentPeriod, Double startingInventory, TimeSeries`2 forwardCurve, ICmdtyStorage`1 storage, Func`2 settleDateRule, Func`3 discountFactors, Func`2 gridCalcFactory, IInterpolatorFactory interpolatorFactory, Double numericalTolerance)
at Cmdty.Storage.IntrinsicStorageValuation`1.Cmdty.Storage.IIntrinsicCalculate<T>.Calculate()
Is it possible that I've misunderstood how to specify the constraints?
The calculation of discount factors should be pushed to the caller.
A console app which accepts a file on disk as input and writes the result to another file on disk. The Python API would be updated to be a wrapper around this console app. Different platform-specific self-contained console apps would be created containing all .NET dependencies.
This design has the following benefits:
Hi,
I cannot manage to run the model when I change the value of min_inventory... it only works with a value equal to 0.
It throws the following error :
InventoryConstraintsCannotBeFulfilledException: Valuation calculation cannot be performed as storage inventory constraints cannot be fulfilled.
   at Cmdty.Storage.StorageHelper.CalculateInventorySpace[T](ICmdtyStorage`1 storage, Double startingInventory, T currentPeriod)
   at Cmdty.Storage.IntrinsicStorageValuation`1.Calculate(T currentPeriod, Double startingInventory, TimeSeries`2 forwardCurve, ICmdtyStorage`1 storage, Func`2 settleDateRule, Func`3 discountFactors, Func`2 gridCalcFactory, IInterpolatorFactory interpolatorFactory, Double numericalTolerance)
You can reproduce the error by using the Python example given in the repository (https://github.com/cmdty/storage#storage-with-time-and-inventory-varying-injectwithdraw-rates) and changing the min_inventory value.
A problem with using the cmdty-storage Python package with Anaconda has been found, with a fix in #13. The documentation and examples should be updated to flag the problem and the fix.
As highlighted in issue #13, multiple loading of libiomp5md.dll can cause the Python interpreter to crash. The workaround suggested in the same issue, which is being added to the Cmdty.Storage documentation, has been seen to fix this, but according to the error message printed when the crash occurs, this workaround is 'unsafe, unsupported, undocumented' and 'may cause crashes or silently produce incorrect results'. As such, a proper long-term solution should be devised.
The easy fix for this is just:
import os
os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'  # must be set before the libraries that load OpenMP are imported
Anyhow, on an Anaconda Python installation you get a crash from multiple OpenMP DLLs being loaded in 1 session. It seems the C# code is calling the libraries and Python doesn't like multiple copies of the same DLL. Not a huge deal when you know the "fix" I posted above. This results from just running the examples in the Readme.md for natural gas storage with ratchets.
Hi Team,
I'm having this issue when I try to install the Python package. I have tried upgrading pip / wheel / setuptools as well,
but I'm still getting the following error message.
Collecting cmdty_storage
  Using cached https://files.pythonhosted.org/packages/cc/1e/da3d37c8afb036ef121514515006a32c2d9cf6c35ff4d44f674c8447f220/cmdty_storage-0.1.0a7-py3-none-any.whl
Requirement already satisfied: pandas>=0.24.2 in /usr/local/lib/python3.6/dist-packages (from cmdty_storage) (1.1.5)
Collecting pythonnet==2.4.0
  Using cached https://files.pythonhosted.org/packages/4a/0a/5964a5a1fdf207fab8e718edd1f8d3578e0681577fe9abe0d9006f9718c2/pythonnet-2.4.0.tar.gz
Requirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas>=0.24.2->cmdty_storage) (2018.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.6/dist-packages (from pandas>=0.24.2->cmdty_storage) (2.8.1)
Requirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.6/dist-packages (from pandas>=0.24.2->cmdty_storage) (1.19.5)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.6/dist-packages (from python-dateutil>=2.7.3->pandas>=0.24.2->cmdty_storage) (1.15.0)
Building wheels for collected packages: pythonnet
  Building wheel for pythonnet (setup.py) ... error
  ERROR: Failed building wheel for pythonnet
  Running setup.py clean for pythonnet
Failed to build pythonnet
Installing collected packages: pythonnet, cmdty-storage
  Running setup.py install for pythonnet ... error
ERROR: Command errored out with exit status 1: /usr/bin/python3 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-r07633fo/pythonnet/setup.py'"'"'; __file__='"'"'/tmp/pip-install-r07633fo/pythonnet/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-80vpah_w/install-record.txt --single-version-externally-managed --compile
Check the logs for full command output.
Thank you for your help
Hi,
we've implemented your model in a context where it is necessary to have time-varying ratchets across the simulation period, in order to reach target filling levels on specific dates.
Depending on the starting inventory level and the ratchets for that time frame, we are running into this strange error, which seems to be related to some poorly managed rounding operations behind the scenes:
self.results: MultiFactorValuationResults = three_factor_seasonal_value(
  File "/azureml-envs/azureml_56ff17e21e8cba7e88bd7ac0a5656619/lib/python3.10/site-packages/cmdty_storage/multi_factor.py", line 361, in three_factor_seasonal_value
    return _net_multi_factor_calc(cmdty_storage, fwd_curve, interest_rates, inventory, add_multi_factor_sim,
  File "/azureml-envs/azureml_56ff17e21e8cba7e88bd7ac0a5656619/lib/python3.10/site-packages/cmdty_storage/multi_factor.py", line 460, in _net_multi_factor_calc
    intrinsic_result = cs_intrinsic.net_intrinsic_calc(cmdty_storage, net_current_period, net_interest_rate_time_series,
  File "/azureml-envs/azureml_56ff17e21e8cba7e88bd7ac0a5656619/lib/python3.10/site-packages/cmdty_storage/intrinsic.py", line 82, in net_intrinsic_calc
    net_val_results = net_cs.IIntrinsicCalculate[time_period_type](intrinsic_calc).Calculate()
System.ArgumentException: Inventory of 6872.954213832093 is below minimum allowed value of 6873 during period 2021-11-01. (Parameter 'inventory')
   at Cmdty.Storage.CmdtyStorage`1.GetInjectWithdrawRange(T date, Double inventory) in C:\Users\Jake\source\repos\storage\src\Cmdty.Storage\StorageEntity\CmdtyStorage.cs:line 94
   at Cmdty.Storage.StorageHelper.CalculateInventorySpace[T](ICmdtyStorage`1 storage, Double startingInventory, T currentPeriod) in C:\Users\Jake\source\repos\storage\src\Cmdty.Storage\StorageHelper.cs:line 42
   at Cmdty.Storage.IntrinsicStorageValuation`1.Calculate(T currentPeriod, Double startingInventory, TimeSeries`2 forwardCurve, ICmdtyStorage`1 storage, Func`2 settleDateRule, Func`3 discountFactors, Func`2 gridCalcFactory, IInterpolatorFactory interpolatorFactory, Double numericalTolerance) in C:\Users\Jake\source\repos\storage\src\Cmdty.Storage\IntrinsicValuation\IntrinsicStorageValuation.cs:line 154
   at Cmdty.Storage.IntrinsicStorageValuation`1.Cmdty.Storage.IIntrinsicCalculate<T>.Calculate() in C:\Users\Jake\source\repos\storage\src\Cmdty.Storage\IntrinsicValuation\IntrinsicStorageValuation.cs:line 116
We tried to use RatchetInterp.STEP, but this apparently fails whenever the ratchets are time-varying.
Any ideas on how to prevent this error?
Some testing found that the standard error calculation contains a bug. Because antithetic variance reduction is used, the simulated PVs are not all independent, hence the standard error calculation is incorrect. The solution is to take the average of each pair of PVs calculated from the same random numbers and then calculate the standard error as usual.
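The fix described above can be sketched as follows: average each antithetic pair (which shares the same random numbers) into one independent sample, then compute the standard error over those pair averages. This sketch assumes PVs are laid out so that path i and path i + n_pairs form a pair; the actual layout in the package may differ.

```python
import numpy as np

def antithetic_standard_error(pvs: np.ndarray) -> float:
    """Standard error of the mean for PVs simulated with antithetic variates.

    Assumes pvs has even length, with pvs[i] and pvs[i + n] (n = len(pvs)//2)
    generated from the same random numbers with flipped signs, so only the
    n pair averages are independent samples.
    """
    n = len(pvs) // 2
    pair_means = (pvs[:n] + pvs[n:]) / 2.0  # independent pair averages
    return pair_means.std(ddof=1) / np.sqrt(n)
```

Treating all 2n PVs as independent would understate the error whenever the pairs are strongly negatively correlated, which is exactly when antithetic sampling helps most.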
The Python dependencies for build/CI/test are a bit of a mess, being spread across a requirements.txt, build.cake and azure-pipelines.yml. This should be consolidated and tidied up into requirements.txt files with specific versions specified.
InventoryConstraintsCannotBeFulfilledException: Valuation calculation cannot be performed as storage inventory constraints cannot be fulfilled.
   at Cmdty.Storage.StorageHelper.CalculateInventorySpace[T](ICmdtyStorage`1 storage, Double startingInventory, T currentPeriod)
   at Cmdty.Storage.IntrinsicStorageValuation`1.Calculate(T currentPeriod, Double startingInventory, TimeSeries`2 forwardCurve, ICmdtyStorage`1 storage, Func`2 settleDateRule, Func`3 discountFactors, Func`2 gridCalcFactory, IInterpolatorFactory interpolatorFactory, Double numericalTolerance)
   at Cmdty.Storage.IntrinsicStorageValuation`1.Cmdty.Storage.IIntrinsicCalculate<T>.Calculate()
With the following time-varying inventory, setting min_inventory to even 3% of max_inventory six months after storage_start fails the constraints.
Code:
max_inventory = 1.0 # MWh
injection_duration = 100.0 # days
withdrawal_duration = injection_duration / 2.0 # days
max_injection_rate = max_inventory / injection_duration # MWh / day
max_withdrawal_rate = max_inventory / withdrawal_duration # MWh / day
yearly_period_index = pd.PeriodIndex(freq='D', data = ['2023-04-01', '2023-10-01', '2024-04-01'])
storage_time_varying = CmdtyStorage(
    freq='D',
    storage_start='2023-04-01',
    storage_end='2024-04-01',
    injection_cost=0.0,
    withdrawal_cost=0.0,
    min_inventory=pd.Series([0.0, 0.03 * max_inventory, 0.0], yearly_period_index).resample('D').fillna('pad'),
    max_inventory=pd.Series([max_inventory, max_inventory, max_inventory], yearly_period_index).resample('D').fillna('pad'),
    max_injection_rate=max_injection_rate,
    max_withdrawal_rate=max_withdrawal_rate
)
iv_results = intrinsic_value(
    cmdty_storage=storage_time_varying,
    val_date=storage_time_varying.start,
    inventory=0 * max_inventory,
    forward_curve=fwd_curve,
    interest_rates=ir_curve,
    settlement_rule=settlement_rule)
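One possible cause worth ruling out (an assumption, not confirmed from the package internals) is that the time-varying min_inventory floor rises faster than max_injection_rate allows from the starting inventory, which makes the constraints infeasible regardless of prices. A quick standalone pre-check, using a hypothetical helper that ignores losses and withdrawals:

```python
import pandas as pd

def check_min_inventory_feasible(min_inventory: pd.Series, start_inventory: float,
                                 max_injection_rate: float) -> bool:
    """True if the inventory floor is reachable from start_inventory when
    injecting at most max_injection_rate per period (losses ignored)."""
    # Max inventory attainable p periods after the start, injecting flat out.
    reachable = (start_inventory + max_injection_rate * p
                 for p in range(len(min_inventory)))
    return all(r >= m for r, m in zip(reachable, min_inventory.values))

# Demo floor: zero for four days, then a jump to 0.3 on day five.
demo_floor = pd.Series([0.0, 0.0, 0.0, 0.0, 0.3],
                       pd.period_range('2023-04-01', periods=5, freq='D'))
```

With `max_injection_rate=0.1` the 0.3 floor on day five is reachable (0.1 × 4 = 0.4), but with `max_injection_rate=0.01` it is not, which would surface as an InventoryConstraintsCannotBeFulfilledException inside the valuation.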
Use a dockerfile to get .NET runtime.
Warnings seen when running the Pack-Python cake build step with setuptools version 65.5.0, but not version 56.0.0.
The first warning related to the use of the package_data parameter to setuptools.setup and is fixed in the branch put_package_data_into_packages.
Another warning seen, but yet to be investigated is the following:
SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
This sort of issue should be helped (made more predictable) when #35 is implemented by fixing the version of all dependency packages.
Currently both these simulations use the same number of paths, but there should be the flexibility for these to be different.
### In storage_gui.py:
RatchetRow = namedtuple('RatchetRow', ['date', 'inventory', 'inject_rate', 'withdraw_rate'])

def enumerate_ratchets():
    ratchet_row = 0
    while ratchet_row < num_ratch_rows and ratchet_input_sheet.cells[1].value[ratchet_row] != '':
        yield RatchetRow(ratchet_input_sheet.cells[0].value[ratchet_row],
                         ratchet_input_sheet.cells[1].value[ratchet_row],
                         ratchet_input_sheet.cells[2].value[ratchet_row],
                         ratchet_input_sheet.cells[3].value[ratchet_row])
        ratchet_row += 1

def read_ratchets():
    ratchets = []
    for ratchet in enumerate_ratchets():
        if ratchet.date != '':
            # A populated date starts a new (period, constraints) entry.
            dt_item = (pd.Period(ratchet.date, freq=freq),
                       [(ratchet.inventory, -ratchet.inject_rate, ratchet.withdraw_rate)])
            ratchets.append(dt_item)
        else:
            # A blank date adds another constraint row to the preceding date.
            dt_item[1].append((ratchet.inventory, -ratchet.inject_rate, ratchet.withdraw_rate))
    return ratchets
### In creating_storage_instance notebook:
The second element of the tuple is an iterable of tuples, each with three elements:
The first element is the inventory.
The second element is the maximum withdrawal rate.
The third element is the maximum injection rate.
I am able to match the results to simple storage by flipping them around in the storage_gui.
It would be great to see the option to simulate prices outside the function, such that bespoke views of distributions can be used for valuations.
Hello,
I would like to understand how to calculate the mean reversion, as I don't understand the figure you use in your example (mean reversion = 91).
When I used an Ornstein-Uhlenbeck fit on my historical spot prices, I obtained alpha = 0.017 and average = 35.76.
What does the 91 represent in your example? How do you calculate it?
Many thanks
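A likely explanation (an assumption, not confirmed from the package docs) is that the mean reversion parameter is expressed in annualized terms, while an OU fit on daily data gives a per-day rate. A hypothetical conversion helper:

```python
import math

def annualize_mean_reversion(alpha_daily: float, periods_per_year: int = 365) -> float:
    """Convert a per-day OU mean-reversion rate to an annualized rate.

    For a discrete AR(1) fit x[t+1] = x[t] + alpha*(mu - x[t]) + noise,
    the continuous-time annualized rate is -periods_per_year * ln(1 - alpha).
    """
    return -periods_per_year * math.log(1.0 - alpha_daily)

# A daily alpha of 0.017 annualizes to roughly 6.3 per year, far below 91;
# an annualized rate of 91 implies a half-life of ln(2)/91 years, i.e. under
# 3 days, so the two fits describe very different reversion speeds.
```

If your fitted alpha is genuinely a per-day rate, the gap to 91 suggests either a different data frequency in the example or a much faster-reverting spot process.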
The documentation, and implementation in MultiFactorModel, only derive the integrated covariance of (log of) forwards at a uniform granularity, which will generally be the highest possible granularity of traded contracts assumed by the model, and the granularity at which physical storage nomination occurs. For example for natural gas, when valuing storage with day-ahead nomination, the granularity will be daily. However, for calibration and simulation purposes it would be useful to approximate the covariances for contracts at the actual traded granularity, e.g. monthly, quarter, seasonal and yearly.
One approach would be to use moment-matching, i.e. find the parameters of a single GBM (Geometric Brownian Motion) which matches the moments of a basket of GBMs.
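The moment-matching idea can be sketched as follows: for a basket of lognormal forwards with a known integrated covariance matrix, match the first two moments to a single lognormal, which gives an effective integrated variance in closed form. This is a sketch under the stated lognormal assumptions, not the package's implementation:

```python
import numpy as np

def basket_gbm_moment_match(forwards: np.ndarray, int_cov: np.ndarray):
    """Match a basket of lognormals to a single lognormal.

    forwards: forward prices F_i of the basket components.
    int_cov: matrix of integrated covariances of the components' log prices.
    Returns (basket forward, effective integrated variance) such that a single
    GBM with these parameters matches the basket's first two moments:
      E[B] = sum_i F_i,  E[B^2] = sum_ij F_i F_j exp(cov_ij),
      sigma^2 T = ln(E[B^2] / E[B]^2).
    """
    basket_fwd = forwards.sum()
    second_moment = np.einsum('i,j,ij->', forwards, forwards, np.exp(int_cov))
    eff_var = np.log(second_moment / basket_fwd ** 2)
    return basket_fwd, eff_var
```

For a monthly contract built from daily components, `int_cov` would be the daily-granularity integrated covariances the model already produces, and `eff_var` the approximate integrated variance of the monthly forward.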
For 1- and 2-factor models, something like the Jamshidian trick could be used to calculate the PV of options, which in turn could be used to back out the implied volatility of the traded contracts under a specific set of model parameters.
MultiFactorModel and MultiFactorSpotSim can be used from the cmdty_storage package, but no docs are currently provided.
Update build.cake to create a Python Conda package during CI, and publish to conda-forge on release. Also update the Python testing task in build.cake to perform tests in a Conda environment.
A decent performance improvement could be achieved by more vectorisation of calculations over arrays of numbers, e.g. adding the decision volume to inventory to get the next-step inventory across the whole vector of per-simulation inventories.
Should be done after #28 to avoid a divergence in performance over different OS.
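The kind of vectorisation meant can be illustrated by stepping all per-simulation inventories in one numpy operation rather than a per-simulation Python loop (an illustrative sketch; the package's C# code and its inventory-loss handling may differ):

```python
import numpy as np

def step_inventories(inventories: np.ndarray, decisions: np.ndarray,
                     inventory_loss_frac: float = 0.0) -> np.ndarray:
    """Advance per-simulation inventories by one period in a single
    vectorised operation: deduct a proportional loss, then add the
    per-simulation decision volumes (injection positive, withdrawal negative)."""
    return inventories * (1.0 - inventory_loss_frac) + decisions
```

With tens of thousands of simulated paths, one array operation per time step replaces a loop of that many scalar updates.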
Hi,
I am trying to use the 3-factor model to simulate random hourly electricity price curves based on a volatility curve and a forward curve. However, my forward curve has negative prices, which is not allowed in the model.
Do you know how I can deal with this issue?
thanks
Currently the model assumes that injection/withdrawal nomination occurs in the same time period as delivery. In reality virtual storage deals could enforce nomination to be on the previous business day, for example the weekend volume is nominated on Friday.
The storage object needs to be enriched to encapsulate the set of nomination dates, plus a mapping to the delivery dates each nomination date corresponds to. The valuation code will also need to be updated to simulate not just the spot price, but the prompt forward prices for nomination delivery dates as of the nomination date.
Currently QR decomposition is used for the regression in the LSMC valuation. Previous experiments showed that the performance of SVD was much worse than QR, especially as the number of simulations increased. However, some more recent reading implies that SVD performance for regression, relative to QR, is not as bad as observed. I suspect this is because the previous code using SVD had some inefficiencies. If similar performance using SVD can be achieved, it is preferable to QR due to being more robust to collinear regressors.
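The trade-off can be sketched in numpy terms: a QR solve is fast but assumes full-rank regressors, while an SVD-based solve (numpy's lstsq) degrades gracefully when basis functions are collinear. This is illustrative only; the package's C# regression uses MathNet, not numpy:

```python
import numpy as np

def regress_qr(design: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares coefficients via reduced QR decomposition."""
    q, r = np.linalg.qr(design)
    return np.linalg.solve(r, q.T @ y)

def regress_svd(design: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares coefficients via SVD (numpy's lstsq), robust to
    collinear columns through its pseudo-inverse treatment."""
    coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coeffs

# Polynomial basis in the continuation-value regression style.
x = np.linspace(0.0, 1.0, 50)
design = np.column_stack([np.ones_like(x), x, x ** 2])
y = 1.0 + 2.0 * x + 3.0 * x ** 2
```

On a well-conditioned basis like this both routes recover the same coefficients; the difference only shows up when columns of the design matrix become (near-)linearly dependent.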
Also copy same documentation to curves project.
Hi, and thanks for this model.
I am making a simple model. When I change max_injection_rate from xxx to xxxx, i.e. 2500:
min_inventory = 0.0,
max_inventory = 500000.0,
max_injection_rate = 2500.5,
max_withdrawal_rate = 3000.9
then I get the following error. It says it can be fixed by increasing the numerical tolerance, but where can I fix that?
Inventory constraints cannot be fulfilled. This could potentially be fixed by increasing the numerical tolerance.
   at Cmdty.Storage.StorageHelper.CalculateBangBangDecisionSet(InjectWithdrawRange injectWithdrawRange, Double currentInventory, Double inventoryLoss, Double nextStepMinInventory, Double nextStepMaxInventory, Double numericalTolerance, Int32 numExtraDecisions)
   at Cmdty.Storage.IntrinsicStorageValuation`1.OptimalDecisionAndValue(ICmdtyStorage`1 storage, T period, Double inventory, Double nextStepInventorySpaceMin, Double nextStepInventorySpaceMax, Double cmdtyPrice, Func`2 continuationValueByInventory, Double discountFactorFromCmdtySettlement, Func`2 discountFactors, Double numericalTolerance)
   at Cmdty.Storage.IntrinsicStorageValuation`1.Calculate(T currentPeriod, Double startingInventory, TimeSeries`2 forwardCurve, ICmdtyStorage`1 storage, Func`2 settleDateRule, Func`3 discountFactors, Func`2 gridCalcFactory, IInterpolatorFactory interpolatorFactory, Double numericalTolerance)
   at Cmdty.Storage.IntrinsicStorageValuation`1.Cmdty.Storage.IIntrinsicCalculate<T>.Calculate()
Currently there is one Python package which will be used for all OS and architecture. However, the MathNet (including MKL) binaries included are Windows-specific. This won't cause an error on non-Windows OS, it just means that on non-Windows the matrix routines will fall back to those implemented in C#, so will be slower.
Delta is calculated using pathwise differentiation alongside the core Least Squares Monte Carlo valuation calculations in LsmcStorageValuation.cs. These calculations make the assumption that the simulated spot price is calculated as the forward price times some stochastic term. This is fine for the multi-factor model in Cmdty.Core, but will not be the case for all models, e.g. a shifted lognormal model used to account for negative prices. As such, the delta calculation code should be moved out of the core C# valuation class. Such a pathwise differentiation could instead be calculated by callers using the simulated data.
This limitation should be considered by anyone calling the Python value_from_sims function.
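The assumption being described implies a simple pathwise estimator: if spot = forward × stochastic factor, then d(spot)/d(forward) = spot/forward on each path, so delta for a delivery period is the discounted average of volume × spot/forward. A simplified single-period sketch (a hypothetical helper, not the LsmcStorageValuation.cs code, and ignoring settlement-date details):

```python
import numpy as np

def pathwise_delta(volumes: np.ndarray, spot_sims: np.ndarray,
                   forward: float, discount_factor: float = 1.0) -> float:
    """Pathwise delta to one forward price, assuming spot = forward * factor
    so that d(spot)/d(forward) = spot / forward on each path.

    volumes, spot_sims: per-simulation decision volumes and spot prices for
    the delivery period. Simplified to a single period and discount factor.
    """
    return discount_factor * np.mean(volumes * spot_sims / forward)
```

Under a model where spot is not proportional to forward (e.g. shifted lognormal), the spot/forward term here is exactly what stops being valid, which is the limitation the issue describes.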
Hi all,
Thank you for this great job. I am trying to use it to optimize and value power storage facilities hourly, but I am facing two issues.
Example: t1: injection = 1; t2: withdrawal = 1 means the total absolute energy used over the two periods is 2.
Thank you for the help
Hi, I have some questions regarding the expected_profile that is returned from the extrinsic valuation methods. Most of my code uses the value_from_sims method as I have written my own implementation of the multifactor price process simulator, but I also see this issue when I use your three_factor_seasonal_value method that performs price simulations for example.
Basically I am trying to better understand where the model's extrinsic NPV actually comes from in a more concrete sense. I have been looking at the "period_pv" and "net_volume" columns to try to derive an understanding of the price that the model is using to assign a PV to the expected volume decision at each point in time. This has led to some very interesting observations, where I see cases where the model is buying 20000 units of gas in a month at a negative price for a profit, or selling a small amount of gas at a very high price, say 100 units of gas at $100/unit for $10k PV.
These cases occur in situations where my distribution of simulated prices does not include negative prices, or prices even above $14 for example. I have tried playing with a lot of the knobs to see if I can affect this issue in a meaningful way, but don't believe that I have yet. Perhaps I am just misunderstanding the interpretability of the values in the expected_profile. I can provide more context if necessary.
Any thoughts?
Coverlet stopped working a while ago, I think after updating .NET SDK version.
The current Excel add-in's asynchronous function calculations don't work as well as hoped because Excel-DNA RTD updates do not propagate when the Excel calculation mode is set to manual. This means that:
The proposed solution is for the Cmdty.Storage add-in to have two calculation modes itself, controlled via a drop-down in the Excel Ribbon:
Please provide the Python code mentioned in the documentation to calibrate the 3-factor seasonal model. An example script which takes the historical prices as input and calculates: spot_mean_reversion, spot_vol, long_term_vol, seasonal_vol. That would make this project much more user friendly.
I can organize a validation of your model if you wish, and report to you the results.
Thanks for all the modeling! Great project!
Currently the full build can only be executed on Windows machines. This is due to two reasons:
The solution to the first reason is to just wait for version 2.5.0 of pythonnet. For the second reason, although there has been some discussion on GitHub for a .NET Core compatible version of Excel-DNA, it doesn't appear to be coming any time soon. So probably the solution to this is to create a second "non-Windows friendly" .sln file which doesn't include the Excel project. Some changes to build.cake would also probably be necessary.
Storage with step ratchets, and terminal_storage_npv not specified (which forces the storage to end empty), is rarely able to satisfy the inventory constraints. Until this is properly fixed, the Python API CmdtyStorage constructor should throw an exception if terminal_storage_npv is not set and step ratchets are used. Also update the documentation.