pypsa / linopy

Linear optimization with N-D labeled arrays in Python

Home Page: https://linopy.readthedocs.io

License: MIT License

optimisation python optimization xarray cbc gurobi cplex glpk xpress linear-optimization

linopy's Introduction

PyPSA - Python for Power System Analysis


PyPSA stands for "Python for Power System Analysis". It is pronounced "pipes-ah".

PyPSA is an open source toolbox for simulating and optimising modern power and energy systems that include features such as conventional generators with unit commitment, variable wind and solar generation, storage units, coupling to other energy sectors, and mixed alternating and direct current networks. PyPSA is designed to scale well with large networks and long time series.

This project is maintained by the Department of Digital Transformation in Energy Systems at the Technical University of Berlin. Previous versions were developed by the Energy System Modelling group at the Institute for Automation and Applied Informatics at the Karlsruhe Institute of Technology funded by the Helmholtz Association, and by the Renewable Energy Group at FIAS to carry out simulations for the CoNDyNet project, financed by the German Federal Ministry for Education and Research (BMBF) as part of the Stromnetze Research Initiative.

Functionality

PyPSA can calculate:

  • static power flow (using both the full non-linear network equations and the linearised network equations)
  • linear optimal power flow (least-cost optimisation of power plant and storage dispatch within network constraints, using the linear network equations, over several snapshots)
  • security-constrained linear optimal power flow
  • total electricity/energy system least-cost investment optimisation (using linear network equations, over several snapshots and investment periods simultaneously for optimisation of generation and storage dispatch and investment in the capacities of generation, storage, transmission and other infrastructure)

It has models for:

  • meshed multiply-connected AC and DC networks, with controllable converters between AC and DC networks
  • standard types for lines and transformers following the implementation in pandapower
  • conventional dispatchable generators and links with unit commitment
  • generators with time-varying power availability, such as wind and solar generators
  • storage units with efficiency losses
  • simple hydroelectricity with inflow and spillage
  • coupling with other energy carriers (e.g. resistive Power-to-Heat (P2H), Power-to-Gas (P2G), battery electric vehicles (BEVs), Fischer-Tropsch, direct air capture (DAC))
  • basic components out of which more complicated assets can be built, such as Combined Heat and Power (CHP) units and heat pumps.

Documentation

Documentation

Quick start

Examples

Known users of PyPSA

Installation

pip:

pip install pypsa

conda/mamba:

conda install -c conda-forge pypsa

Additionally, install a solver.
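
For example, the open-source HiGHS solver can be installed via its Python package (one option among several; the documentation lists the other supported solvers):

pip install highspy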

Usage

import pypsa

# create a new network
n = pypsa.Network()
n.add("Bus", "mybus")
n.add("Load", "myload", bus="mybus", p_set=100)
n.add("Generator", "mygen", bus="mybus", p_nom=100, marginal_cost=20)

# load an example network
n = pypsa.examples.ac_dc_meshed()

# run the optimisation
n.optimize()

# plot results
n.generators_t.p.plot()
n.plot()

# get statistics
n.statistics()
n.statistics.energy_balance()

There are more extensive examples available as Jupyter notebooks. They are also described in the doc/examples.rst and are available as Python scripts in examples/.

Screenshots

PyPSA-Eur optimising capacities of generation, storage and transmission lines (9% line volume expansion allowed) for a 95% reduction in CO2 emissions in Europe compared to 1990 levels.

SciGRID model simulating the German power system for 2015.

Dependencies

PyPSA is written and tested to be compatible with Python 3.7 and above. The last release supporting Python 2.7 was PyPSA 0.15.0.

It leans heavily on the following Python packages:

  • pandas for storing data about components and time series
  • numpy and scipy for calculations, such as linear algebra and sparse matrix calculations
  • networkx for some network calculations
  • matplotlib for static plotting
  • linopy for preparing optimisation problems (currently only linear and mixed integer linear optimisation)
  • cartopy for plotting the baselayer map
  • pytest for unit testing
  • logging for managing messages

The optimisation uses interface libraries like linopy which are independent of the preferred solver. You can use e.g. one of the free solvers HiGHS, GLPK and CLP/CBC or the commercial solver Gurobi for which free academic licenses are available.

Documentation

Please check the documentation.

Contributing and Support

We strongly welcome anyone interested in contributing to this project. If you have any ideas, suggestions or encounter problems, feel invited to file issues or make pull requests on GitHub.

  • In case of code-related questions, please post on Stack Overflow.
  • For non-programming-related and more general questions, please refer to the mailing list.
  • To discuss with other PyPSA users, organise projects, share news, and get in touch with the community, you can use the Discord server.
  • For bugs and feature requests, please use the PyPSA GitHub Issues page.
  • For troubleshooting, please check the troubleshooting in the documentation.

Code of Conduct

Please respect our code of conduct.

Citing PyPSA

If you use PyPSA for your research, we would appreciate it if you cite the following paper, using this BibTeX entry:

@article{PyPSA,
   author = {T. Brown and J. H\"orsch and D. Schlachtberger},
   title = {{PyPSA: Python for Power System Analysis}},
   journal = {Journal of Open Research Software},
   volume = {6},
   issue = {1},
   number = {4},
   year = {2018},
   eprint = {1707.09913},
   url = {https://doi.org/10.5334/jors.188},
   doi = {10.5334/jors.188}
}

If you want to cite a specific PyPSA version, each release of PyPSA is stored on Zenodo with a release-specific DOI. The release-specific DOIs are linked from the overall PyPSA Zenodo DOI for Version 0.17.1 and onwards, or from the overall PyPSA Zenodo DOI for versions up to 0.17.0.

Licence

Copyright 2015-2024 PyPSA Developers

PyPSA is licensed under the open source MIT License.

linopy's People

Contributors

apfelix, aurelije, cellophil, coroa, d3netxer, dannyopts, danuriegas, fabianhofmann, fneum, glatterf42, hblunck, irieo, jafluri, jankaeh, leuchtum, lindnemi, lucierc, lumbric, maurerle, odow, pre-commit-ci[bot], pz-max, staadecker, tgi-climact, ulfworsoe


linopy's Issues

Add automatic merging of variables

At the moment linopy will create two coefficients if the same variable is used twice in the same expression. Gurobi can resolve this automatically (it prints a warning), but HiGHS seems to struggle with such equations (see the example below). A solution would be to merge identical variables and compute the resulting coefficient in linopy before writing the LP file or passing the equations to the solver.

I am not entirely sure whether this is the right approach, nor how other solvers deal with this. I don't think that HiGHS is handling this correctly, so treating this as an upstream issue in HiGHS would also be valid. But given that Gurobi prints warnings, I guess solvers prefer to have this handled by the optimization language.

Example

Consider the following example:

import linopy
import numpy as np
import xarray as xr

N = 5
m = linopy.Model()
var = m.add_variables(
    name="var", lower=xr.DataArray(np.zeros(N), dims="x", coords={"x": np.arange(N)})
)

m.add_constraints((var - 0.5 * var) == 21.0)
m.add_objective(var.sum())
m.solve(solver_name="gurobi", keep_files=True)

Gurobi prints:

...
Warning: row 0 (name "c0") contains 1 repeated variable(s) in constraints, grouped them, 0 cancellation
Warning: row 1 (name "c1") contains 1 repeated variable(s) in constraints, grouped them, 0 cancellation
Warning: row 2 (name "c2") contains 1 repeated variable(s) in constraints, grouped them, 0 cancellation
Warning: row 3 (name "c3") contains 1 repeated variable(s) in constraints, grouped them, 0 cancellation
Warning: row 4 (name "c4") contains 1 repeated variable(s) in constraints, grouped them, 0 cancellation
Warning: lp file contains 5 repeated variable(s) in constraints, grouped them, 0 cancellation
...

HiGHS cannot solve this model at all; I aborted the run after 10 minutes (Gurobi solves it in less than a second).

This equivalent problem is solved by both Gurobi and HiGHS in 100-200 ms without any warnings:

N = 5
m = linopy.Model()
var = m.add_variables(
    name="var", lower=xr.DataArray(np.zeros(N), dims="x", coords={"x": np.arange(N)})
)

m.add_constraints((0.5 * var) == 21.0)
m.add_objective(var.sum())
m.solve(solver_name="highs", keep_files=True)

So instead of having the coefficients 1 and -0.5, a preprocessing step would need to compute 0.5 as the single coefficient for var.

At the moment, both coefficients (1 and -0.5) are kept as separate entries.
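
For illustration, the merging step could look roughly like this (a plain-numpy sketch, not linopy internals; the labels and values are made up):

import numpy as np

def merge_duplicate_terms(var_labels, coeffs):
    # sum the coefficients of repeated variable labels so each variable
    # appears only once per constraint row
    unique_labels, inverse = np.unique(var_labels, return_inverse=True)
    merged = np.zeros(len(unique_labels))
    np.add.at(merged, inverse, coeffs)
    return unique_labels, merged

# the terms "1.0 * var - 0.5 * var" collapse to "0.5 * var"
merge_duplicate_terms(np.array([3, 3]), np.array([1.0, -0.5]))
# -> (array([3]), array([0.5]))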

constants as part of a linear expression?

More of a question than a feature request, possibly it's complicated:

Linear expressions are usually a sum over variables with prefactors, plus a constant. That constant needs to go on the RHS in the usual constraint formulation, which is probably the reason why the LinearExpression object does not support constants. However, if I wanted to make use of the nice algebra of the LinearExpression object for, say, "1 - x", I would have to separate off the constant early. In power systems I might want to express something like "20% of headroom" = 0.2 * (p_nom - p).

My hacky solution would be to create a neutral variable fixed to 1 early on and work with that; that should be fine. Could that be an option overall, say referencing a variable index of -42 or something, with logic that sorts it over to the RHS when creating a constraint?
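
For illustration, a minimal sketch of that "neutral variable" workaround (just a sketch, not an endorsed linopy feature; names are illustrative):

import linopy

m = linopy.Model()
x = m.add_variables(lower=0, name="x")

# a variable pinned to 1 that can carry constant terms inside an expression
one = m.add_variables(lower=1, upper=1, name="one")

# "1 - x" expressed as a pure linear expression
m.add_constraints(1 * one - 1 * x >= 0, name="headroom")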

Good practice when creating constraints

I am considering how to structure my code into modular blocks so that I can better test each aspect of model creation using linopy.

At present, I create a whole bunch of Variables and assign each Variable to a Python variable, e.g.

new_capacity = m.add_variables(lower=0, upper=inf, coords=coords, name='NewCapacity', integer=False)

Ideally, I'd like to put all the variable creation in a function:

def add_variables(m: Model) -> Model:
    m.add_variables(lower=0, upper=inf, coords=coords, name='NewCapacity', integer=False)
    return m

Now I can access my variables like so: m['NewCapacity'].

When adding constraints, is there a preference between the following alternatives?

First with Variables stored in variables:

accumulated_new_capacity = m['AccumulatedNewCapacity']
new_capacity = m['NewCapacity']
def bounds(model, r, t, y):
    return accumulated_new_capacity[r,t,y] - 1 * \
                sum(new_capacity[r,t,yy] for yy in ds.coords['YEAR'].values 
                    if (y-yy >= 0) and (y-yy < ds['OperationalLife'].sel({'REGION':r, 'TECHNOLOGY': t}))) == 0
    
c_aa1_total_new_capacity = m.add_constraints(bounds, coords=RTeY, name='CAa1_TotalNewCapacity')

versus the second option, accessing the Variables inside the function:

def bounds(model, r, t, y):
    return m['AccumulatedNewCapacity'][r,t,y] - 1 * \
                sum(m['NewCapacity'][r,t,yy] for yy in ds.coords['YEAR'].values 
                    if (y-yy >= 0) and (y-yy < ds['OperationalLife'].sel({'REGION':r, 'TECHNOLOGY': t}))) == 0
    
c_aa1_total_new_capacity = m.add_constraints(bounds, coords=RTeY, name='CAa1_TotalNewCapacity')

Or is there no difference that is relevant?

Cannot run example (Command line tool 'which' not available on Windows)

Hi,

first of all, thanks for this very promising-looking package! However, I find myself unable to run the example, as I get the following error:

from linopy import Model

---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
<ipython-input-1-91f35b98b0bb> in <module>
----> 1 from linopy import Model

~\anaconda3\lib\site-packages\linopy\__init__.py in <module>
      7 """
      8 
----> 9 from linopy import model
     10 from linopy.expressions import merge
     11 from linopy.io import read_netcdf

~\anaconda3\lib\site-packages\linopy\model.py in <module>
     18 from xarray import DataArray, Dataset
     19 
---> 20 from linopy import solvers
     21 from linopy.common import best_int, replace_by_map
     22 from linopy.constraints import Constraints

~\anaconda3\lib\site-packages\linopy\solvers.py in <module>
     18 
     19 
---> 20 if sub.run(["which", "glpsol"], stdout=sub.DEVNULL).returncode == 0:
     21     available_solvers.append("glpk")
     22 

~\anaconda3\lib\subprocess.py in run(input, capture_output, timeout, check, *popenargs, **kwargs)
    487         kwargs['stderr'] = PIPE
    488 
--> 489     with Popen(*popenargs, **kwargs) as process:
    490         try:
    491             stdout, stderr = process.communicate(input, timeout=timeout)

~\anaconda3\lib\subprocess.py in __init__(self, args, bufsize, executable, stdin, stdout, stderr, preexec_fn, close_fds, shell, cwd, env, universal_newlines, startupinfo, creationflags, restore_signals, start_new_session, pass_fds, encoding, errors, text)
    852                             encoding=encoding, errors=errors)
    853 
--> 854             self._execute_child(args, executable, preexec_fn, close_fds,
    855                                 pass_fds, cwd, env,
    856                                 startupinfo, creationflags, shell,

~\anaconda3\lib\subprocess.py in _execute_child(self, args, executable, preexec_fn, close_fds, pass_fds, cwd, env, startupinfo, creationflags, shell, p2cread, p2cwrite, c2pread, c2pwrite, errread, errwrite, unused_restore_signals, unused_start_new_session)
   1305             # Start the process
   1306             try:
-> 1307                 hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
   1308                                          # no special security
   1309                                          None, None,

FileNotFoundError: [WinError 2] Das System kann die angegebene Datei nicht finden (The system cannot find the file specified)

I'm using python via anaconda on Windows 11.

Best regards
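
For what it's worth, a cross-platform way to check for a solver binary without relying on the Unix-only "which" command would be shutil.which (a sketch, not linopy's actual code):

import shutil

available_solvers = []
# shutil.which returns None when the executable is not on PATH, on any OS
if shutil.which("glpsol") is not None:
    available_solvers.append("glpk")
if shutil.which("cbc") is not None:
    available_solvers.append("cbc")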

Solving time

linopy version: 0.0.15
Gurobi Optimizer version: 10.0.0

Describe the bug

I was solving an energy model with 10 nodes, 744 time steps; gas, solar, wind onshore and wind offshore production; battery and hydrogen storage.

The information returned in the console about the solving time is:
"Barrier solved model in 77 iterations and 4.07 seconds (4.14 work units)
Optimal objective 9.06616245e+09
[...]
Solved in 47816 iterations and 4.59 seconds (5.24 work units)
Optimal objective 9.066162454e+09"

But when I measure the real solving time by calling time.time() on the line just before and the line just after model.solve(), I obtain 4442 seconds, which is very different from the information returned in the console.

I also tried with 24, 120, 240 and 480 time steps, and had the same problem.
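
For reference, the wall-clock measurement was presumably taken roughly like this (a sketch; solver name as in the log below):

import time

start = time.time()
m.solve(solver_name="gurobi")
print(f"wall-clock time around solve(): {time.time() - start:.0f} s")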

Solving information

Writing constraints.: 100%|██████████| 10425/10425 [01:02<00:00, 166.71it/s]
Writing variables.: 100%|██████████| 8/8 [00:00<00:00, 110.25it/s]
Read LP format model from file C:\Users\Eva\AppData\Local\Temp\linopy-problem-utbena5t.lp
Reading time = 0.21 seconds
obj: 126481 rows, 91595 columns, 367979 nonzeros
Gurobi Optimizer version 10.0.0 build v10.0.0rc2 (win64)

CPU model: 12th Gen Intel(R) Core(TM) i9-12900K, instruction set [SSE2|AVX|AVX2]
Thread count: 16 physical cores, 24 logical processors, using up to 24 threads

Optimize a model with 126481 rows, 91595 columns and 367979 nonzeros
Model fingerprint: 0xaa3f369a
Coefficient statistics:
Matrix range [1e-06, 2e+02]
Objective range [1e-02, 1e+05]
Bounds range [4e+02, 1e+07]
RHS range [6e+02, 6e+07]
Presolve removed 7713 rows and 7715 columns
Presolve time: 0.15s
Presolved: 118768 rows, 83880 columns, 354826 nonzeros

Concurrent LP optimizer: primal simplex, dual simplex, and barrier
Showing barrier log only...

Ordering time: 0.01s

Barrier statistics:
Dense cols : 81
Free vars : 6695
AA' NZ : 3.834e+05
Factor NZ : 1.991e+06 (roughly 100 MB of memory)
Factor Ops : 1.134e+08 (less than 1 second per iteration)
Threads : 14

Objective Residual
Iter Primal Dual Primal Dual Compl Time
0 2.05908200e+14 -3.84291651e+15 6.96e+10 4.15e+03 5.26e+12 0s
1 2.37145030e+14 -2.07731861e+15 5.70e+10 2.37e+05 2.48e+12 0s
2 1.39281994e+14 -1.68120405e+15 3.02e+10 8.45e+04 1.17e+12 0s
3 7.48499079e+13 -1.75250575e+15 1.44e+10 4.79e+04 6.36e+11 0s
4 3.63519649e+13 -1.80307024e+15 4.90e+09 1.02e+04 1.71e+11 0s
5 2.55046947e+13 -1.28282703e+15 2.44e+09 4.28e+03 8.11e+10 1s
6 2.08641032e+13 -9.86927082e+14 1.45e+09 2.37e+03 4.72e+10 1s
7 1.79799724e+13 -6.62212421e+14 8.66e+08 9.19e+02 2.36e+10 1s
8 1.62996655e+13 -5.56145461e+14 5.71e+08 6.13e+02 1.58e+10 1s
9 1.48920139e+13 -4.10745223e+14 3.43e+08 2.68e+02 8.71e+09 1s
10 1.27036089e+13 -2.36710884e+14 5.17e+07 7.19e+01 2.35e+09 1s
11 1.08221505e+13 -2.72790232e+13 1.94e+07 4.03e+00 3.41e+08 1s
12 7.81385424e+12 -1.25459861e+13 7.89e+06 1.70e+00 1.56e+08 1s
13 4.64090193e+12 -9.21035431e+12 1.41e+06 1.26e+00 8.41e+07 1s
14 3.66071977e+12 -7.80788460e+12 9.67e+05 2.21e+00 6.75e+07 1s
15 2.15706917e+12 -2.33085366e+12 3.45e+05 3.11e+00 2.49e+07 1s
16 1.25829759e+12 -1.34067622e+12 1.44e+05 1.40e+00 1.40e+07 1s
17 5.96802858e+11 -6.02885956e+11 5.01e+04 1.09e+00 6.36e+06 1s
18 3.09261641e+11 -1.91234391e+11 3.61e+03 4.14e-01 2.63e+06 1s
19 1.95736031e+11 -1.03025108e+11 4.22e+00 2.24e-01 1.57e+06 1s
20 1.27169044e+11 -7.91083101e+10 2.62e+00 3.63e-01 1.08e+06 1s
21 7.53853579e+10 -2.61494560e+10 1.44e+00 3.89e-01 5.32e+05 1s
22 5.26230399e+10 -6.21750537e+09 9.20e-01 3.65e-01 3.08e+05 1s
23 3.10837474e+10 1.47160160e+09 3.70e-01 2.20e-01 1.55e+05 1s
24 1.77664833e+10 2.31139980e+09 1.35e-01 8.76e-02 8.08e+04 1s
25 1.38019323e+10 5.99811191e+09 7.58e-02 3.60e-02 4.08e+04 1s
26 1.17194986e+10 6.98805479e+09 4.39e-02 2.79e-02 2.47e+04 1s
27 1.08168370e+10 7.57637732e+09 2.98e-02 6.76e-02 1.69e+04 1s
28 1.05474688e+10 7.79530339e+09 2.54e-02 5.62e-02 1.44e+04 1s
29 1.02049470e+10 7.98688715e+09 2.01e-02 4.73e-02 1.16e+04 1s
30 1.01387929e+10 8.07818319e+09 1.89e-02 4.35e-02 1.08e+04 1s
31 1.00498951e+10 8.25265912e+09 1.70e-02 3.60e-02 9.40e+03 2s
32 9.90306459e+09 8.39870399e+09 1.44e-02 4.86e-02 7.87e+03 2s
33 9.83678361e+09 8.44772011e+09 1.31e-02 4.52e-02 7.26e+03 2s
34 9.78729955e+09 8.50823320e+09 1.22e-02 4.19e-02 6.69e+03 2s
35 9.68576617e+09 8.66392022e+09 1.03e-02 3.33e-02 5.34e+03 2s
36 9.52086005e+09 8.74498175e+09 7.33e-03 2.71e-02 4.06e+03 2s
37 9.39270813e+09 8.81342891e+09 5.24e-03 2.11e-02 3.03e+03 2s
38 9.31977537e+09 8.89093165e+09 4.05e-03 1.42e-02 2.24e+03 2s
39 9.28747268e+09 8.93515235e+09 3.52e-03 1.50e-02 1.84e+03 2s
40 9.19958341e+09 8.95987749e+09 2.13e-03 1.39e-02 1.25e+03 2s
41 9.18346274e+09 9.00597495e+09 1.88e-03 7.92e-03 9.28e+02 2s
42 9.15370864e+09 9.01542616e+09 1.40e-03 5.50e-03 7.23e+02 2s
43 9.13921012e+09 9.02475199e+09 1.17e-03 6.53e-03 5.99e+02 2s
44 9.12361713e+09 9.03483949e+09 9.28e-04 6.32e-03 4.64e+02 2s
45 9.10934181e+09 9.04070645e+09 7.15e-04 5.89e-03 3.59e+02 2s
46 9.10230029e+09 9.04544674e+09 5.92e-04 5.14e-03 2.97e+02 2s
47 9.09216631e+09 9.04981166e+09 4.21e-04 4.29e-03 2.21e+02 2s
48 9.09061697e+09 9.05313908e+09 3.97e-04 3.69e-03 1.96e+02 2s
49 9.08751203e+09 9.05537159e+09 3.45e-04 3.11e-03 1.68e+02 2s
50 9.08094569e+09 9.05785298e+09 2.36e-04 2.60e-03 1.21e+02 2s
51 9.07781389e+09 9.06134599e+09 1.85e-04 2.08e-03 8.61e+01 2s
52 9.07379618e+09 9.06239848e+09 1.26e-04 1.89e-03 5.96e+01 2s
53 9.07197046e+09 9.06312530e+09 9.57e-05 1.65e-03 4.63e+01 2s
54 9.07025077e+09 9.06423894e+09 6.54e-05 1.27e-03 3.14e+01 2s
55 9.06867877e+09 9.06500298e+09 9.89e-04 9.11e-04 1.92e+01 2s
56 9.06777841e+09 9.06548237e+09 3.38e-03 5.81e-04 1.20e+01 3s
57 9.06723025e+09 9.06563339e+09 4.55e-03 4.88e-04 8.35e+00 3s
58 9.06704033e+09 9.06573719e+09 3.70e-03 3.80e-04 6.82e+00 3s
59 9.06692864e+09 9.06581696e+09 3.30e-03 2.92e-04 5.82e+00 3s
60 9.06667971e+09 9.06594734e+09 2.32e-03 3.39e-04 3.83e+00 3s
61 9.06658607e+09 9.06598391e+09 1.88e-03 3.72e-04 3.15e+00 3s
62 9.06644750e+09 9.06602358e+09 1.25e-03 4.01e-04 2.22e+00 3s
63 9.06636073e+09 9.06608835e+09 8.89e-04 4.11e-04 1.43e+00 3s
64 9.06629124e+09 9.06610711e+09 6.18e-04 4.12e-04 9.63e-01 3s
65 9.06624721e+09 9.06612531e+09 4.15e-04 3.51e-04 6.38e-01 3s
66 9.06622989e+09 9.06613225e+09 3.26e-04 3.04e-04 5.11e-01 3s
67 9.06621536e+09 9.06613682e+09 2.52e-04 2.65e-04 4.11e-01 3s
68 9.06619440e+09 9.06614879e+09 1.48e-04 1.61e-04 2.39e-01 3s
69 9.06618170e+09 9.06615589e+09 8.75e-05 2.05e-04 1.35e-01 3s
70 9.06617465e+09 9.06615858e+09 5.45e-05 2.31e-04 8.41e-02 3s
71 9.06616944e+09 9.06616019e+09 3.05e-05 2.51e-04 4.84e-02 4s
72 9.06616482e+09 9.06616151e+09 9.28e-06 2.50e-04 1.73e-02 4s
73 9.06616330e+09 9.06616218e+09 3.03e-06 2.09e-04 5.85e-03 4s
74 9.06616274e+09 9.06616242e+09 9.11e-07 1.58e-04 1.69e-03 4s
75 9.06616248e+09 9.06616245e+09 5.20e-08 1.71e-04 1.66e-04 4s
76 9.06616245e+09 9.06616246e+09 2.44e-07 1.80e-04 7.73e-07 4s
77 9.06616245e+09 9.06616246e+09 1.25e-08 1.83e-04 2.84e-10 4s

Barrier solved model in 77 iterations and 4.07 seconds (4.14 work units)
Optimal objective 9.06616245e+09

Crossover log...

60897 DPushes remaining with DInf 2.3298577e-03 4s
0 DPushes remaining with DInf 2.3298577e-03 4s
Warning: Markowitz tolerance tightened to 0.5

5854 PPushes remaining with PInf 0.0000000e+00 4s
0 PPushes remaining with PInf 0.0000000e+00 4s

Push phase complete: Pinf 0.0000000e+00, Dinf 3.6874893e-10 4s

Solved with barrier
Iteration Objective Primal Inf. Dual Inf. Time
47816 9.0661625e+09 0.000000e+00 0.000000e+00 5s

Solved in 47816 iterations and 4.59 seconds (5.24 work units)
Optimal objective 9.066162454e+09

Adding constraints with Linopy Functionality

Checklist

  • Using latest PyPSA version 0.22.1

Describe the Bug

Using add_constraints() to manually add ramping constraints to a link causes an error. It only occurs after such a constraint has been added and an additional one is then added to a different link defined at the same snapshot. Below is code to reproduce this error. It seems this is related to the link attribute of "p" (Link-p) not being listed as a coordinate when calling "m.constraints".

A solution to this error would be helpful and appreciated. I am not too sure if I am misusing this feature ("add_constraints()") or if it's a bug. Thanks in advance.

Code

import pypsa

load_data = [10, 10]
e_fill = 10000  # fill the Natural Gas (NG) store

# setup the network
network = pypsa.Network()
network.set_snapshots(range(len(load_data)))

# Add carriers
network.add("Carrier", "NG", co2_emissions=1.0)

# Add buses
network.add("Bus", "elc", carrier="AC") 
network.add("Bus", "NG Bus", carrier="NG")

# Add loads
network.add("Load", "elc demand", bus="elc", p_set=load_data)


# Add NG Store
network.add(class_name="Store",
            name="NG store",
            bus="NG Bus",
            e_nom=e_fill,
            e_initial=e_fill,
)
# Add 2 links
for i in [0,1]:
    network.add(class_name="Link",
                name=f"PP {i}",
                bus0="NG Bus",
                bus1="elc",
                carrier="NG",
                p_nom=10,
                efficiency=1,
                marginal_cost=1
                )

# create model
m = network.optimize.create_model()

# manual ramp constraint 1:
m.add_constraints(m.variables["Link-p"].loc[1,'PP 0'] - m.variables["Link-p"].loc[0,'PP 0'] <= 123 , name="ramp_tester1")

# Error inducing ramp constraint 2:
m.add_constraints(m.variables["Link-p"].loc[1,'PP 1'] - m.variables["Link-p"].loc[0,'PP 1'] <= 456 , name="ramp_tester2")

Error Message

MergeError                                Traceback (most recent call last)
Cell In[1], line 49
     46 m.add_constraints(m.variables["Link-p"].loc[1,'PP 0'] - m.variables["Link-p"].loc[0,'PP 0'] <= 123 , name="ramp_tester1")
     48 # Error inducing ramp constraint 2:
---> 49 m.add_constraints(m.variables["Link-p"].loc[1,'PP 1'] - m.variables["Link-p"].loc[0,'PP 1'] <= 456 , name="ramp_tester2")

File [~/anaconda3/envs/bc-power/lib/python3.10/site-packages/linopy/model.py:617](https://vscode-remote+wsl-002bubuntu.vscode-resource.vscode-cdn.net/mnt/c/Users/pmcw9/Delta-E/PICS/PyPSA_BC/notebooks/~/anaconda3/envs/bc-power/lib/python3.10/site-packages/linopy/model.py:617), in Model.add_constraints(self, lhs, sign, rhs, name, coords, mask)
    614     rhs = rhs.chunk(self.chunk)
    615     labels = labels.chunk(self.chunk)
--> 617 self.constraints.add(name, labels, lhs.coeffs, lhs.vars, sign, rhs)
    619 return self.constraints[name]

File [~/anaconda3/envs/bc-power/lib/python3.10/site-packages/linopy/constraints.py:411](https://vscode-remote+wsl-002bubuntu.vscode-resource.vscode-cdn.net/mnt/c/Users/pmcw9/Delta-E/PICS/PyPSA_BC/notebooks/~/anaconda3/envs/bc-power/lib/python3.10/site-packages/linopy/constraints.py:411), in Constraints.add(self, name, labels, coeffs, vars, sign, rhs)
    399 def add(
    400     self,
    401     name,
   (...)
    406     rhs: DataArray,
    407 ):
    408     """
    409     Add constraint `name`.
    410     """
--> 411     self._merge_inplace("labels", labels, name, fill_value=-1)
    412     self._merge_inplace("coeffs", coeffs, name)
...
    148     )
    150 if combine_method:
    151     for var in variables[1:]:

MergeError: conflicting values for variable 'Link' on objects to be combined. You can skip this check by specifying compat='override'.

Add support of xarray's `diff` function to Variable class

The class linopy.variables.Variable inherits methods from its parent xarray.DataArray. Some of these methods seem to lead to wrong optimization results.

Example:

import linopy
import numpy as np
import xarray as xr

m = linopy.Model()
var = m.add_variables(name="var", lower=xr.DataArray(np.zeros(5), dims='x', coords={'x': np.arange(5)}))
m.add_constraints(1. * var.diff(dim='x') == 1.)
m.add_objective(var.sum())

Solving using Gurobi gives the solution: var: 0.0 1.0 0.0 0.0 0.0.
The expected solution would be: var: 0.0 1.0 2.0 3.0 4.0

Support MIP

As far as I can see in the docs and code, there is no support for integer programming; only continuous and binary variables are supported. Is there a plan to add integer variables soon?

Thanks for the great project!

Relicense to MIT?

Like we have done for the other packages. It allows broader usage!

Conditional sums in constraints

Problem

I cannot work out how to implement a conditional sum using linopy and xarray syntax. One key constraint in OSeMOSYS requires the cumulative sum of new capacity as shown in the GNU MathProg syntax below:

s.t. CAa1_TotalNewCapacity{r in REGION, t in TECHNOLOGY, y in YEAR}:
	AccumulatedNewCapacity[r,t,y]
	=
	sum{yy in YEAR: y-yy < OperationalLife[r,t] && y - yy >= 0} NewCapacity[r,t,yy];

The Pyomo implementation of this constraint looks like this:

def TotalNewCapacity_1_rule(model, r, t, y):
    return model.AccumulatedNewCapacity[r, t, y] == sum(
        model.NewCapacity[r, t, yy]
        for yy in model.YEAR
        if ((y - yy < model.OperationalLife[r, t]) and (y - yy >= 0))
    )
model.TotalNewCapacity_1 = Constraint(
    model.REGION, model.TECHNOLOGY, model.YEAR, rule=TotalNewCapacity_1_rule
)

And here's the mathematical formulation from the 2011 paper:

$$ ANC_{r,t,y} = \sum_{\substack{yy \,\in\, YEAR:\\ y - yy < OL_{r,t},\; y - yy \ge 0}} NC_{r,t,yy} \quad \forall\, r \in REGION,\ t \in TECHNOLOGY,\ y \in YEAR $$

Context

Hi! Many thanks for the great library. I have been exploring porting OSeMOSYS over to use Linopy, and am struggling with the xarray/linopy syntax for the more complicated constraints. If you could help me figure out the syntax, I'd be happy to contribute an example Jupyter Notebook for the documentation.

Minimal working example

import xarray as xy
from linopy import Model
import numpy as np

data_vars = {
'OperationalLife': xy.DataArray(data=np.array([[2.0]]), dims=['REGION', 'TECHNOLOGY'], 
                                coords={'REGION': ['SIMPLICITY'], 'TECHNOLOGY': ['TD2']}, 
                                name='OperationalLife')
}
coords = {
'REGION': ['SIMPLICITY'],
'TECHNOLOGY': ['TD2'],
'YEAR': [2010, 2011, 2012, 2013]
}

# Dataset containing all the parameters read in from an OSeMOSYS file
ds = xy.Dataset(data_vars=data_vars, coords=coords)

m = Model(force_dim_names=True)

RTeY = [ds.coords['REGION'], ds.coords['TECHNOLOGY'], ds.coords['YEAR']]
accumulated_new_capacity = m.add_variables(lower=0, upper=np.inf, coords=RTeY, name='AccumulatedNewCapacity', integer=False)
new_capacity = m.add_variables(lower=0, upper=np.inf, coords=RTeY, name='NewCapacity', integer=False)
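
For reference, one mask-based formulation that avoids explicit Python loops, sketched along the lines of the capacity_adequacy_a function shown in a later issue on this page (a sketch, not necessarily the canonical linopy answer):

# rename the summation dimension of NewCapacity so it broadcasts against YEAR,
# mask the (YEAR, BUILDYEAR) pairs that fall inside the operational life,
# and sum the masked variable over BUILDYEAR
new_cap = new_capacity.rename(YEAR="BUILDYEAR")
mask = (ds.YEAR - new_cap.data.BUILDYEAR >= 0) & (ds.YEAR - new_cap.data.BUILDYEAR < ds.OperationalLife)
con = accumulated_new_capacity - new_cap.where(mask).sum("BUILDYEAR") == 0
m.add_constraints(con, name="CAa1_TotalNewCapacity")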

Improve readability of compute_feasibility()

At the moment, this function tells you the name of the infeasible constraints and the corresponding variables. The constraint names are given as e.g. "c0" for the first constraint. It would be nice if you could quickly check which constraint this corresponds to in the model formulation, either by indexing or by directly printing the given constraint name in the compute_feasibility() method.

Any plan to add SCIP as a solver?

SCIP and its Python library pyscipopt also support MILP problems with some configuration. Do you have any plans to add it to the list of solvers? As an open-source solver, I think SCIP may be better than CBC and GLPK in many cases.

import pyscipopt as pso
from pyscipopt import Model
model = Model("prod")
# In order to get dual, needs the following configs
model.setPresolve(pso.SCIP_PARAMSETTING.OFF)
model.setHeuristics(pso.SCIP_PARAMSETTING.OFF)
model.disablePropagation()
# LP algorithm for solving initial LP relaxations (automatic 's'implex, 'p'rimal simplex, 'd'ual simplex, 'b'arrier, barrier with 'c'rossover)
model.setCharParam("lp/initalgorithm","p")
model.setParam('limits/time', 600)


lmp = model.getDualsolLinear(power_balance_cons[bus, ti])
angle = model.getVal(bus_angle[bus, ti])

Requirements for solver use

Hi there,

First off, the package is looking great. Thanks very much for all your efforts.

The documentation states that solvers must be installed by the user, but I wonder if it is worth adding what that entails for each solver. For example, based on the lines below, it's clear that using HiGHS/Gurobi not only requires installation of the solver, but also their corresponding python packages.

These packages are currently extras in setup.py, so if the intention is to keep them optional, would it be preferable to document extra packages required for each solver?

linopy/linopy/solvers.py

Lines 22 to 55 in a7f472b

if sub.run([which, "glpsol"], stdout=sub.DEVNULL).returncode == 0:
    available_solvers.append("glpk")
if sub.run([which, "cbc"], stdout=sub.DEVNULL).returncode == 0:
    available_solvers.append("cbc")
try:
    import gurobipy

    available_solvers.append("gurobi")
except (ModuleNotFoundError, ImportError):
    pass
try:
    import highspy

    available_solvers.append("highs")
except (ModuleNotFoundError, ImportError):
    pass
try:
    import cplex

    available_solvers.append("cplex")
except (ModuleNotFoundError, ImportError):
    pass
try:
    import xpress

    available_solvers.append("xpress")
except (ModuleNotFoundError, ImportError):
    pass

Broken installation with solvers option

Hi there - installation with solvers option seems broken. Specifically, running this (as suggested in the docs):

pip install linopy[solvers]

results in an old version of linopy being installed (v0.0.13), which cannot be imported (e.g. from linopy import Model raises an error related to dataclasses).

Installing linopy normally (pip install linopy) results in v0.1.5 being installed, and installing solvers individually/manually works fine, so I think it is specifically the installation with the solvers option that is playing up.

Linopy v0.0.10 breaks with xarray v2022.06

Due to some newly introduced bugs in xarray, the most recent linopy version v0.0.10 breaks with xarray v2022.06. We recommend downgrading xarray to v2022.03 if you encounter problems.
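
A possible way to pin the working version (the exact PyPI version specifier is an assumption based on the release naming above):

pip install "xarray==2022.3.0"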

Shuffled constraints and variables causing infeasibilities

Dear linopy team,

I am currently using linopy 0.1.4 with PyPSA 0.22.1

While trying to solve a small optimisation model via PyPSA, I encountered some infeasibilities on a model which shouldn't have any. The issue happens in the specific case where I combine unit commitment, ramp constraints and rolling horizon - the first iteration always goes well, and the second is found infeasible.

I tried to dig into the PyPSA and linopy code and it seems like the constraints are well-written all along the way, except at the very end during the export to the LP file. In the LP file, right- and left-hand terms are mixed, grouping parts of constraints that are not related, hence causing the infeasibilities.
When rolling horizon is disabled, it does not lead to infeasibilities but constraints are still not written properly...
However, the to_file linopy function is a bit beyond my capabilities, and it is likely I missed something upstream, too.

Here is the code used for the tests, the logs, and the dataset. Thanks a lot for your help!

n = pypsa.Network()
n.import_from_csv_folder("main/test")

n_periods = 1000
for i in range(n_periods):
    print("Iteration " + str(i+1) + " over " + str(n_periods))
    begin = i * int(len(n.snapshots) / n_periods)
    end = min((i + 1) * int(len(n.snapshots) / n_periods + 1) + 2, len(n.snapshots))

    m = n.optimize.create_model(snapshots=n.snapshots[begin:end], linearized_unit_commitment=False)

    m.to_file("pypsa.lp")
    n.optimize.solve_model(solver_name='gurobi')

    try:
        se = m.compute_set_of_infeasible_constraints()
        print(se)
    except:
        print("worked")
IIS runtime: 0.00 seconds (0.00 work units)
 c112: x8 <= 0
 c113: x1 <= 10000
 c118: x4 <= 10000
 c294: x0 <= 10000
 c295: x2 <= 5000
 c296: x3 <= 5000
 c297: x5 <= 7500
 c298: x7 <= 7500
 c299: x6 <= 7500
 c574: x0 + x1 + x2 + x3 + x4 + x5 + x6 + x7 + x8 = 200000

Dataset: [test (1).zip](https://github.com/PyPSA/linopy/files/11145974/test.1.zip)

Documentation on model debugging

I have been unable to find documentation specific to diagnosing issues with models.

I have found a few useful helper functions, such as model.variables.get_name_by_label() which returns the mapping from the numeric variable number found in the LP file to the variable name of the Python object.
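
A minimal usage sketch of that helper (the label value and returned name are illustrative):

# map the integer label that appears in the LP file (e.g. x0) back to the
# Python-side variable name it belongs to
name = model.variables.get_name_by_label(0)   # e.g. 'NewCapacity'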

Are there any other useful functions I should know about to help diagnose problems with an LP formulated in linopy (collating ideas here for a section in the documentation)?

Some ideas:

  • Force writing out full variable names to LP generation to aid debugging
  • Tutorial to demonstrate visualisation of the constraints, matrix, coefficient ranges etc.

Constraint with binary variable not being respected?

Hi there,

I'm comparing a few packages for solving a MILP with binary variables: https://github.com/prakaa/Battery-Optimisation-Benchmarking

I have formulated the problem to respect linopy's requirements (e.g. all variables on LHS, minimisation only) and have been able to find an appropriate solution in JuMP, pyomo and python-mip. However the solution does not appear to be right for linopy: https://github.com/prakaa/Battery-Optimisation-Benchmarking/blob/master/battery_optimisation_benchmarking/python/linopy.ipynb

Specifically, there is a pair of constraints containing a binary variable ("charge state") that should ensure that the modelled battery cannot discharge and charge at the same time. This appears to be respected in the other packages (making me think the model formulation is OK), but does not appear to be respected in linopy.

Have I made a mistake in formulating the model using linopy, or is this a bug?

Precision of floats in lp files

Hello,

I am currently migrating a package from pyomo to linopy (great work btw). When comparing results from the old version to the results with linopy, I noticed that the results always match pretty well but not exactly, with differences up to around ~1% for complex models.

I came to realize that the main issue was the precision with which the LP files are written. From what I understand, all of the float-to-string conversion happens in

linopy/linopy/io.py

Lines 38 to 46 in 62ac3ea

def float_to_str(arr, ensure_sign=True):
    """
    Convert numpy array to str typed array.
    """
    if ensure_sign:
        convert = np.frompyfunc(lambda f: "%+f" % f, 1, 1)
    else:
        convert = np.frompyfunc(lambda f: "%f" % f, 1, 1)
    return convert(arr)

Is there anything against upping the precision by default, e.g. from %f to %.12f and from %+f to %+.12f, respectively?
If not, it would be great if you could add this. Otherwise, I could create a PR where this is a new parameter that can be passed to the model.
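
A sketch of that proposed parameter (the parameter name is hypothetical, not linopy's API):

import numpy as np

def float_to_str(arr, ensure_sign=True, precision=12):
    """
    Convert numpy array to str typed array, with configurable decimal precision.
    """
    fmt = ("%+." if ensure_sign else "%.") + str(precision) + "f"
    convert = np.frompyfunc(lambda f: fmt % f, 1, 1)
    return convert(arr)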

Solver module import other errors

In solvers.py

try:
    import gurobipy

    available_solvers.append("gurobi")
except ModuleNotFoundError:
    pass


try:
    import cplex

    available_solvers.append("cplex")
except ModuleNotFoundError:
    pass

try:
    import xpress

    available_solvers.append("xpress")
except ModuleNotFoundError:
    pass

When a solver installation raises an error other than ModuleNotFoundError, importing linopy breaks.

E.g. I have a legacy cplex installation and want to use HiGHS, but the import always fails at the step of importing cplex.
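
A possible mitigation (a sketch, not the project's actual fix): catch any import-time error, so a broken solver installation merely removes that solver from the list instead of breaking import linopy:

available_solvers = []

try:
    import cplex  # noqa: F401
    available_solvers.append("cplex")
except Exception:
    # deliberately broad: a legacy or broken cplex install should not prevent
    # importing linopy; the solver is simply treated as unavailable
    pass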

Cannot run example

In examples/create-a-model.ipynb, when I run

m.add_constraints(3*x + 7*y, '>=', 10)
m.add_constraints(5*x + 2*y, '>=', 3);

The console shows:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-23-72c2b3a3f44b> in <module>
----> 1 m.add_constraints(3*x + 7*y, '>=', 10)
      2 m.add_constraints(5*x + 2*y, '>=', 3);

TypeError: add_constraints() missing 1 required positional argument: 'rhs'

Python version 3.7.9
pip install linopy
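
The error suggests the installed linopy version is outdated and has a different add_constraints signature (compare the broken-installation report above). With a current linopy version, the same constraints can also be written with comparison operators, as in other examples on this page (a sketch, assuming an up-to-date install and the x, y from the notebook):

m.add_constraints(3*x + 7*y >= 10)
m.add_constraints(5*x + 2*y >= 3)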

Second Order Cone

Hello,

I'm opening this issue to draw attention to SOCP and quadratic modelling, which may be highly interesting for some energy system applications, and I'm wondering if they have been considered as potential upgrades for linopy.

SOCP and QP are clearly non-linear, so they may not suit the purpose of the package, but they may be highly valuable for power systems. Moreover, many solvers now support these techniques, so I was wondering whether there is any plan to support them.

"Unit tests" for constraint and variable generation - advice and a feature request?

From a user perspective, what suggestions do you have regarding writing unit tests for models implemented in linopy? Are there any plans to improve the opportunities for comparing constraint and variable objects with pre-generated fixtures?

One key feature I'm looking for in a modelling framework is being able to test individual components and test the integration of these components into a complete optimisation model. I think this would make the process of developing and maintaining modelling frameworks considerably more robust and efficient. The task is particularly challenging because of the intertwined nature of constraints and variables in optimisation models. It is very difficult to effectively separate individual components. In practice, this means that testing is only possible with a complete and often complex model. Debugging can be tortuous because errors in constraint definitions, masks or conditional operators etc. do not manifest in an obvious way. There are degrees of freedom outside the control of the model developer - for example, solutions to an ill-posed optimisation problem can differ between solvers, be caused by floating point errors and solver tolerances etc.

From a readability perspective, it would be great to use in an assertion the printout you get returned from the constraint construction e.g.

[SIMPLICITY, HEATER, 2010]: 1.0 AccumulatedNewCapacity[SIMPLICITY, HEATER, 2010] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2010]

However, this is a little brittle, and could result in tests failing if e.g. the coefficient precision was updated or the ordering of terms in the constraint was changed. From a mathematical perspective, the order of the terms in the constraint does not matter, as long as they are all present and have the correct sign and coefficient.

Another option is to manually create the linopy.Constraint using a completely separate linopy.Model instance, and then assert that the actual and expected constraints are equal. Note that for many linopy objects __eq__ is overridden and doesn't allow direct comparison.

It would also be good to think about how to use Mock effectively (e.g. to avoid creating variables when testing a constraint).

Example

Here's my new constraint, encapsulated in a function I can call at run time. Now, I would like to write a unit test, so that I can check that the constraint is correctly generated under different conditions. I might like to check corner cases, such as if there is a technology with an OperationalLife of 0 or write a regression test to ensure that behaviour doesn't change as dependencies are updated.

def capacity_adequacy_a(ds, m):
    """Add capacity adequacy constraint to the model

    Arguments
    ---------
    ds: xarray.Dataset
        The parameters dataset
    m: linopy.Model
        A linopy model

    Returns
    -------
    linopy.Model
    """
    new_cap = m['NewCapacity'].rename(YEAR="BUILDYEAR")
    mask = (ds.YEAR - new_cap.data.BUILDYEAR >= 0) & (ds.YEAR - new_cap.data.BUILDYEAR < ds.OperationalLife)
    con = m['AccumulatedNewCapacity'] - new_cap.where(mask).sum("BUILDYEAR") == 0
    m.add_constraints(con, name='CAa1_TotalNewCapacity')
    return m

And here are some options I have explored for testing:

Fixtures

Some fixtures to set up the parameters needed to test the constraint creation:

import pytest
import numpy as np
import xarray as xr

@pytest.fixture()
def operational_life():
    data = np.empty((1, 1))
    data.fill(1)

    coords = {
        'REGION': ['SIMPLICITY'],
        'TECHNOLOGY': ['HEATER'],
    }
    dims = ['REGION', 'TECHNOLOGY']
    return xr.DataArray(data, coords, dims, name='OperationalLife')

@pytest.fixture()
def coords():
    return {
        '_REGION': ['SIMPLICITY'],
        'REGION': ['SIMPLICITY'],
        'TECHNOLOGY': ['HEATER'],
        'TIMESLICE': ['1'],
        'MODE_OF_OPERATION': [1],
        'FUEL': ['ELC', 'HEAT'],
        'YEAR': [2010]
        }

Tests

A now the test itself:

import pytest
from constraints import add_demand_constraints, capacity_adequacy_a
import xarray as xr
import numpy as np
from linopy import Model

class TestCapacityAdequacyA:

    @pytest.fixture()
    def dataset(self, coords, operational_life):
        data_vars = {
            'OperationalLife': operational_life
        }

        # Dataset containing all the parameters read in from an OSeMOSYS file
        ds = xr.Dataset(data_vars=data_vars, coords=coords)

        return ds

    def test_capacity_adequacy(self, dataset):
        """Vanilla test with one technology with an operational life of 1 and 1 year
        """

        model = Model(force_dim_names=True)

        RTeY = [dataset.coords['REGION'], dataset.coords['TECHNOLOGY'], dataset.coords['YEAR']]

        model.add_variables(lower=0, upper=np.inf, coords=RTeY, name='NewCapacity', integer=False)
        model.add_variables(lower=0, upper=np.inf, coords=RTeY, name='AccumulatedNewCapacity', integer=False)

        actual = capacity_adequacy_a(dataset, model).constraints

        assert model.variables.get_name_by_label(0) == 'NewCapacity'
        assert model.variables.get_name_by_label(1) == 'AccumulatedNewCapacity'

        # This is not very readable!
        assert actual.labels.CAa1_TotalNewCapacity.shape == (1, 1, 1)
        assert (actual['CAa1_TotalNewCapacity'].vars.values == [[[[1, 0]]]]).all()
        assert (actual['CAa1_TotalNewCapacity'].coeffs.values == [[[[1, -1]]]]).all()


    def test_capacity_adequacy_longer(self, dataset):
        """Vanilla test with one technology with operational life of 3 and 5 years
        """

        dataset = dataset

        dataset['YEAR'] = range(2010, 2015)
        dataset['OperationalLife'][0, 0] = 3

        model = Model(force_dim_names=True)

        RTeY = [dataset.coords['REGION'], dataset.coords['TECHNOLOGY'], dataset.coords['YEAR']]

        model.add_variables(lower=0, upper=np.inf, coords=RTeY, name='NewCapacity', integer=False)
        model.add_variables(lower=0, upper=np.inf, coords=RTeY, name='AccumulatedNewCapacity', integer=False)

        actual = capacity_adequacy_a(dataset, model).constraints

        assert model.nvars == 10

        # A utility function to check the number of variables of each type would be handy
        for var in range(0, 5):
            assert model.variables.get_name_by_label(var) == 'NewCapacity'
        for var in range(5, 10):
            assert model.variables.get_name_by_label(var) == 'AccumulatedNewCapacity'

        # This is fragile but the most readable
        expected = """LinearExpression (REGION: 1, TECHNOLOGY: 1, YEAR: 5):
-----------------------------------------------------
[SIMPLICITY, HEATER, 2010]: 1.0 AccumulatedNewCapacity[SIMPLICITY, HEATER, 2010] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2010]
[SIMPLICITY, HEATER, 2011]: 1.0 AccumulatedNewCapacity[SIMPLICITY, HEATER, 2011] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2010] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2011]
[SIMPLICITY, HEATER, 2012]: 1.0 AccumulatedNewCapacity[SIMPLICITY, HEATER, 2012] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2010] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2011] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2012]
[SIMPLICITY, HEATER, 2013]: 1.0 AccumulatedNewCapacity[SIMPLICITY, HEATER, 2013] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2011] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2012] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2013]
[SIMPLICITY, HEATER, 2014]: 1.0 AccumulatedNewCapacity[SIMPLICITY, HEATER, 2014] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2012] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2013] - 1.0 NewCapacity[SIMPLICITY, HEATER, 2014]"""
        assert str(actual['CAa1_TotalNewCapacity'].lhs) == expected

How to write 2-D constraints when the right-hand side is 1-D

I have a df_units dataframe and a df_time dataframe.

      is_uc  bus  p_min   p_max  ...  cold_start_cost  no_load_cost  shut_down_cost  cold_start_hours
G01a      1  B01  100.0  1000.0  ...            150.0         100.0           120.0               3.0
G01b      1  B01  100.0  1000.0  ...            150.0         100.0           120.0               3.0
G02       1  B02  100.0  1000.0  ...            150.0         100.0           120.0               3.0
G03       1  B03  100.0  1000.0  ...            150.0         100.0           120.0               3.0

I define a 2-D continuous variable unit_power and a 2-D binary variable bx.

    bx = m.add_variables(binary=True, coords=[df_time.index, df_units.index], name='bx' )
    unit_power = m.add_variables(lower=0, coords=[df_time.index, df_units.index], name='unit_power' )

I want to define constraints: for every time step, bx*unit_power must be greater than p_min and unit_power must be less than p_max.
But the dimensions are not the same.

What's the recommended way to write such constraints?
Do I need to broadcast the right-hand side into a matrix? That seems to require an extra amount of memory.
Or do I need to write loops?
Thanks!

m.add_constraints(bx*unit_power >= df_units_uc['p_min'], name='min_power_constraint')
m.add_constraints(unit_power * 1 <= df_units_uc['p_max'], name='max_power_constraint')
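
One possibility (a sketch; the index names 'time' and 'unit' are assumptions) is to rely on xarray broadcasting: wrap the 1-D limits in a DataArray whose only dimension is the unit dimension, and it is aligned against the 2-D variable without building the full matrix by hand:

import xarray as xr

# 1-D limit per unit, with the dimension named to match the variable's unit dimension
p_max = xr.DataArray(df_units['p_max'].values, coords={'unit': df_units.index}, dims='unit')

m.add_constraints(unit_power <= p_max, name='max_power_constraint')
# the same wrapping works for p_min in the minimum-power constraint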

Issue with repr when using masked variables

Sorry, I don't have a working example to reproduce this error, but it should be easy enough to do. The error seems to be caused by creating a masked variable. For example:

RTiFY = [ds.coords['REGION'], ds.coords['TIMESLICE'], ds.coords['FUEL'], ds.coords['YEAR']]
mask = (ds['SpecifiedAnnualDemand'] * ds['SpecifiedDemandProfile']).notnull()
m.add_variables(lower=0, upper=inf, coords=RTiFY, name='RateOfDemand', integer=False, mask=mask)
repr(m['RateOfDemand'])
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[26], line 1
----> 1 repr(m['RateOfDemand'])

File ~/repository/linopy/linopy/variables.py:247, in Variable.__repr__(self)
    244 trunc_string = "\n\t\t..." if i == split_at - 1 and truncate else ""
    246 if label != -1:
--> 247     vname, vcoord = self.model.variables.get_label_position(label)
    248     lower = variables[vname].lower
    249     lower = lower.sel(dictsel(vcoord, lower.dims)).item()

File ~/repository/linopy/linopy/variables.py:954, in Variables.get_label_position(self, values)
    951 start, stop = self.get_label_range(name)
    953 if value >= start and value < stop:
--> 954     index = np.unravel_index(value - start, labels.shape)
    956     # Extract the coordinates from the indices
    957     coord = {
    958         dim: labels.indexes[dim][i]
    959         for dim, i in zip(labels.dims, index)
    960     }

File <__array_function__ internals>:200, in unravel_index(*args, **kwargs)

ValueError: index 1224 is out of bounds for array with size 1224

disturbing output when importing linopy on windows without cbc/glpk as available_solvers

In solvers.py the lines 30&33 create unwanted(?) output:

if sub.run([which, "glpsol"], stdout=sub.DEVNULL).returncode == 0:
    available_solvers.append("glpk")

if sub.run([which, "cbc"], stdout=sub.DEVNULL).returncode == 0:
    available_solvers.append("cbc")

Every time I import linopy on Windows without cbc/glpk available, I get this message twice:
INFORMATION: Es konnten keine Dateien mit dem angegebenen Muster gefunden werden. (INFO: No files matching the given pattern could be found.)

adding "stderr=sub.STDOUT" like so removes this output:
if sub.run([which, "glpsol"], stdout=sub.DEVNULL, stderr=sub.STDOUT).returncode == 0:
    available_solvers.append("glpk")

I am not sure if this is what you want?

Highs dual values, wrong column

Hi,

A colleague found a small, easily fixable bug: in the highspy interface ("run_highs") the dual values of constraints are not correct. The column solution.row_value must be replaced by solution.row_dual.

Best,
Andreas

Add more complex example to the documentation

To showcase the full set of flexibility given by the data-handling through xarray, it would be good to create more complex examples. These should include operations like:

  • selection (sel / drop_sel)
  • where / masking
  • broadcast_like
  • diff
  • shift
  • fill (bfill/ffill)
  • groupby
  • rolling
  • operations on different axis

Support Quadratic terms

A new class QuadraticExpression could be added which follows the same scheme as LinearExpression but has the data fields

  • coeffs
  • vars1
  • vars2

Arithmetic operations with lhs have to be checked rigorously.

Variable / scalar or LinearExpression / scalar is broken

Just saw this example:

https://github.com/prakaa/Battery-Optimisation-Benchmarking/blob/master/battery_optimisation_benchmarking/python/linopy.ipynb

With the following soc constraint definition (slightly cleaned)

    # intertemporal energy balance constraints
    intertemp_soc_lhs = (
        soc_mwh
        - soc_mwh.shift(datetime=1)
        - charge_mw * charge_eff * tau
        + discharge_mw / discharge_eff * tau
    ).isel(datetime=slice(1, None))
    intertemp_soc = m.add_constraints(
        intertemp_soc_lhs, "=", 0, "intemporal-energy-balance"
    )

where the operation discharge_mw / discharge_eff bypasses the linopy definitions and directly divides the variable labels, without even raising an error.

For correct operation, the expression needs to be rearranged to:

    intertemp_soc_lhs = (
        soc_mwh
        - soc_mwh.shift(datetime=1)
        - charge_eff * tau * charge_mw
        + tau / discharge_eff * discharge_mw 
    ).isel(datetime=slice(1, None))

=> need implementations for __div__ and __truediv__ with a scalar on Variable and LinearExpression.

Highs solver interface does not work with latest version

Hi, I made a default model, using Highs 1.2.1 on both Win10 and Mac

import linopy
import pandas as pd
import xarray as xr

m = linopy.Model()
time = pd.Index(range(10), name='time')

x = m.add_variables(lower=0, coords=[time], name='x', )
y = m.add_variables(lower=0, coords=[time], name='y')

factor = pd.Series(time, index=time)

con1 = m.add_constraints(3*x + 7*y >= 10*factor, name='con1')
con2 = m.add_constraints(5*x + 2*y >= 3*factor, name='con2')

m.add_objective(x + 2*y)
m.solve('highs')

m.solution.to_dataframe().plot(grid=True, ylabel='Optimal Value')

On mac the error is:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/var/folders/nj/hhq5qdcx2hd8h_5kpp15lscw0000gn/T/ipykernel_82794/4115417935.py in <module>
     15 
     16 m.add_objective(x + 2*y)
---> 17 m.solve('highs')
     18 
     19 m.solution.to_dataframe().plot(grid=True, ylabel='Optimal Value')

~/miniconda3/lib/python3.9/site-packages/linopy/model.py in solve(self, solver_name, io_api, problem_fn, solution_fn, log_fn, basis_fn, warmstart_fn, keep_files, remote, **solver_options)
   1042         try:
   1043             func = getattr(solvers, f"run_{solver_name}")
-> 1044             res = func(
   1045                 self,
   1046                 io_api,

~/miniconda3/lib/python3.9/site-packages/linopy/solvers.py in run_highs(Model, io_api, problem_fn, solution_fn, log_fn, warmstart_fn, basis_fn, keep_files, **solver_options)
    369 
    370     dual = pd.read_fwf(io.BytesIO(dual))["Dual"]
--> 371     dual.index = Model.constraints.ravel("labels", filter_missings=True)
    372 
    373     return dict(

~/miniconda3/lib/python3.9/site-packages/pandas/core/generic.py in __setattr__(self, name, value)
   5498         try:
   5499             object.__getattribute__(self, name)
-> 5500             return object.__setattr__(self, name, value)
   5501         except AttributeError:
   5502             pass

~/miniconda3/lib/python3.9/site-packages/pandas/_libs/properties.pyx in pandas._libs.properties.AxisProperty.__set__()

~/miniconda3/lib/python3.9/site-packages/pandas/core/series.py in _set_axis(self, axis, labels, fastpath)
    557         if not fastpath:
    558             # The ensure_index call above ensures we have an Index object
--> 559             self._mgr.set_axis(axis, labels)
    560 
    561     # ndarray compatibility

~/miniconda3/lib/python3.9/site-packages/pandas/core/internals/managers.py in set_axis(self, axis, new_labels)
    214     def set_axis(self, axis: int, new_labels: Index) -> None:
    215         # Caller is responsible for ensuring we have an Index object.
--> 216         self._validate_set_axis(axis, new_labels)
    217         self.axes[axis] = new_labels
    218 

~/miniconda3/lib/python3.9/site-packages/pandas/core/internals/base.py in _validate_set_axis(self, axis, new_labels)
     55 
     56         elif new_len != old_len:
---> 57             raise ValueError(
     58                 f"Length mismatch: Expected axis has {old_len} elements, new "
     59                 f"values have {new_len} elements"

ValueError: Length mismatch: Expected axis has 22 elements, new values have 20 elements

On win10 the error is:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
D:\Tools\Miniconda3\envs\py37\lib\site-packages\pandas\core\indexes\base.py in astype(self, dtype, copy)
    912         try:
--> 913             casted = self._values.astype(dtype, copy=copy)
    914         except (TypeError, ValueError) as err:

ValueError: cannot convert float NaN to integer

The above exception was the direct cause of the following exception:

TypeError                                 Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_3220\3173557130.py in <module>
     15 
     16 m.add_objective(x + 2*y)
---> 17 m.solve('highs')
     18 
     19 m.solution.to_dataframe().plot(grid=True, ylabel='Optimal Value')

D:\Tools\Miniconda3\envs\py37\lib\site-packages\linopy\model.py in solve(self, solver_name, io_api, problem_fn, solution_fn, log_fn, basis_fn, warmstart_fn, keep_files, remote, **solver_options)
   1051                 basis_fn,
   1052                 keep_files,
-> 1053                 **solver_options,
   1054             )
   1055         finally:

D:\Tools\Miniconda3\envs\py37\lib\site-packages\linopy\solvers.py in run_highs(Model, io_api, problem_fn, solution_fn, log_fn, warmstart_fn, basis_fn, keep_files, **solver_options)
    366 
    367     sol = pd.read_fwf(io.BytesIO(sol))
--> 368     sol = sol.set_index("Name")["Primal"].pipe(set_int_index)
    369 
    370     dual = pd.read_fwf(io.BytesIO(dual))["Dual"]

D:\Tools\Miniconda3\envs\py37\lib\site-packages\pandas\core\generic.py in pipe(self, func, *args, **kwargs)
   5428         ...  )  # doctest: +SKIP
   5429         """
-> 5430         return com.pipe(self, func, *args, **kwargs)
   5431 
   5432     # ----------------------------------------------------------------------

D:\Tools\Miniconda3\envs\py37\lib\site-packages\pandas\core\common.py in pipe(obj, func, *args, **kwargs)
    469         return func(*args, **kwargs)
    470     else:
--> 471         return func(obj, *args, **kwargs)
    472 
    473 

D:\Tools\Miniconda3\envs\py37\lib\site-packages\linopy\solvers.py in set_int_index(series)
     65     Convert string index to int index.
     66     """
---> 67     series.index = series.index.str[1:].astype(int)
     68     return series
     69 

D:\Tools\Miniconda3\envs\py37\lib\site-packages\pandas\core\indexes\base.py in astype(self, dtype, copy)
    915             raise TypeError(
    916                 f"Cannot cast {type(self).__name__} to dtype {dtype}"
--> 917             ) from err
    918         return Index(casted, name=self.name, dtype=dtype)
    919 

TypeError: Cannot cast Index to dtype int32

In both environments I tested cbc and cplex without problems, and PyPSA with nomopyomo and HiGHS also works fine. So I think it may be a problem with linopy's HiGHS interface.
