pybop-team / pybop

A parameterisation and optimisation package for battery models.

Home Page: https://pybop-docs.readthedocs.io

License: BSD 3-Clause "New" or "Revised" License

Python 99.61% Shell 0.39%

pybop's Introduction

logo.svg

Python Battery Optimisation and Parameterisation


PyBOP

PyBOP provides a complete set of tools for parameterisation and optimisation of battery models, using both Bayesian and frequentist approaches, with example workflows to assist the user. PyBOP can be used to parameterise various battery models, including electrochemical and equivalent circuit models available in PyBaMM. PyBOP prioritises clear and informative diagnostics for the user, while also allowing for advanced probabilistic methods.

The diagram below shows the conceptual framework of PyBOP. This package is currently under development, so users can expect the API to evolve with future releases.

pybop_arch.svg

Installation

Within your virtual environment, install PyBOP:

pip install pybop

To install the most recent state of PyBOP, install from the develop branch:

pip install git+https://github.com/pybop-team/PyBOP.git@develop

To install a previous version of PyBOP, use the following template and replace the version number:

pip install pybop==v24.3

To check that PyBOP is installed correctly, run one of the examples in the following section. For a development installation, see the Contribution Guide. More installation information is available in our documentation and the extended installation instructions for PyBaMM.

Using PyBOP

PyBOP has two intended uses:

  1. Parameter estimation from battery test data.

  2. Design optimisation under battery manufacturing/use constraints.

These span a wide variety of optimisation problems that require careful consideration of the choice of battery model, the available data and the choice of design parameters.

Notebooks

PyBOP comes with a number of example notebooks, which can be found in the examples folder. A few noteworthy ones are listed below.

Scripts

Additional script-based examples can be found in the examples directory. Some notable scripts are listed below.

Supported Methods

The lists below show the currently supported models, optimisers, and cost functions in PyBOP.

Battery Models:

  • Single Particle Model (SPM)
  • Single Particle Model with Electrolyte (SPMe)
  • Doyle-Fuller-Newman (DFN)
  • Many Particle Model (MPM)
  • Multi-Species Multi-Reactants (MSMR)
  • Equivalent Circuit Models (ECM)

Optimization Algorithms:

  • Covariance Matrix Adaptation Evolution Strategy (CMA-ES)
  • Particle Swarm Optimization (PSO)
  • Exponential Natural Evolution Strategy (xNES)
  • Separable Natural Evolution Strategy (sNES)
  • Adaptive Moment Estimation with Weight Decay (AdamW)
  • Improved Resilient Backpropagation (iRProp-)
  • SciPy Minimize & Differential Evolution
  • Gradient Descent
  • Nelder-Mead

Cost Functions:

  • Sum of Squared Errors (SSE)
  • Root Mean Squared Error (RMSE)
  • Gaussian Log Likelihood
  • Gaussian Log Likelihood w/ known variance
  • Maximum a Posteriori (MAP)
  • Unscented Kalman Filter (UKF)
  • Gravimetric Energy Density
  • Volumetric Energy Density

Code of Conduct

PyBOP aims to foster a broad consortium of developers and users, building on and learning from the success of the PyBaMM community. Our values are:

  • Inclusivity and fairness (those who wish to contribute may do so, and their input is appropriately recognised)

  • Interoperability (modularity for maximum impact and inclusivity)

  • User-friendliness (putting user requirements first via user-assistance & workflows)

Contributors ✨

Thanks goes to these wonderful people (emoji key):

  • Brady Planden: 🚇 ⚠️ 💻 💡 👀
  • NicolaCourtier: 💻 👀 💡 ⚠️
  • David Howey: 🤔 🧑‍🏫
  • Martin Robinson: 🤔 🧑‍🏫 👀 💻 ⚠️
  • Ferran Brosa Planella: 👀 💻
  • Agriya Khetarpal: 💻 🚇 👀
  • Faraday Institution: 💵
  • UK Research and Innovation: 💵
  • EU IntelLiGent Project (IntelLiGent Consortium): 💵
  • Muhammed Nedim Sogut: 💻

This project follows the all-contributors specification. Contributions of any kind are welcome! See CONTRIBUTING.md for ways to get started.

pybop's People

Contributors

agriyakhetarpal, allcontributors[bot], bradyplanden, brosaplanella, davidhowey, markblyth, martinjrobins, nicolacourtier, pre-commit-ci[bot]

pybop's Issues

[Bug]: Update readme.md

Python Version

N/A

Describe the bug

Readme is out of date, the following needs updating:

  • Synthetic data generation
  • Point to Contributing.md
  • Condense installation procedure
  • Point to examples
  • Condense intro paragraphs

Steps to reproduce the behaviour

N/A

Relevant log output

No response

Add tests for varying parameter fits/optimisations

Feature description

Provides better coverage of the challenges in optimising a cost function over differing parameter types. A good example is adding geometric parameters to our current fitting API (see #18):

import pybop
import numpy as np

parameter_set = pybop.ParameterSet("pybamm", "Chen2020")
model = pybop.lithium_ion.SPMe(parameter_set=parameter_set)

# Fitting parameters
parameters = [
    pybop.Parameter(
        "Positive particle radius [m]",
        prior=pybop.Gaussian(1e-6, 0.0),
        bounds=[0.999e-6, 1.0001e-6],
    ),
    pybop.Parameter(
        "Positive electrode diffusivity [m2.s-1]",
        prior=pybop.Gaussian(3.43e-15, 1e-15),
        bounds=[1e-15, 5e-15],
    ),
]

sigma = 0.001
t_eval = np.arange(0, 900, 2)
values = model.predict(t_eval=t_eval)
corrupt_voltage = values["Terminal voltage [V]"].data + np.random.normal(
    0, sigma, len(t_eval)
)

dataset = [
    pybop.Dataset("Time [s]", t_eval),
    pybop.Dataset("Current function [A]", values["Current [A]"].data),
    pybop.Dataset("Terminal voltage [V]", corrupt_voltage),
]

# Generate problem, cost function, and optimisation class
problem = pybop.Problem(model, parameters, dataset)
cost = pybop.SumSquaredError(problem)
optim = pybop.Optimisation(cost, optimiser=pybop.GradientDescent)
optim.optimiser.set_learning_rate(0.025)

x, final_cost = optim.run()
print("Estimated parameters:", x)

This results in a non-scalar geometric parameter when building the PyBaMM model. While we are tracking this issue in #18, adding a larger variety of fitting/optimisation tests will help us check potential edge cases.

Motivation

No response

Possible implementation

No response

Additional context

No response

Standardised Parameterisation Benchmarks

Repeatable benchmarks for parameterisation methods should be created. This can be split between synthetic and experimental datasets; however, I think we should have varying levels of difficulty / identifiability across the datasets. It's possible that the difficulty level might end up being the division between synthetic and experimental. The high-level properties that should be identifiable (and not identifiable) in these datasets are:

  • Thermodynamic properties
  • Kinetic properties
  • Transport properties

Custom model structures

Feature description

Provide an example of modifying or constructing a new model.pybamm_model structure. This issue provides a step towards #3.

Motivation

No response

Possible implementation

No response

Additional context

No response

Add pre-commit with Ruff hook

Feature description

Add pre-commit to the repository to maintain code standards and formatting. This will remove the need for manual formatting and reduce code review times, while improving code readability. The current linting preference is Ruff due to its performance and feature set.

Motivation

No response

Possible implementation

No response

Additional context

No response

Add draw.io architecture diagram

We should add the draw.io diagram to the assets/ directory for version tracking and collaboration. This should make it easier for collaborators to discuss and present architecture iterations.

Add BPX support

Feature description

This issue is for adding BPX support for import/export of parameter sets. Initial thoughts on the API:

To export:

bpx = pybop.BPX(results=results, parameter_set=parameter_set)
bpx.save(file='test_params.json')

To import:

parameter_set = pybop.BPX.import(file='test_params.json')

Motivation

No response

Possible implementation

No response

Additional context

No response

Split model.simulate() functionality

Feature description

At present, model.simulate() is intended to serve both initial forward model simulations (for pybamm this includes support for experiment, parameter_set, etc.) and optimisation calls, which require new fitting/optimisation parameter values and a time vector. To support these requirements we have discussed splitting model.simulate() into:

  • model.simulate(fit_parameters, time) for optimisation calls
  • model.predict(model, parameter_set, experiment, etc.) for general forward model simulations.

Therefore, model.predict() would be used for generating synthetic data, or for more general calls where performance is a lower priority. For performant calls, once the model has been built, model.simulate() would be preferred for evaluating new parameter values.
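
A rough usage sketch of the proposed split is given below; the names mirror the bullet points above, and the build step and argument shapes are assumptions rather than a settled API.

import numpy as np
import pybop

model = pybop.lithium_ion.SPMe()

# General-purpose forward simulation, e.g. for synthetic data generation
t_eval = np.arange(0, 900, 2)
values = model.predict(t_eval=t_eval)

# Fast re-evaluation of the already-built model at new fitting parameter
# values inside an optimisation loop (hypothetical signature)
model.build()  # assumed one-off build step
simulated = model.simulate([3.43e-15], t_eval)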

Motivation

No response

Possible implementation

No response

Additional context

No response

[Bug]: Module not found

Python Version

3.11.5

Describe the bug

Not all modules are included when pip install-ing PyBOP. The Python scripts located inside subfolders, such as optimisation, cannot be imported without an __init__.py file in their location.

Steps to reproduce the behaviour

pyenv virtualenv 3.11.5 pybop-env

pyenv activate pybop-env

pip install pybamm

pip install git+https://github.com/pybop-team/PyBOP

python "PATH TO/test_parameterisation.py"

Relevant log output

ModuleNotFoundError: No module named 'pybop.optimisation'

Add self-hosted runner

Feature description

Add the Battery Intelligence Lab's Apple M2 runner to PyBOP's CI for benchmarking and ARM validation.

Motivation

As we expand PyBOP's estimation and optimisation methods, having a stable runner for benchmarking will provide a robust way for new users to compare and select methods.

Possible implementation

The final solution should be somewhat similar to https://github.com/pybamm-team/pybamm-bench, with the results presented in a similar way to https://pybamm-team.github.io/pybamm-bench/

Additional context

No response

[Bug]: PyBaMM submodels aren't supported

Python Version

N/A

Describe the bug

PyBOP doesn't currently pass submodel options to the PyBaMM model constructor during initialisation.

Steps to reproduce the behaviour

model = pybop.models.lithium_ion.SPM(options = {"thermal": "lumped"}) doesn't initialise the model with the correct submodel.

Relevant log output

No response

Initially Supported Models

Issue to discuss the selection of the initially supported models. Potential starting points include:

  • Equivalent Circuit Models (ECM)
  • Equivalent Hydraulic Model (EHM)
  • Single Particle Model with Electrolyte (SPMe)

Update testing workflow in contributing.md

We utilise pytest instead of unittest for our testing suite. At the moment, the contributing.md document doesn't cover how to run pytest directly, without going through Nox.

Transformations for varying scale parameters

Feature description

When optimising parameters of varying scale, we overshoot acceptable parameter ranges during fitting (for non-bounded methods). Transforming / normalising the parameter space on input to the optimiser should solve this issue. An example of this issue is:

import pybop
import numpy as np

parameter_set = pybop.ParameterSet("pybamm", "Chen2020")
model = pybop.lithium_ion.SPMe(parameter_set=parameter_set)

# Fitting parameters
parameters = [
    pybop.Parameter(
        "Positive electrode diffusivity [m2.s-1]",
        prior=pybop.Gaussian(3.43e-15, 1e-15),
        bounds=[1e-15, 5e-15],
    ),
]

sigma = 0.001
t_eval = np.arange(0, 900, 2)
values = model.predict(t_eval=t_eval)
corrupt_voltage = values["Terminal voltage [V]"].data + np.random.normal(
    0, sigma, len(t_eval)
)

dataset = [
    pybop.Dataset("Time [s]", t_eval),
    pybop.Dataset("Current function [A]", values["Current [A]"].data),
    pybop.Dataset("Terminal voltage [V]", corrupt_voltage),
]

# Generate problem, cost function, and optimisation class
problem = pybop.Problem(model, parameters, dataset)
cost = pybop.SumSquaredError(problem)
optim = pybop.Optimisation(cost, optimiser=pybop.GradientDescent)
optim.optimiser.set_learning_rate(0.025)

x, final_cost = optim.run()
print("Estimated parameters:", x)

Our implementation of Gradient Descent (#88) is unbounded and immediately selects a candidate solution orders of magnitude outside the acceptable range.
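
One possible approach is to let the optimiser work in a transformed (for example logarithmic) space and map back to the physical scale before each cost evaluation. The wrapper below is a minimal, PyBOP-independent sketch of that idea; the function names are illustrative only.

import numpy as np

def make_log_transformed_cost(cost):
    """Wrap a cost function so the optimiser searches over u = log(theta)."""
    def transformed_cost(u):
        theta = np.exp(np.asarray(u))  # map back to the physical scale
        return cost(theta)
    return transformed_cost

# The optimiser would then start from u0 = np.log(3.43e-15) and take steps
# of order 1 in log-space, rather than steps of order 1e-15.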

Motivation

No response

Possible implementation

No response

Additional context

No response

Update architecture diagram

Feature description

A small update to the PyBOP architecture diagram displayed in the Readme, to improve its appearance in dark mode and to ensure consistent use of terminology and symbols.

Motivation

No response

Possible implementation

No response

Additional context

No response

How to handle structural parameters

Parameters that don't affect the geometric structure of the model should be passed as inputs to the pybamm.solver class to improve performance. This will allow the built model to be updated with the appropriate parameters at each iteration without reforming the model, discretisation and solver.
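
A minimal sketch of the "pass as inputs" approach using PyBaMM's input-parameter mechanism is shown below; the chosen parameter and values are only examples.

import numpy as np
import pybamm

model = pybamm.lithium_ion.SPMe()
params = model.default_parameter_values
params["Negative electrode active material volume fraction"] = "[input]"

sim = pybamm.Simulation(model, parameter_values=params)
sim.build()  # model, discretisation and solver are set up once

# Only the input value changes between evaluations; no rebuild is required
for value in [0.70, 0.72, 0.75]:
    sol = sim.solve(
        t_eval=np.arange(0, 900, 2),
        inputs={"Negative electrode active material volume fraction": value},
    )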

If the parameters of interest influence the geometric structure of the model (electrode thickness, number of particles, etc.), the model, discretisation and solver must be rebuilt on each iteration. This slows down the iterations per second of parameterisation/optimisation, but without pre-composing the design space (there is a side discussion to be had here) it is necessary for robust results.

Open thoughts:

  • For the structural parameter set, obtaining the gradient of the result will not be easy, and if needed will require more computationally complex methods, as the model structure changes with each iteration.
  • There is a further discussion here about the fidelity of varying the model structure during either parameterisation or optimisation. There may be side effects that need to be captured in the model construction (for multi-particle models, the discretisation distribution could be held constant between structural changes).

To do:

  • Group the parameters of supported models into the two domains of structural and non-structural.
  • Update the constructor of the parameterisation/optimisation class to include functionality for each of the parameter groups.
  • Build a parameter update method for each iteration that returns either the forward model or the simulation result to the parameterisation / optimisation method.

Add run-tests.py

Feature description

A request to add the run-tests.py script for developers who are not using nox to run the tests.

Motivation

No response

Possible implementation

No response

Additional context

No response

Remove terminal voltage output constraint from simulate(), simulateS1()

Feature description

Currently, the BaseModel() class is constrained to output "Terminal Voltage [V]" in the simulate() and simulateS1() methods. This issue is to remove this constraint and add a signal variable to the BaseModel class.

Motivation

No response

Possible implementation

self.signal = signal  # e.g. "Terminal voltage [V]"
...
return sol[self.signal].data

Additional context

No response

Cost function returns an error when a simulation terminates early

If a PyBaMM simulation reaches a voltage constraint, the model can return a shorter solution array than expected, resulting in an error in the computation of a cost function such as RootMeanSquareError.

Optimisation problems are likely to run into such solutions when trialling different parameter values, but I think it should be taken to mean that the solution has a high cost rather than causing the optimisation to stop.

Suggested fixes:

  1. Pad out any solutions that terminate early with the last output value to ensure that the prediction has the same length as the dataset.
  2. Return a cost value of Infinity in the case that the prediction is shorter than the expected output.

I would opt for fix 2 for now, as I think it would give predictable behaviour and, in the case that all predictions are incompatible, it should return a final cost of Infinity.
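
A minimal sketch of option 2, independent of the PyBOP cost classes, is shown below.

import numpy as np

def root_mean_squared_error(prediction, target):
    prediction = np.asarray(prediction)
    target = np.asarray(target)
    if len(prediction) < len(target):
        # The simulation terminated early (e.g. hit a voltage cut-off), so
        # treat this parameter set as having an infinitely poor fit.
        return np.inf
    return np.sqrt(np.mean((prediction - target) ** 2))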

Add Pints optimisation methods

Feature description

This issue is to support adding Pints methods into PyBOP. Pints has multiple entry points for this, including the ForwardModel class, the Problem classes, and the OptimisationController with the underlying ask() and tell() API. We already have support for the ForwardModel definition via our BaseModel class, with an MLE example showing how PyBOP and Pints can interact.
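
As an illustration of the ask() and tell() entry point, the sketch below runs a PINTS optimiser on a toy quadratic standing in for a PyBOP cost function; the optimiser choice, starting point and iteration count are arbitrary.

import numpy as np
import pints

def cost(x):
    # Toy quadratic standing in for a PyBOP cost function
    return float(np.sum((np.asarray(x) - 2.0) ** 2))

opt = pints.XNES(x0=np.array([0.5, 0.5]), sigma0=0.1)
best_f, best_x = np.inf, None
for _ in range(100):
    xs = opt.ask()              # candidate parameter vectors
    fs = [cost(x) for x in xs]  # evaluate the cost for each candidate
    opt.tell(fs)                # feed the results back to the optimiser
    i = int(np.argmin(fs))
    if fs[i] < best_f:
        best_f, best_x = fs[i], xs[i]

print(best_x, best_f)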

Motivation

No response

Possible implementation

No response

Additional context

No response

Ruff format to pre-commit

Feature description

  • Add ruff format to pre-commit to replace black format usage.
  • Include jupyter notebook support
  • Add ruff.toml for configuration

Motivation

Adds black style formatting to the automated pre-commit trigger.

Possible implementation

No response

Additional context

No response

[Bug]: Prior sampling outside of bounds causes runtime error

Python Version

N/A

Describe the bug

It is possible to sample the prior outside of the parameter bounds, which causes a runtime error for nlopt during optimisation.

Steps to reproduce the behaviour

Running something like this will cause an nlopt error:

pybop.Parameter(
    "Negative electrode active material volume fraction",
    prior=pybop.Gaussian(0.75, 0.05),
    bounds=[0.73, 0.77],
)
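
A generic sketch of one possible guard, independent of the PyBOP API: draw the initial sample from the prior and clip it to the bounds before it reaches the optimiser. The function below is illustrative only.

import numpy as np

def sample_within_bounds(mean, sigma, bounds, rng=None):
    rng = rng or np.random.default_rng()
    sample = rng.normal(mean, sigma)
    # Clip to the parameter bounds so nlopt never receives an infeasible start
    return float(np.clip(sample, bounds[0], bounds[1]))

x0 = sample_within_bounds(0.75, 0.05, bounds=[0.73, 0.77])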

Relevant log output

No response

Point parameter estimation methods

Overarching parent issue to cover the family of all methods that give only "point" parameter estimates (rather than distributions) - i.e., maximum likelihood estimation (MLE) including variations such as prediction error minimisation (PEM); maximum a posteriori (MAP) estimation.

Predictive Error Minimisation Implementation

Predictive error minimisation (PEM) differs from 'vanilla' maximum likelihood estimation (MLE) in the assumptions made about the observations. Basic MLE assumes that the observations are i.i.d., which is not the case for dynamic time-invariant systems. The likelihood function for MLE becomes $L(\theta; y_1, \ldots, y_N) = p(y_1 \mid \theta)\, p(y_2 \mid \theta) \cdots p(y_N \mid \theta)$, which is separable, with no dependence on previous observations or initial conditions. PEM, however, assumes sequential observations, with the likelihood function forming a chain of conditional probabilities, $L(\theta; y_1, \ldots, y_N) = p(y_N \mid y_{N-1}, \theta) \cdots p(y_2 \mid y_1, \theta)\, p(y_1 \mid \theta)$, providing a dependence on the previous observations (and states) as well as the initial condition $y_1$, as expected.
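
A toy illustration of the distinction, unrelated to any battery model: the PEM cost below is built from one-step-ahead prediction errors of a simple first-order discrete-time model, so each term is conditioned on the previous observation.

import numpy as np

def pem_cost(a, y):
    """Sum of squared one-step-ahead prediction errors for y_k ~ a * y_{k-1}."""
    y = np.asarray(y)
    y_pred = a * y[:-1]          # prediction of y_k given y_{k-1} and theta = a
    e = y[1:] - y_pred           # one-step-ahead prediction errors
    return float(np.sum(e**2))   # minimising this is MLE under Gaussian one-step errors

# Synthetic data from a = 0.9 with small measurement noise
rng = np.random.default_rng(0)
y = [1.0]
for _ in range(200):
    y.append(0.9 * y[-1] + rng.normal(0, 0.01))

a_grid = np.linspace(0.5, 1.0, 51)
a_hat = a_grid[np.argmin([pem_cost(a, y) for a in a_grid])]  # close to 0.9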

This issue is about integrating predictive error methods into PyBOP with the following open issues/features

  • Create a solver to propagate the mean & covariance through the forward model
  • Create tests to validate the above solver
  • Use the above solver to implement predictive error minimisation for observed data and the forward model
  • Create example usage of the PEM via Jupyter notebooks

Add tests for single parameter fitting / optimisation

Feature description

Catches optimisation methods that are only built for multi-parameter optimisations (such as pints.CMAES).

Motivation

No response

Possible implementation

No response

Additional context

No response

[Bug]: Nlopt install error for Apple silicon

Python Version

N/A

Describe the bug

Installing nlopt on Apple silicon hardware results in a build error, as shown in the log output below. Others have encountered this problem [1], and it appears that the current method of resolving it is to build nlopt from source. This is a tedious process that we shouldn't expect an end user to do.

In this case, I think we have two options:

  1. Deprecate nlopt and replace it with scipy & pints
  2. Have nlopt as an optional installation for Apple silicon hardware.

I believe they lead to roughly identical results, since it's clear that we can't rely on nlopt as a robust optimisation package in the future. My preference is option 1, but I am open to other ideas.

Steps to reproduce the behaviour

pip install nlopt on Apple silicon hardware

Relevant log output

Building wheels for collected packages: nlopt
  Building wheel for nlopt (setup.py): started
  Building wheel for nlopt (setup.py): finished with status 'error'
  error: subprocess-exited-with-error
  
  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [247 lines of output]
      running bdist_wheel
      running build
      running build_ext
      cmake version 3.27.7
      
      CMake suite maintained and supported by Kitware (kitware.com/cmake).
      CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
        Compatibility with CMake < 3.5 will be removed from a future version of
        CMake.
      
        Update the VERSION argument <min> value or use a ...<max> suffix to tell
        CMake that the project does not need compatibility with older versions.
      
      
      -- The C compiler identification is AppleClang 14.0.3.14030022
      -- The CXX compiler identification is AppleClang 14.0.3.14030022
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++ - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      CMake Warning (dev) at CMakeLists.txt:12 (find_package):
        Policy CMP0148 is not set: The FindPythonInterp and FindPythonLibs modules
        are removed.  Run "cmake --help-policy CMP0148" for policy details.  Use
        the cmake_policy command to set the policy and suppress this warning.
      
      This warning is for project developers.  Use -Wno-dev to suppress it.
      
      -- Found PythonInterp: /Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/bin/python (found version "3.9.18")
      -- Found Python includes: /Users/runner/.pyenv/versions/3.9.18/include/python3.9
      -- Found Python libs: /Users/runner/.pyenv/versions/3.9.18/lib
      CMake Deprecation Warning at extern/nlopt/CMakeLists.txt:15 (cmake_minimum_required):
        Compatibility with CMake < 3.5 will be removed from a future version of
        CMake.
      
        Update the VERSION argument <min> value or use a ...<max> suffix to tell
        CMake that the project does not need compatibility with older versions.
      
      
      -- NLopt version 2.6.2
      -- Looking for dlfcn.h
      -- Looking for dlfcn.h - found
      -- Looking for getopt.h
      -- Looking for getopt.h - found
      -- Looking for unistd.h
      -- Looking for unistd.h - found
      -- Looking for string.h
      -- Looking for string.h - found
      -- Looking for strings.h
      -- Looking for strings.h - found
      -- Looking for inttypes.h
      -- Looking for inttypes.h - found
      -- Looking for memory.h
      -- Looking for memory.h - found
      -- Looking for stdlib.h
      -- Looking for stdlib.h - found
      -- Looking for stdint.h
      -- Looking for stdint.h - found
      -- Looking for time.h
      -- Looking for time.h - found
      -- Looking for sys/types.h
      -- Looking for sys/types.h - found
      -- Looking for sys/stat.h
      -- Looking for sys/stat.h - found
      -- Looking for sys/time.h
      -- Looking for sys/time.h - found
      -- Looking for getpid
      -- Looking for getpid - found
      -- Looking for syscall
      -- Looking for syscall - found
      -- Looking for isinf
      -- Looking for isinf - found
      -- Looking for isnan
      -- Looking for isnan - found
      -- Looking for gettimeofday
      -- Looking for gettimeofday - found
      -- Looking for qsort_r
      -- Looking for qsort_r - found
      -- Looking for time
      -- Looking for time - found
      -- Looking for copysign
      -- Looking for copysign - found
      -- Looking for stddef.h
      -- Looking for stddef.h - found
      -- Check size of uint32_t
      -- Check size of uint32_t - done
      -- Check size of unsigned int
      -- Check size of unsigned int - done
      -- Check size of unsigned long
      -- Check size of unsigned long - done
      -- Looking for sqrt in m
      -- Looking for sqrt in m - found
      -- Looking for fpclassify
      -- Looking for fpclassify - TRUE
      -- Performing Test HAVE_THREAD_LOCAL_STORAGE
      -- Performing Test HAVE_THREAD_LOCAL_STORAGE - Success
      -- Performing Test HAVE_THREAD_LOCAL_STORAGE
      -- Performing Test HAVE_THREAD_LOCAL_STORAGE - Failed
      -- Looking for __cplusplus
      -- Looking for __cplusplus - found
      -- Performing Test SUPPORTS_STDCXX11
      -- Performing Test SUPPORTS_STDCXX11 - Success
      -- Performing Test HAS_FPIC
      -- Performing Test HAS_FPIC - Success
      CMake Warning (dev) at extern/nlopt/CMakeLists.txt:306 (find_package):
        Policy CMP0148 is not set: The FindPythonInterp and FindPythonLibs modules
        are removed.  Run "cmake --help-policy CMP0148" for policy details.  Use
        the cmake_policy command to set the policy and suppress this warning.
      
      This warning is for project developers.  Use -Wno-dev to suppress it.
      
      CMake Warning (dev) at extern/nlopt/CMakeLists.txt:307 (find_package):
        Policy CMP0148 is not set: The FindPythonInterp and FindPythonLibs modules
        are removed.  Run "cmake --help-policy CMP0148" for policy details.  Use
        the cmake_policy command to set the policy and suppress this warning.
      
      This warning is for project developers.  Use -Wno-dev to suppress it.
      
      -- Found PythonLibs: /Users/runner/.pyenv/versions/3.9.18/lib (found suitable exact version "3.9.18")
      CMake Warning (dev) at extern/nlopt/cmake/FindNumPy.cmake:45 (find_package):
        Policy CMP0148 is not set: The FindPythonInterp and FindPythonLibs modules
        are removed.  Run "cmake --help-policy CMP0148" for policy details.  Use
        the cmake_policy command to set the policy and suppress this warning.
      
      Call Stack (most recent call first):
        extern/nlopt/CMakeLists.txt:308 (find_package)
      This warning is for project developers.  Use -Wno-dev to suppress it.
      
      -- Could NOT find NumPy (missing: NUMPY_INCLUDE_DIRS)
      -- Could NOT find Guile (missing: GUILE_EXECUTABLE GUILE_ROOT_DIR GUILE_INCLUDE_DIRS GUILE_LIBRARIES)
      -- Could NOT find SWIG (missing: SWIG_EXECUTABLE SWIG_DIR)
      -- Could NOT find Octave (missing: OCTAVE_EXECUTABLE OCTAVE_ROOT_DIR OCTAVE_INCLUDE_DIRS OCTAVE_LIBRARIES)
      -- Could NOT find Matlab (missing: Matlab_INCLUDE_DIRS Matlab_MEX_LIBRARY Matlab_MEX_EXTENSION Matlab_ROOT_DIR Matlab_MX_LIBRARY MX_LIBRARY MAIN_PROGRAM) (found version "NOTFOUND")
      -- Configuring done (2.9s)
      -- Generating done (0.0s)
      -- Build files have been written to: /private/var/folders/46/vvrt_nq91kj6lc8nv1_p7p6r0000gn/T/pip-install-u4t32o5w/nlopt_b73e733ab75542a888053759f4463ce5/build/temp.macosx-13.5-arm64-cpython-39
      [  3%] Generating nlopt.f
      [  3%] Generating nlopt.hpp
      CMake Deprecation Warning at /private/var/folders/46/vvrt_nq91kj6lc8nv1_p7p6r0000gn/T/pip-install-u4t32o5w/nlopt_b73e733ab75542a888053759f4463ce5/extern/nlopt/cmake/generate-cpp.cmake:1 (cmake_minimum_required):
        Compatibility with CMake < 3.5 will be removed from a future version of
        CMake.
      
        Update the VERSION argument <min> value or use a ...<max> suffix to tell
        CMake that the project does not need compatibility with older versions.
      
      
      CMake Deprecation Warning at /private/var/folders/46/vvrt_nq91kj6lc8nv1_p7p6r0000gn/T/pip-install-u4t32o5w/nlopt_b73e733ab75542a888053759f4463ce5/extern/nlopt/cmake/generate-fortran.cmake:1 (cmake_minimum_required):
        Compatibility with CMake < 3.5 will be removed from a future version of
        CMake.
      
        Update the VERSION argument <min> value or use a ...<max> suffix to tell
        CMake that the project does not need compatibility with older versions.
      
      
      [  3%] Built target generate-fortran
      [  3%] Built target generate-cpp
      [  7%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/direct/DIRect.c.o
      [  7%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/direct/direct_wrap.c.o
      [  9%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/direct/DIRserial.c.o
      [ 11%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/direct/DIRsubrout.c.o
      [ 13%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/cdirect/cdirect.c.o
      [ 15%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/cdirect/hybrid.c.o
      [ 17%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/praxis/praxis.c.o
      [ 19%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/luksan/plis.c.o
      [ 21%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/luksan/plip.c.o
      [ 23%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/luksan/pnet.c.o
      [ 25%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/luksan/mssubs.c.o
      [ 27%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/luksan/pssubs.c.o
      [ 29%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/crs/crs.c.o
      [ 31%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/mlsl/mlsl.c.o
      [ 33%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/mma/mma.c.o
      [ 35%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/mma/ccsa_quadratic.c.o
      [ 37%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/cobyla/cobyla.c.o
      [ 39%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/newuoa/newuoa.c.o
      [ 41%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/neldermead/nldrmd.c.o
      [ 43%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/neldermead/sbplx.c.o
      [ 45%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/auglag/auglag.c.o
      [ 47%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/bobyqa/bobyqa.c.o
      [ 49%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/isres/isres.c.o
      [ 50%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/slsqp/slsqp.c.o
      [ 52%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/esch/esch.c.o
      [ 54%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/api/general.c.o
      [ 56%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/api/options.c.o
      [ 58%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/api/optimize.c.o
      [ 60%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/api/deprecated.c.o
      [ 62%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/api/f77api.c.o
      [ 64%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/util/mt19937ar.c.o
      [ 66%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/util/sobolseq.c.o
      [ 68%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/util/timer.c.o
      [ 70%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/util/stop.c.o
      [ 72%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/util/redblack.c.o
      [ 74%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/util/qsort_r.c.o
      [ 76%] Building C object extern/nlopt/CMakeFiles/nlopt.dir/src/util/rescale.c.o
      [ 78%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/stogo/global.cc.o
      [ 80%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/stogo/linalg.cc.o
      [ 82%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/stogo/local.cc.o
      [ 84%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/stogo/stogo.cc.o
      [ 86%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/stogo/tools.cc.o
      [ 88%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/ags/evolvent.cc.o
      [ 90%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/ags/solver.cc.o
      [ 92%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/ags/local_optimizer.cc.o
      [ 94%] Building CXX object extern/nlopt/CMakeFiles/nlopt.dir/src/algs/ags/ags.cc.o
      [ 96%] Linking CXX static library libnlopt.a
      [100%] Built target nlopt
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/private/var/folders/46/vvrt_nq91kj6lc8nv1_p7p6r0000gn/T/pip-install-u4t32o5w/nlopt_b73e733ab75542a888053759f4463ce5/setup.py", line 85, in <module>
          setup(
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/__init__.py", line 103, in setup
          return distutils.core.setup(**attrs)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/core.py", line 185, in setup
          return run_commands(dist)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
          dist.run_commands()
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
          self.run_command(cmd)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/dist.py", line 989, in run_command
          super().run_command(command)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
          cmd_obj.run()
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/wheel/bdist_wheel.py", line 364, in run
          self.run_command("build")
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
          self.distribution.run_command(command)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/dist.py", line 989, in run_command
          super().run_command(command)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
          cmd_obj.run()
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/command/build.py", line 131, in run
          self.run_command(cmd_name)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
          self.distribution.run_command(command)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/dist.py", line 989, in run_command
          super().run_command(command)
        File "/Users/runner/Documents/pybop-runner/_work/PyBOP/PyBOP/.nox/unit/lib/python3.9/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
          cmd_obj.run()
        File "/private/var/folders/46/vvrt_nq91kj6lc8nv1_p7p6r0000gn/T/pip-install-u4t32o5w/nlopt_b73e733ab75542a888053759f4463ce5/setup.py", line 28, in run
          self.build_extension(ext)
        File "/private/var/folders/46/vvrt_nq91kj6lc8nv1_p7p6r0000gn/T/pip-install-u4t32o5w/nlopt_b73e733ab75542a888053759f4463ce5/setup.py", line 70, in build_extension
          nlopt_py = next(Path(self.build_temp).rglob("nlopt.py"))
      StopIteration
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for nlopt
  Running setup.py clean for nlopt
Failed to build nlopt
ERROR: Could not build wheels for nlopt, which is required to install pyproject.toml-based projects

Notice:  A new release of pip is available: 23.2.1 -> 23.3.1
Notice:  To update, run: pip install --upgrade pip
nox > Command pip install -e . failed with exit code 1
nox > Session unit failed.
Error: Process completed with exit code 1.

Add design optimisation example

Feature description

A unit test based on a simple design optimisation problem.

Motivation

To provide a second example use case and a second unit test.

Possible implementation

No response

Additional context

No response

Workflows for parameterisation of different property types

Create recommended workflows for the order in which electrochemical parameters are fit. This could look something like:

  1. Fit thermal properties
  2. Fit OCP
  3. Fit kinetics
  4. Re-fit thermal / OCP / etc.

@NicolaCourtier has been doing a lot of work on this and has a very slick methodology that would be great to include in PyBOP.

Installation issue

Python Version

>3.6

Describe the bug

I tried to follow the instructions to install homebrew, pyenv and pyenv-virtualenv but unfortunately I encountered one of the Common build problems, specifically: https://github.com/pyenv/pyenv/wiki/Common-build-problems#error-the-python-ssl-extension-was-not-compiled-missing-the-openssl-lib

Although I was able to find a workaround from stack overflow, I suggest that we switch to the same installation instructions as PyBaMM for consistency: https://docs.pybamm.org/en/latest/source/user_guide/installation/GNU-linux.html#install-pybamm

Steps to reproduce the behaviour

I encountered the issue on the first step of the installation instructions, when trying to create a pyenv-virtualenv. I am using WSL on Windows 10.

Relevant log output

No response

Add forward simulation method

Feature description

Add a simple simulation class for synthetic data generation used in tests and benchmarking.

Motivation

No response

Possible implementation

Built into BaseModel, this would provide the functionality for the pybamm models; however, internal models, such as a future EHM implementation, would need their own equivalent of the pybamm.Simulation class. Given pybop follows the pybamm model class structure, it might be relatively simple to use the pybamm.Simulation class directly (although this will be brittle to pybamm changes).

def sim(self, experiment=None, parameter_set=None):
    """
    Simulate the model
    """
    self.parameter_set = parameter_set or self.parameter_set
    if self._built_model is None:
        raise ValueError("Model must be built before simulating")
    return pybamm.Simulation(
        self._built_model,
        experiment=experiment,
        parameter_values=self.parameter_set,
    )

Example usage:

model = pybop.models.lithium_ion.SPM()
synthetic_sim = model.sim()
term_V = synthetic_sim.solve()["Terminal voltage [V]"].data

Additional context

No response

Parameter grouping and structural identifiability

It's clear that sometimes parameters need to be grouped as part of the estimation process. For example, it is not possible to estimate particle radius and diffusivity separately from input-output data, only diffusion time $R^2/D$. In practice, however, one would have to set an (arbitrary) radius or diffusivity and estimate the other parameter, but only report diffusion time. Some packages exist for checking the 'in principle' identifiability of parameters, notably STRIKE-GOLDD, but this is Matlab-based. We probably want to avoid this issue and instead be very careful in hand-holding users through the parameter estimation process to make them aware that not all parameters are 'free' parameters.
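
A minimal sketch of the "report only the identifiable combination" approach, with arbitrary illustrative numbers:

# Fix the particle radius at an arbitrary value, fit the diffusivity with that
# radius held constant, and report only the diffusion time tau = R^2 / D.
R_fixed = 5.0e-6      # m, arbitrary choice
D_fitted = 3.4e-15    # m^2/s, illustrative result of a fit with R_fixed held constant
tau = R_fixed**2 / D_fitted
print(f"Diffusion time: {tau:.0f} s")   # tau is the quantity that is actually identifiable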

Hamiltonian Monte Carlo (HMC)

The No-U-Turn Sampler (NUTS) variant of Hamiltonian Monte Carlo (HMC) provides a robust method for capturing posterior distributions, with efficient sampling and adaptive step size selection, here applied to the parameter posterior given the residuals $\varepsilon = \hat{y} - y$. This issue summarises the following:

  • Integrate a NUTS implementation from a common, well-supported probabilistic library
  • Build a corresponding test suite with code coverage
  • Build diagnostics from the posterior distribution (i.e. parameter observability) as well as performance-based metrics.

Sensitivity analysis

We might consider including basic sensitivity analysis in PyBOP. This is already in PyBaMM so it would make sense to call and use the existing methods as a sanity check before going deeper on estimating parameters from data, perhaps as an initial stage in a workflow. What do you think @martinjrobins ?
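
As a starting point, a model-agnostic finite-difference sensitivity check could look like the sketch below; the simulate callable is a placeholder for a PyBOP or PyBaMM forward simulation, and calling PyBaMM's built-in sensitivity machinery directly would be the longer-term option.

import numpy as np

def local_sensitivity(simulate, theta, i, rel_step=1e-2):
    """Central-difference estimate of d(output)/d(theta_i) around theta."""
    h = rel_step * abs(theta[i])
    up = np.array(theta, dtype=float)
    down = np.array(theta, dtype=float)
    up[i] += h
    down[i] -= h
    return (simulate(up) - simulate(down)) / (2 * h)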

[Bug]: Non-default parameter sets aren't included in built model

Python Version

N/A

Describe the bug

When using an underlying PyBaMM model (i.e. pybop.models.lithium_ion.SPM()), passing a parameterisation such as pybop.ParameterSet("pybamm", "Chen2020") on construction will not result in the built model containing the correct parameter set.

Steps to reproduce the behaviour

MWE:

import numpy as np
import pybamm
import pybop

# Form observations (getdata() is defined at the end of this issue)
x0 = np.array([0.55, 0.63])
solution = self.getdata(x0)

observations = [
    pybop.Observed("Time [s]", solution["Time [s]"].data),
    pybop.Observed("Current function [A]", solution["Current [A]"].data),
    pybop.Observed("Voltage [V]", solution["Terminal voltage [V]"].data),
]

# Define model
model = pybop.models.lithium_ion.SPM()
model.parameter_set = pybop.ParameterSet("pybamm", "Chen2020")

# Fitting parameters
params = [
    pybop.Parameter(
        "Negative electrode active material volume fraction",
        prior=pybop.Gaussian(0.5, 0.05),
        bounds=[0.35, 0.75],
    ),
    pybop.Parameter(
        "Positive electrode active material volume fraction",
        prior=pybop.Gaussian(0.65, 0.05),
        bounds=[0.45, 0.85],
    ),
]

parameterisation = pybop.Parameterisation(
    model, observations=observations, fit_parameters=params
)

# get RMSE estimate using NLOpt
results, last_optim, num_evals = parameterisation.rmse(
    signal="Voltage [V]", method="nlopt"
)

Results in a runtime error as the built model uses the default parameter values.

In contrast, a working MWE:

import numpy as np
import pybamm
import pybop

# Form observations (getdata() is defined at the end of this issue)
x0 = np.array([0.55, 0.63])
solution = self.getdata(x0)

observations = [
    pybop.Observed("Time [s]", solution["Time [s]"].data),
    pybop.Observed("Current function [A]", solution["Current [A]"].data),
    pybop.Observed("Voltage [V]", solution["Terminal voltage [V]"].data),
]

# Define model
model = pybop.models.lithium_ion.SPM()
model.parameter_set = model.pybamm_model.default_parameter_values

# Fitting parameters
params = [
    pybop.Parameter(
        "Negative electrode active material volume fraction",
        prior=pybop.Gaussian(0.5, 0.05),
        bounds=[0.35, 0.75],
    ),
    pybop.Parameter(
        "Positive electrode active material volume fraction",
        prior=pybop.Gaussian(0.65, 0.05),
        bounds=[0.45, 0.85],
    ),
]

parameterisation = pybop.Parameterisation(
    model, observations=observations, fit_parameters=params
)

# get RMSE estimate using NLOpt
results, last_optim, num_evals = parameterisation.rmse(
    signal="Voltage [V]", method="nlopt"
)

Where getdata() is defined as:

def getdata(self, x0):
        model = pybamm.lithium_ion.SPM()
        params = model.default_parameter_values

        params.update(
            {
                "Negative electrode active material volume fraction": x0[0],
                "Positive electrode active material volume fraction": x0[1],
            }
        )
        experiment = pybamm.Experiment(
            [
                (
                    "Discharge at 2C for 5 minutes (1 second period)",
                    "Rest for 2 minutes (1 second period)",
                    "Charge at 1C for 5 minutes (1 second period)",
                    "Rest for 2 minutes (1 second period)",
                ),
            ]
            * 2
        )
        sim = pybamm.Simulation(model, experiment=experiment, parameter_values=params)
        return sim.solve()

Relevant log output

No response

Kalman / covariance propagation solver

This issue is to decide the implementation language for the solver required for #3.

Options discussed:

  • Pure Python
  • JAX
  • PyBaMM's IDAKLU (C++)

Currently, JAX is the option being considered, as it provides compiled performance as well as code portability across CPU/GPU/TPU.
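
For the JAX option, a linearised covariance propagation step can be written directly against jax.jacfwd; the function f below is a toy stand-in for one step of the forward model.

import jax
import jax.numpy as jnp

def f(x):
    # Toy one-step state update standing in for the forward model
    return jnp.array([x[0] + 0.1 * x[1], 0.9 * x[1]])

def propagate(x, P):
    """Propagate mean and covariance through f: P' = J P J^T."""
    J = jax.jacfwd(f)(x)
    return f(x), J @ P @ J.T

x0 = jnp.array([1.0, 0.5])
P0 = 0.01 * jnp.eye(2)
x1, P1 = propagate(x0, P0)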

Relocate BaseModel Input Interpolant functionality

Feature description

The addition of the Problem class provides a better location for constructing the interpolant function.

Motivation

  • Form the Interpolant function from the input dataset class
  • Pass only the formed Interpolant function to the BaseModel on build.

Possible implementation

No response

Additional context

No response

Add testing framework

Add an automated testing framework with GitHub Actions. Tools to consider:

  • Nox
  • Tox

Needs to:

  • Have the capability to test across Python versions
  • Make it easy to create isolated testing environments
  • Trigger on push, with scheduled testing triggers to be added at a later date
  • Have a low barrier to adding new tests
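
A minimal noxfile.py sketch along these lines (the Python versions, session name and the dev extra are placeholders):

import nox

@nox.session(python=["3.9", "3.10", "3.11"])
def unit(session):
    # Isolated virtualenv per Python version; assumes a "dev" extra providing pytest
    session.install("-e", ".[dev]")
    session.run("pytest", "tests")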
