
Comments (5)

PAUL-BERNARD commented on July 21, 2024

Hello @DS-Liu 👋
Thank you for your issue.
Timeseries in the datasets module of ReservoirPy all have the shape (n_timesteps, n_features). Echo State Networks can be used on those timeseries to predict the following timesteps from the previous ones.

If you want to convert a timeseries into an (input, output) tuple for a prediction task, you can use the to_forecasting function from reservoirpy.datasets.
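For instance, a minimal sketch of that workflow; mackey_glass and the forecast horizon are arbitrary choices here:

from reservoirpy.datasets import mackey_glass, to_forecasting

# A single timeseries of shape (n_timesteps, n_features)
timeseries = mackey_glass(n_timesteps=2000)

# Shift the series by `forecast` steps to build an (input, target) pair:
# X[t] is the series at time t, y[t] is the series at time t + 10.
X, y = to_forecasting(timeseries, forecast=10)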


DS-Liu commented on July 21, 2024

I'm new to reservoir computing, but I think the input of the NARMA task should be the $u(t)$ sampled uniformly from [0, 0.5]. Am I wrong?


PAUL-BERNARD commented on July 21, 2024

I think the confusion comes from a naming clash: in the reservoir computing literature, u(t) usually represents the input timeseries, whereas the u(t) in the NARMA recurrence relation stands for "uniform"; it is simply a sample from a uniform distribution and is not meant to be used elsewhere.

In the end, the NARMA timeseries is a single timeseries of shape (n_timesteps, 1), and you can use reservoir computing to make predictions.
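As a rough sketch of that forecasting workflow with the usual Reservoir >> Ridge model (the hyperparameters and split below are arbitrary, and narma currently returns only the y series):

from reservoirpy.datasets import narma, to_forecasting
from reservoirpy.nodes import Reservoir, Ridge

timeseries = narma(n_timesteps=2000)           # shape (2000, 1)
X, y = to_forecasting(timeseries, forecast=1)  # predict the next timestep

esn = Reservoir(units=100, lr=0.3, sr=0.9) >> Ridge(ridge=1e-6)
esn = esn.fit(X[:1500], y[:1500], warmup=100)
y_pred = esn.run(X[1500:])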

Let me know if the misunderstanding persists.


DS-Liu commented on July 21, 2024

The $n$-th order NARMA task is defined as
$$y_{k+1} = \alpha y_k + \beta y_k\left(\sum_{j=0}^{n-1}y_{k-j}\right) + \gamma u_{k-n+1}u_k + \delta,$$
where $(\alpha,\beta,\gamma,\delta)$ are set to $(0.3,0.05,1.5,0.1)$, respectively.

In the quantum reservoir computing literature, such as

  1. Harnessing Disordered-Ensemble Quantum Dynamics for Machine Learning.
  2. Boosting Computational Power through Spatial Multiplexing in Quantum Reservoir Computing.
  3. Learning nonlinear input–output maps with dissipative quantum systems.
  4. Higher-Order Quantum Reservoir Computing.
  5. Dynamical Phase Transitions in Quantum Reservoir Computing.
  6. Unifying framework for information processing in stochastically driven dynamical systems.

the input to the reservoir at time step $k$ is $u_k$, which is sampled uniformly from [0, 0.2] to ensure the stability of the NARMA task.

I think this is reasonable, since the NARMA system is driven by $u_k$.
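To make the roles of $u_k$ and $y_k$ concrete, here is a minimal sketch of that setup (the order $n$, the number of timesteps and the seed are arbitrary; the constants follow the recurrence above):

import numpy as np

rng = np.random.default_rng(42)
alpha, beta, gamma, delta = 0.3, 0.05, 1.5, 0.1
n, n_timesteps = 10, 2000

# Input: u_k drawn uniformly from [0, 0.2] to keep the recurrence stable
u = rng.uniform(0, 0.2, size=n_timesteps)

y = np.zeros(n_timesteps)
for k in range(n - 1, n_timesteps - 1):
    y[k + 1] = (
        alpha * y[k]
        + beta * y[k] * np.sum(y[k - n + 1 : k + 1])  # sum_{j=0}^{n-1} y_{k-j}
        + gamma * u[k - n + 1] * u[k]
        + delta
    )
# The reservoir would be driven by u and trained to output y.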

However, you mentioned that

If you want to convert a timeseries into an (input, output) tuple for a prediction task, you can use the to_forecasting method.

which means that, for the NARMA task, the echo state at time $k$ is driven by its previous output $y_{k-1}$. This is equivalent to an echo state network that has no input but has an output feedback connection.

Did these papers perform the NARMA task incorrectly?


PAUL-BERNARD commented on July 21, 2024

Hello again, and thank you for your well-placed perseverance :)
Indeed, many papers use the uniformly distributed timeseries as the input for benchmarking. This will be fixed in ReservoirPy v0.3.11.

In the meantime, if you want to use the NARMA task for your benchmarking, you can re-implement the function so that it also returns u(t):

# NOTE: get_seed, rand_generator and check_vector are ReservoirPy internal helpers;
# the import paths below follow the v0.3 source tree and may differ between versions.
from typing import Tuple, Union

import numpy as np
from numpy.random import RandomState

from reservoirpy.datasets import get_seed
from reservoirpy.utils.random import rand_generator
from reservoirpy.utils.validation import check_vector


def narma(
    n_timesteps: int,
    order: int = 30,
    a1: float = 0.2,
    a2: float = 0.04,
    b: float = 1.5,
    c: float = 0.001,
    x0: Union[list, np.ndarray] = [0.0],
    seed: Union[int, RandomState] = None,
) -> Tuple[np.ndarray, np.ndarray]:
    if seed is None:
        seed = get_seed()
    rs = rand_generator(seed)

    y = np.zeros((n_timesteps + order, 1))

    # Initial conditions for the first timesteps
    x0 = check_vector(np.atleast_2d(np.asarray(x0)))
    y[: x0.shape[0], :] = x0

    # Input timeseries driving the NARMA recurrence
    u = rs.uniform(0, 0.5, size=(n_timesteps + order, 1))
    for t in range(order, n_timesteps + order - 1):
        y[t + 1] = (
            a1 * y[t]
            + a2 * y[t] * np.sum(y[t - order : t])
            + b * u[t - order] * u[t]
            + c
        )
    # Return both the input u and the target y; the "order" warm-up steps are
    # removed from y, so u is longer than y by `order` timesteps.
    return u, y[order:, :]
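You could then drive the reservoir with u and train it to output y, for instance (the hyperparameters are arbitrary, and dropping the warm-up steps from u is only one possible way to align the two series):

from reservoirpy.nodes import Reservoir, Ridge

u, y = narma(n_timesteps=2000, order=10)
u = u[10:]  # align u with y, whose 10 warm-up steps were removed

esn = Reservoir(units=100, lr=0.3, sr=0.9) >> Ridge(ridge=1e-6)
esn = esn.fit(u[:1500], y[:1500], warmup=100)
y_pred = esn.run(u[1500:])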

Sorry for my misunderstanding, and thank you again for your issue.

