A simple PyTorch `optim`-style optimizer based on gradient descent combined with washout filters.
Conceptually, the (possibly unstable) dynamics of running gradient descent on an objective function are treated as a control-theoretic "plant", and a feedback loop built around a washout filter is wrapped around it. Since a washout (high-pass) filter outputs zero at any steady state, the feedback vanishes at equilibria, so it can stabilize the iteration without shifting the minima of the objective.
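The idea can be sketched in plain Python (the function name, parameters, and gain values below are illustrative choices, not this package's actual API). The washout filter is realized as a low-pass state `w` that tracks the iterate `x`; the high-pass output `x - w` is fed back into the update and decays to zero at any fixed point:

```python
def washout_gd(grad, x0, lr, gain, a, steps):
    """Gradient descent with washout-filter feedback (illustrative sketch).

    grad : callable returning the objective's gradient at x
    lr   : gradient-descent step size
    gain : feedback gain on the washout (high-pass) output
    a    : washout filter pole, 0 < a < 1 (how fast w tracks x)
    """
    x = x0
    w = x0  # filter state starts at x0, so the initial feedback is zero
    for _ in range(steps):
        hp = x - w              # washout output: zero at any equilibrium
        w = w + a * (x - w)     # low-pass state slowly tracks the iterate
        x = x - lr * grad(x) + gain * hp
    return x

# On f(x) = x**2 with lr = 1.1, plain gradient descent diverges
# (the linear update factor is 1 - 2*lr = -1.2), but the washout
# feedback loop stabilizes the same step size.
grad = lambda x: 2.0 * x
x_stable = washout_gd(grad, 1.0, lr=1.1, gain=1.0, a=0.5, steps=50)
x_diverged = washout_gd(grad, 1.0, lr=1.1, gain=0.0, a=0.5, steps=50)
```

In the stabilized run the pair `(x, w)` evolves under a linear map whose spectral radius is below one for this choice of `gain` and `a`, so the iterates contract to the minimum; with `gain=0.0` the loop reduces to plain gradient descent and blows up.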