fzenke / spytorch
Tutorial for surrogate gradient learning in spiking neural networks
I don't understand why the 'rst' variable exists. It seems to always be equal to 'out'. Changing to rst = out yields the same results...

def spike_fn(x):
    out = torch.zeros_like(x)
    out[x > 0] = 1.0
    return out
...
# Here we loop over time
for t in range(nb_steps):
    mthr = mem - 1.0
    out = spike_fn(mthr)
    rst = torch.zeros_like(mem)
    c = (mthr > 0)
    rst[c] = torch.ones_like(mem)[c]
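For what it's worth, a tiny standalone check (my own construction, not from the tutorial) confirms the two tensors hold identical values in the forward pass; whether they differ in how gradients propagate through the graph is a separate question:

```python
import torch

# Hypothetical check that 'rst' and 'out' carry identical forward values:
# both are 1 exactly where mthr > 0 and 0 elsewhere.
torch.manual_seed(0)
mem = torch.randn(8, 8)            # stand-in membrane potentials
mthr = mem - 1.0

out = torch.zeros_like(mthr)       # forward value of spike_fn(mthr)
out[mthr > 0] = 1.0

rst = torch.zeros_like(mem)
c = (mthr > 0)
rst[c] = torch.ones_like(mem)[c]

print(torch.equal(out, rst))       # -> True
```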
I couldn't find any explicit mapping between label numbers and their content. Printing the contents of the extras shows the first 10 positions (0-9) occupied by English words, followed by the German words. I therefore deduce that labels 0-9 correspond to the English partition of the dataset. Could you please confirm?
Thank you in advance.
Hi, thanks for the tutorial!
I noticed that in Tutorial 4 the recurrent weights (v1) are not updating.
Do you have any suggestions?
Hey Friedemann,
thank you for the very comprehensive tutorial! I have a question about the way the recurrence is computed in Tutorial 4. If I understand the equation for the current dynamics correctly, the recurrence should be computed from the spiking neuron state:

mthr = mem - 1.0
out = spike_fn(mthr)
h1 = h1_from_input[:,t] + torch.einsum("ab,bc->ac", (out, v1))

Instead, Tutorial 4 keeps a separate hidden state that bypasses the spike function:

h1 = h1_from_input[:,t] + torch.einsum("ab,bc->ac", (h1, v1))

Is this done deliberately? Judging from a few simulated epochs, the two versions seem to perform similarly.
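To make the comparison concrete, here is a toy side-by-side of the two variants; all shapes, the leak factor, and the input drive are my own assumptions, not the tutorial's:

```python
import torch

# Toy contrast of recurrence through the spikes vs. through the hidden state.
torch.manual_seed(0)
batch, nb_hidden, nb_steps = 4, 16, 5
h1_from_input = torch.randn(batch, nb_steps, nb_hidden)
v1 = 0.1 * torch.randn(nb_hidden, nb_hidden)

def spike_fn(x):
    out = torch.zeros_like(x)
    out[x > 0] = 1.0
    return out

def run(spiking_recurrence):
    mem = torch.zeros(batch, nb_hidden)
    h1 = torch.zeros(batch, nb_hidden)
    spk_count = 0.0
    for t in range(nb_steps):
        out = spike_fn(mem - 1.0)
        fb = out if spiking_recurrence else h1        # the two variants
        h1 = h1_from_input[:, t] + torch.einsum("ab,bc->ac", fb, v1)
        mem = 0.9 * mem * (1.0 - out) + h1            # toy leaky integrate-and-reset
        spk_count += out.sum().item()
    return spk_count

print(run(True), run(False))   # both run; spike counts generally differ
```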
Thank you,
Simon
Hi Friedemann,
First of all, thanks a lot for these great tutorials. I've enjoyed playing with them a lot, and I've learned a lot :-)
One question: in the run_snn function, why do you bother constructing the "rst" tensor? Why not subtract the "out" tensor, which also contains the output spikes? I've tried it, and it seems to work.
Just curious.
Best,
Tim
Hey Friedemann,
thanks for making the examples available, they look very helpful.
However, to make them fully reproducible, I think some additional information about the technical dependencies is needed.
In particular: the list of software packages used (including version and build-variant information), plus a specification of the machine hardware (CPU architecture, GPUs).
Preferably, the former could be expressed as a recipe for building a container (a Dockerfile or, for better HPC compatibility, a
Singularity recipe), perhaps even using an explicitly versioned package manager such as Spack.
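As a stopgap until a proper recipe exists, even a short version dump in the notebooks would already help; a minimal sketch using only the standard library and standard PyTorch attributes:

```python
import platform
import torch

# Record at least the interpreter, framework, CUDA toolkit, and GPU.
# This is only a stopgap, not a substitute for a full container recipe.
gpu = torch.cuda.get_device_name(0) if torch.cuda.is_available() else "none"
print("python :", platform.python_version())
print("torch  :", torch.__version__)
print("cuda   :", torch.version.cuda)   # None on CPU-only builds
print("gpu    :", gpu)
```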
Cheers,
Eric
I have the impression that the spike recordings are shifted by one time step in all tutorials. Could you maybe check whether this is indeed the case?
From my understanding, time step 0 is recorded twice for the spikes: once during initialisation,

mem = torch.zeros((batch_size, nb_hidden), device=device, dtype=dtype)
spk_rec = [mem]

and once within the simulation of time step 0:

for t in range(nb_steps):
    mthr = mem - 1.0
    out = spike_fn(mthr)
    ...
    spk_rec.append(out)

As a result, the indices appear shifted when comparing

print(torch.nonzero((mem_rec - 1.0) > 0.0))
print(torch.nonzero(spk_rec))
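A stripped-down version of the bookkeeping shows the extra entry directly (the dynamics below are a dummy stand-in I made up; only the list handling mirrors the tutorial):

```python
import torch

# Reproduce the recording pattern: one entry before the loop,
# then one entry per simulated time step.
nb_steps, batch_size, nb_hidden = 4, 1, 3
mem = torch.zeros((batch_size, nb_hidden))
spk_rec = [mem]                      # entry recorded before the loop
for t in range(nb_steps):
    mem = mem + 0.6                  # dummy drive toward threshold
    out = ((mem - 1.0) > 0).float()
    spk_rec.append(out)              # entry recorded for step t
spk_rec = torch.stack(spk_rec, dim=1)
print(spk_rec.shape)                 # one more entry than nb_steps
```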
Thanks,
Simon
Hello,
I believe I ran into a possible issue here.
Because of line 37, the check in line 38 will always be false if one hasn't already got the uncompressed dataset.
Lines 35 to 42 in 9e91ece
If I change line 37 to:

hdf5_file_path = gz_file_path[:-3]

it works for me.
Best,
Aaron
Hello,
It was a very nice and interesting tutorial, thank you for preparing it...
Tutorial 1 ran without any problems, but in Tutorial 2 some dtype problems occurred. After fixing them, training was very slow on a GTX 980 (I have run some very deep models on this configuration). Could you please describe your configuration, along with the training and response times you observed?
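For reference, the dtype problems I hit look like the usual mixing of float64 data with float32 weights; a sketch of that pattern and its fix (a guess at the specific error, which may differ from what Tutorial 2 actually raises):

```python
import torch

# A typical dtype mismatch: data loaded as float64 fed to float32 weights.
dtype = torch.float
x = torch.randn(4, 3, dtype=torch.double)   # e.g. data arriving as float64
w = torch.randn(3, 2, dtype=dtype)
# torch.mm(x, w) would raise a "Double ... Float" scalar-type error.
y = torch.mm(x.to(dtype), w)                # cast to a single dtype up front
print(y.dtype)                              # torch.float32
```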