ddobric / neocortexapi

C#.NET Implementation of Hierarchical Temporal Memory Cortical Learning Algorithm.

Home Page: https://ddobric.github.io/neocortexapi/

License: GNU Affero General Public License v3.0

C# 97.39% TypeScript 2.34% SCSS 0.01% Sass 0.01% PowerShell 0.23% Dockerfile 0.02%
spatialpooler algorithm htm ai cortex ml

neocortexapi's Introduction


Introduction

This repository is the open source implementation of Hierarchical Temporal Memory (HTM) in C#/.NET Core. It contains a set of libraries built around the NeoCortex API .NET Core library. NeoCortex API focuses on the implementation of the Hierarchical Temporal Memory Cortical Learning Algorithm. The current version is the first implementation of this algorithm on the .NET platform. It includes the Spatial Pooler, Temporal Pooler, various encoders, and CorticalNetwork algorithms. The implementation of this library aligns with the existing Python and Java implementations of HTM. Due to the similarities between Java and C#, the current C# API of the SpatialPooler is very similar to the Java API. However, future versions will include some changes to the API style, to align it more closely with C# conventions. This repository also contains the first experimental implementation of a distributed, highly scalable HTM CLA based on the Actor Programming Model. The code published here is experimental code implemented during my research at daenet and Frankfurt University of Applied Sciences.
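
Since the current C# API closely mirrors the Java API, a minimal usage sketch looks roughly like the one below. It is a sketch only: the exact constructor arguments and method names (e.g. compute vs. Compute and its overloads) vary between library versions, so treat the identifiers as assumptions and refer to the getting-started document for the official examples.

using System;
using System.Linq;
using NeoCortexApi;
using NeoCortexApi.Entities;

// Minimal sketch (not the official sample): configure a small HTM area, initialize
// the Spatial Pooler and compute the SDR of one binary input vector.
// Exact signatures (e.g. compute vs. Compute) may differ between versions.
var cfg = new HtmConfig(new int[] { 100 }, new int[] { 1024 });
var mem = new Connections(cfg);
var sp = new SpatialPooler();
sp.Init(mem);

// A synthetic 100-bit input vector (every 5th bit set).
int[] input = Enumerable.Range(0, 100).Select(i => i % 5 == 0 ? 1 : 0).ToArray();

// Output buffer with one slot per mini-column; 1 marks an active column.
int[] activeArray = new int[1024];
sp.compute(input, activeArray, true);

int[] sdr = activeArray.Select((v, i) => new { v, i }).Where(x => x.v == 1).Select(x => x.i).ToArray();
Console.WriteLine($"Active mini-columns: {string.Join(", ", sdr)}");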

Getting started

To get started, please see this document.

References

HTM School: https://www.youtube.com/playlist?list=PL3yXMgtrZmDqhsFQzwUC9V8MeeVOQ7eZ9&app=desktop

HTM Overview: https://en.wikipedia.org/wiki/Hierarchical_temporal_memory

A Machine Learning Guide to HTM: https://numenta.com/blog/2019/10/24/machine-learning-guide-to-htm

Numenta on Github: https://github.com/numenta

HTM Community: https://numenta.org/

A deep dive in HTM Temporal Memory algorithm: https://numenta.com/assets/pdf/temporal-memory-algorithm/Temporal-Memory-Algorithm-Details.pdf

Continuous Online Sequence Learning with HTM: https://www.mitpressjournals.org/doi/full/10.1162/NECO_a_00893#.WMBBGBLytE6

Papers and conference proceedings

International Journal of Artificial Intelligence and Applications

Scaling the HTM Spatial Pooler

Dobric, Pech, Ghita, Wennekers (2020). International Journal of Artificial Intelligence and Applications. Scaling the HTM Spatial Pooler. doi:10.5121/ijaia.2020.11407

AIS 2020 - 6th International Conference on Artificial Intelligence and Soft Computing (AIS 2020), Helsinki

The Parallel HTM Spatial Pooler with Actor Model

Dobric, Pech, Ghita, Wennekers (2020). AIS 2020 - 6th International Conference on Artificial Intelligence and Soft Computing, Helsinki. The Parallel HTM Spatial Pooler with Actor Model. https://aircconline.com/csit/csit1006.pdf, doi:10.5121/csit.2020.100606

Symposium on Pattern Recognition and Applications - Rome, Italy

On the Relationship Between Input Sparsity and Noise Robustness in Hierarchical Temporal Memory Spatial Pooler

Dobric, Pech, Ghita, Wennekers (2020). Symposium on Pattern Recognition and Applications. On the Relationship Between Input Sparsity and Noise Robustness in Hierarchical Temporal Memory Spatial Pooler. https://dl.acm.org/doi/10.1145/3393822.3432317. doi:10.1145/3393822.3432317

International Conference on Pattern Recognition Applications and Methods - ICPRAM 2021

Improved HTM Spatial Pooler with Homeostatic Plasticity Control (Awarded with: Best Industrial Paper)

Dobric, Pech, Ghita, Wennekers (2021). ICPRAM 2021, Vienna. Improved HTM Spatial Pooler with Homeostatic Plasticity Control. doi:10.5220/0010314200980106

Springer Nature - Computer Sciences

On the Importance of the Newborn Stage When Learning Patterns with the Spatial Pooler

Dobric, Pech, Ghita, Wennekers (2022). Springer Nature Computer Science Journal. On the Importance of the Newborn Stage When Learning Patterns with the Spatial Pooler. https://rdcu.be/cIcoc. doi:10.1007/s42979-022-01066-4

Contribute

If you want to contribute to this project, please contact us by opening an issue.

neocortexapi's People

Contributors

baotrung1309, buinhatquang, ddobric, mahdiehpirmoradian, mounikakolisetty, noath2302, quangbui3101, sabinbajracharya, sahithkumar1999


neocortexapi's Issues

Project Submission->Noise Robustness in Spatial Similarity

Please Check The Project plan

Hello Professor, please check my project plan to see whether I am on the right track or heading in the wrong direction. My project is Approve Prediction of Multisequence Learning.

My plan is:
• We will define the file formats for learning and testing sequences, likely CSV or JSON.
• We will plan how the system will learn from the learning file by feeding its sequences to HTM.
• Then we will design the testing process to read the test file and use HTM to predict subsequences from it.
• We will determine how accuracy will be calculated by comparing the predicted elements to the actual next elements in the test file.
• Then we will implement a LoadSequencesFromFile method to load and parse sequences from files.
• Then I will modify the learning method Run() to handle the parsed file input.
• We will create PredictAndCalculateAccuracy to use the trained model for predictions and calculate accuracy.
• Then we will write RunPredictionMultiSequenceExperiment as the main method to orchestrate loading, learning, predicting, and calculating accuracy (a rough sketch of these methods follows below).
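
For illustration, here is a minimal sketch of how LoadSequencesFromFile and PredictAndCalculateAccuracy could look. Only the method names come from the plan above; the CSV layout (one comma-separated sequence per line) and the predictor delegate are assumptions used purely to make the sketch self-contained.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Hypothetical sketch for the plan above, not existing project code.
public static class MultiSequenceHelpers
{
    // Loads sequences from a CSV file: each non-empty line is one sequence of numeric values.
    public static Dictionary<string, List<double>> LoadSequencesFromFile(string path)
    {
        var sequences = new Dictionary<string, List<double>>();
        int index = 0;
        foreach (var line in File.ReadLines(path).Where(l => !string.IsNullOrWhiteSpace(l)))
        {
            var values = line.Split(',').Select(double.Parse).ToList();
            sequences.Add($"S{index++}", values);
        }
        return sequences;
    }

    // Compares the predicted next elements with the actual next elements of a test sequence
    // and returns the accuracy in percent.
    public static double PredictAndCalculateAccuracy(List<double> testSequence, Func<double, double> predictNext)
    {
        int matches = 0;
        for (int i = 0; i < testSequence.Count - 1; i++)
        {
            if (Math.Abs(predictNext(testSequence[i]) - testSequence[i + 1]) < double.Epsilon)
                matches++;
        }
        return 100.0 * matches / (testSequence.Count - 1);
    }
}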

Spatial Similarity individual project update

Hello Professor @ddobric ,

How are you? I am Jayashree Regoti (matriculation number: 1324798), an Information Technology master's student at Frankfurt University of Applied Sciences. I am doing the Spatial Similarity individual project under your guidance. As my project submission deadline is near (October 19th), I would like to get your suggestions on my report for this project. If you can arrange a meeting, I can briefly summarize my project and you can advise me on what to do and what not to do.

Here is the link to my repository
https://github.com/JayashreeRegoti/neocortexapi/blob/master/NeoCortexApi/NeoCortexApi.Experiments/SpatialSimilarityExperiment.cs

Waiting for your response.

Regards,
Jayashree

Need help executing the Temporal Memory video learning experiment

Dear all, I need help executing temporal learning on encoded videos:
Frame_0 Frame_1 Frame_2 Frame_3 Frame_4 Frame_5
These are 120x120 pictures, which I have scaled down to 10x10 in the image-reading code. Each video is currently represented as an ImageSet instance, which has:
an IDName - the name of the video,
and an ImageBinValue - a list of boolean arrays (each boolean array is one frame of the video, length 10x10).

The ball videos were created in Python with the variables ([speed] = pixel/frame; [radius] = pixel; [vector angle in Cartesian] = degree). The script generates a video of the ball bouncing inside the window as jpg frames.
I expect the CLA to correctly predict the trajectory of the ball given an input of 2 consecutive frames.
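
For reference, a minimal sketch of how such an ImageSet container could be declared is shown below. Only the member names IDName and ImageBinValue come from the description above; the types and the helper method are assumptions.

using System.Collections.Generic;

// Sketch of the container described above: one instance per encoded video.
public class ImageSet
{
    // Name of the video, e.g. "R40_Angle-100_Speed10".
    public string IDName { get; set; }

    // One boolean array per frame; each array has 10 x 10 = 100 elements.
    public List<bool[]> ImageBinValue { get; set; } = new List<bool[]>();

    // Converts one frame into the 0/1 input vector expected by the Spatial Pooler.
    public int[] FrameToInputVector(int frameIndex)
    {
        bool[] frame = ImageBinValue[frameIndex];
        int[] input = new int[frame.Length];
        for (int i = 0; i < frame.Length; i++)
            input[i] = frame[i] ? 1 : 0;
        return input;
    }
}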

Example of ImageSet:
each line of the binary array represents one frame (radius 40 pixels, angle -100 degrees, speed = 10 pixels/frame):

R40_Angle-100_Speed10
0000000000000011111000011111100001111111000111111100011111100000111100000000000000000000000000000000
0000000000000111110000011111100011111110001111111000011111100000111100000000000000000000000000000000
0000000000000011000000111110000011111100001111110000111111000011111100000111100000000000000000000000
0000000000000010000000111110000111111000011111100001111110000111111000001111000000000000000000000000
0000000000000000000000011110000011111100001111110000111111000011111100000111100000000000000000000000
0000000000000000000000011111000001111110001111111000111111100001111110000011110000000000000000000000
0000000000000000000000001111000001111110000111111100011111110001111110000011111000000000000000000000
0000000000000000000000000111100000111111000011111100001111110000111111000001111000000000000000000000
0000000000000000000000001111000000111110000111111100011111110001111110000011111000000010000000000000
0000000000000000000000001110000001111100001111111000111111100001111110000111110000000100000000000000
0000000000000000000000011110000011111100001111110000111111000011111100001111100000001100000000000000
0000000000000000000000011100000111111000011111100001111110000111111000001111100000011000000000000000
0000000000000111100000111111000011111100001111110000111111000001111000000000000000000000000000000000
0000000000000000000000001100000011111000001111110000111111000011111100001111110000011100000000000000
0000000000000000000000000110000001111100000111111000111111100011111110000111110000001110000000000000
0000000000000000000000000010000000111110000111111000011111110001111111000011111000000111000000000000
0000000000000000000000000001000000011111000011111100001111110000111111000011111100000111100000000000
0000000000000000000000000000000000111110000111111000011111110001111111000111111000001111000000000000
0000000000000000000000000000000001111100000111111000111111100011111110000111111000001111000000000000
0000000000000000000000000000000001111000001111110000111111000011111100001111110000011110000000000000
0000000000000000000000000000000011110000011111100001111110000111111000011111100000111100000000000000
0000000000000000000000000000000001111000001111110000111111000011111100001111110000111110000000100000
0000000000000000000000000000000000111000000111110000111111100011111110000111111000011111000000010000
0000000000001111000001111110000111111000011111100001111110000011110000000000000000000000000000000000
0000000000000000000000000000000000011100000011111000011111110001111111000111111000001111100000011000
0000000000000000000000000000000000001110000011111100001111110000111111000011111100000111110000001100
0000000000000000000000000000000000011100000011111000011111110001111111000111111100001111100000011100
0000000000000000000000000000000000011000000111110000011111100011111110001111111000011111000000111000
0000000000000000000000000000000000110000001111100000111111000011111100001111110000111111000001111000
0000000000000000000000000000000000100000001111100001111110000111111000011111100001111110000011110000
0000000000000000000000000000000000000000000111100000111111000011111100001111110000111111000001111000
0000000000000000000000000000000000000000000111110000011111100011111110001111111000011111100000111100
0000000000000000000000000000000000000000000011110000011111100001111111000111111100011111100000111110
0000000000000000000000000000000000000000000001111000001111110000111111000011111100001111110000011110
0000000000000111100000111111000011111100001111110000111111000011111000000010000000000000000000000000
0000000000000000000000000000000000000000000011110000011111100001111111000111111100011111100000111110
0000000000000000000000000000000000000000000111110000011111100011111110001111111000011111100000111100
0000000000000000000000000000000000000000000111100000111111000011111100001111110000111111000001111000
0000000000000000000000000000000000100000001111100001111110000111111000011111100001111110000011110000
0000000000000000000000000000000000110000001111100000111111000011111100001111110000111111000001111000
0000000000000000000000000000000000011000000111110000011111100011111110001111111000011111000000111000
0000000000000000000000000000000000011100000011111000011111110001111111000111111100001111100000011100
0000000000000000000000000000000000001110000011111100001111110000111111000011111100000111110000001100
0000000000000000000000000000000000011100000011111000011111110001111111000111111000001111100000011000
0000000000000000000000000000000000111000000111110000111111100011111110000111111000011111000000010000
0000000000000011100000011111000011111110001111111000011111100001111100000001000000000000000000000000
0000000000000000000000000000000001111000001111110000111111000011111100001111110000111110000000100000
0000000000000000000000000000000011110000011111100001111110000111111000011111100000111100000000000000
0000000000000000000000000000000001111000001111110000111111000011111100001111110000011110000000000000
0000000000000000000000000000000001111100000111111000111111100011111110000111111000001111000000000000
0000000000000000000000000000000000111110000111111000011111110001111111000111111000001111000000000000
0000000000000000000000000001000000011111000011111100001111110000111111000011111100000111100000000000
0000000000000000000000000010000000111110000111111000011111110001111111000011111000000111000000000000
0000000000000000000000000110000001111100000111111000111111100011111110000111110000001110000000000000
0000000000000000000000001100000011111000001111110000111111000011111100001111110000011100000000000000
0000000000000000000000011100000111111000011111100001111110000111111000001111100000011000000000000000
0000000000000001110000001111100001111111000111111100011111100000111110000001100000000000000000000000
0000000000000000000000011110000011111100001111110000111111000011111100001111100000001100000000000000
0000000000000000000000001110000001111100001111111000111111100001111110000111110000000100000000000000
0000000000000000000000001111000000111110000111111100011111110001111110000011111000000010000000000000
0000000000000000000000000111100000111111000011111100001111110000111111000001111000000000000000000000
0000000000000000000000001111000001111110000111111100011111110001111110000011111000000000000000000000
0000000000000000000000011111000001111110001111111000111111100001111110000011110000000000000000000000
0000000000000000000000011110000011111100001111110000111111000011111100000111100000000000000000000000
0000000000000010000000111110000111111000011111100001111110000111111000001111000000000000000000000000
0000000000000011000000111110000011111100001111110000111111000011111100000111100000000000000000000000
0000000000000001100000011111000001111110001111111000111111100001111100000011100000000000000000000000
0000000000000000111000001111110000111111000011111100001111110000011111000000110000000000000000000000
0000000000000001110000001111100001111111000111111100011111110000111110000001110000000000000000000000
0000000000000000111000001111110000111111000011111100001111110000011111000000110000000000000000000000
0000000000000001110000001111100001111111000111111100011111100000111110000001100000000000000000000000
0000000000000011100000011111000011111110001111111000011111100001111100000001000000000000000000000000
0000000000000111100000111111000011111100001111110000111111000011111000000010000000000000000000000000
0000000000001111000001111110000111111000011111100001111110000011110000000000000000000000000000000000
0000000000000111100000111111000011111100001111110000111111000001111000000000000000000000000000000000
0000000000000111110000011111100011111110001111111000011111100000111100000000000000000000000000000000
0000000000000011111000011111100001111111000111111100011111100000111100000000000000000000000000000000
0000000100000001111100001111110000111111000011111100001111110000011110000000000000000000000000000000
0000000000000001110000001111100001111111000111111100011111110000111110000001110000000000000000000000
0000000000000011111000011111100001111111000111111100011111100000111100000000000000000000000000000000
0000000000000111110000011111100011111110001111111000011111100000111100000000000000000000000000000000
0000000000000111100000111111000011111100001111110000111111000001111000000000000000000000000000000000
0000000000001111000001111110000111111000011111100001111110000011110000000000000000000000000000000000
0000000000000111100000111111000011111100001111110000111111000011111000000010000000000000000000000000

Investigate slow activation of the second half of the dataset during SP learning

Cycle 0

Inputs with i > 50 do not activate any mini-columns:

[cycle=0000, i=42, cols=:16 s=-1] SDR: 629, 719, 738, 776, 890, 897, 926, 931, 941, 962, 1016, 1031, 1032, 1038, 1062, 1105, 
[cycle=0000, i=43, cols=:15 s=-1] SDR: 776, 792, 897, 926, 931, 941, 962, 994, 1016, 1031, 1032, 1038, 1105, 1124, 1125, 
[cycle=0000, i=44, cols=:19 s=-1] SDR: 776, 792, 886, 897, 926, 931, 941, 952, 962, 994, 999, 1016, 1031, 1032, 1038, 1105, 1124, 1125, 1127, 
[cycle=0000, i=45, cols=:19 s=-1] SDR: 776, 792, 886, 889, 897, 926, 931, 941, 952, 962, 988, 994, 999, 1016, 1031, 1032, 1105, 1119, 1124, 
[cycle=0000, i=46, cols=:18 s=-1] SDR: 776, 796, 886, 889, 897, 941, 962, 988, 994, 999, 1016, 1019, 1031, 1042, 1119, 1124, 1156, 1193, 
[cycle=0000, i=47, cols=:14 s=-1] SDR: 772, 776, 889, 931, 941, 962, 988, 999, 1019, 1031, 1042, 1124, 1156, 1193, 
[cycle=0000, i=48, cols=:12 s=-1] SDR: 772, 860, 962, 988, 1029, 1031, 1042, 1092, 1156, 1193, 1213, 1222, 
[cycle=0000, i=49, cols=:11 s=-1] SDR: 772, 921, 930, 988, 1029, 1042, 1092, 1201, 1213, 1222, 1246, 
[cycle=0000, i=50, cols=:12 s=-1] SDR: 772, 800, 810, 860, 872, 988, 1029, 1092, 1156, 1201, 1246, 1299, 
[cycle=0000, i=51, cols=:0 s=-1] SDR: 
[cycle=0000, i=52, cols=:0 s=-1] SDR: 
[cycle=0000, i=53, cols=:0 s=-1] SDR: 
[cycle=0000, i=54, cols=:0 s=-1] SDR: 
[cycle=0000, i=55, cols=:0 s=-1] SDR: 
[cycle=0000, i=56, cols=:0 s=-1] SDR: 
[cycle=0000, i=57, cols=:0 s=-1] SDR: 

Cycle 32

Still not activated:


[cycle=0032, i=41, cols=:15 s=93,33333333333333] SDR: 609, 623, 640, 651, 717, 723, 812, 865, 876, 916, 944, 947, 1010, 1022, 1041, 
[cycle=0032, i=42, cols=:11 s=100] SDR: 640, 651, 723, 812, 876, 916, 919, 944, 947, 1022, 1041, 
[cycle=0032, i=43, cols=:9 s=100] SDR: 651, 723, 812, 865, 876, 919, 944, 1022, 1041, 
[cycle=0032, i=44, cols=:8 s=100] SDR: 723, 812, 826, 865, 876, 944, 1022, 1041, 
[cycle=0032, i=45, cols=:8 s=100] SDR: 723, 812, 826, 865, 876, 944, 1022, 1041, 
[cycle=0032, i=46, cols=:4 s=100] SDR: 812, 826, 944, 1041, 
[cycle=0032, i=47, cols=:8 s=100] SDR: 781, 806, 812, 826, 919, 944, 1010, 1041, 
[cycle=0032, i=48, cols=:5 s=100] SDR: 806, 812, 826, 919, 1041, 
[cycle=0032, i=49, cols=:5 s=100] SDR: 806, 812, 826, 919, 1041, 
[cycle=0032, i=50, cols=:2 s=100] SDR: 806, 812, 
[cycle=0032, i=51, cols=:0 s=-1] SDR: 
[cycle=0032, i=52, cols=:0 s=-1] SDR: 
[cycle=0032, i=53, cols=:0 s=-1] SDR: 
[cycle=0032, i=54, cols=:0 s=-1] SDR: 
[cycle=0032, i=55, cols=:0 s=-1] SDR: 
[cycle=0032, i=56, cols=:0 s=-1] SDR: 

Cycle 39

Still not activated, but i=50 activates a new mini-column 888.

[cycle=0039, i=41, cols=:15 s=93,75] SDR: 609, 640, 650, 651, 717, 723, 812, 865, 876, 916, 944, 947, 1010, 1022, 1041, 
[cycle=0039, i=42, cols=:12 s=100] SDR: 640, 650, 651, 723, 812, 876, 916, 919, 944, 947, 1022, 1041, 
[cycle=0039, i=43, cols=:10 s=100] SDR: 650, 651, 723, 812, 865, 876, 919, 944, 1022, 1041, 
[cycle=0039, i=44, cols=:8 s=100] SDR: 723, 812, 826, 865, 876, 944, 1022, 1041, 
[cycle=0039, i=45, cols=:9 s=88,88888888888889] SDR: 723, 812, 826, 865, 876, 888, 944, 1022, 1041, 
[cycle=0039, i=46, cols=:4 s=100] SDR: 812, 826, 944, 1041, 
[cycle=0039, i=47, cols=:9 s=88,88888888888889] SDR: 781, 806, 812, 826, 888, 919, 944, 1010, 1041, 
[cycle=0039, i=48, cols=:6 s=83,33333333333334] SDR: 806, 812, 826, 888, 919, 1041, 
[cycle=0039, i=49, cols=:6 s=83,33333333333334] SDR: 806, 812, 826, 888, 919, 1041, 
[cycle=0039, i=50, cols=:3 s=66,66666666666666] SDR: 806, 812, **888**, 
[cycle=0039, i=51, cols=:0 s=-1] SDR: 
[cycle=0039, i=52, cols=:0 s=-1] SDR: 
[cycle=0039, i=53, cols=:0 s=-1] SDR: 
[cycle=0039, i=54, cols=:0 s=-1] SDR: 
[cycle=0039, i=55, cols=:0 s=-1] SDR: 
[cycle=0039, i=56, cols=:0 s=-1] SDR: 

Cycle 40

Activation of mini-columns starts for inputs with i > 50:


[cycle=0040, i=44, cols=:8 s=100] SDR: 723, 812, 826, 865, 876, 944, 1022, 1041, 
[cycle=0040, i=45, cols=:9 s=100] SDR: 723, 812, 826, 865, 876, 888, 944, 1022, 1041, 
[cycle=0040, i=46, cols=:4 s=100] SDR: 812, 826, 944, 1041, 
[cycle=0040, i=47, cols=:9 s=100] SDR: 781, 806, 812, 826, 888, 919, 944, 1010, 1041, 
[cycle=0040, i=48, cols=:6 s=100] SDR: 806, 812, 826, 888, 919, 1041, 
[cycle=0040, i=49, cols=:6 s=100] SDR: 806, 812, 826, 888, 919, 1041, 
[cycle=0040, i=50, cols=:3 s=100] SDR: 806, 812, 888, 
[cycle=0040, i=51, cols=:4 s=-1] SDR: 806, 812, 888, 916, 
[cycle=0040, i=52, cols=:3 s=-1] SDR: 865, 919, 1041, 
[cycle=0040, i=53, cols=:4 s=-1] SDR: 876, 919, 1010, 1041, 
[cycle=0040, i=54, cols=:5 s=-1] SDR: 865, 876, 919, 957, 1041, 
[cycle=0040, i=55, cols=:4 s=-1] SDR: 865, 876, 947, 957, 
[cycle=0040, i=56, cols=:2 s=-1] SDR: 947, 957, 
[cycle=0040, i=57, cols=:3 s=-1] SDR: 947, 957, 1041, 
[cycle=0040, i=58, cols=:2 s=-1] SDR: 957, 1022, 
[cycle=0040, i=59, cols=:3 s=-1] SDR: 957, 1022, 1041, 
[cycle=0040, i=60, cols=:2 s=-1] SDR: 957, 1022, 
[cycle=0040, i=61, cols=:2 s=-1] SDR: 1022, 1041, 
[cycle=0040, i=62, cols=:2 s=-1] SDR: 1022, 1041, 
[cycle=0040, i=63, cols=:2 s=-1] SDR: 1022, 1041, 
[cycle=0040, i=64, cols=:0 s=-1] SDR: 
[cycle=0040, i=65, cols=:0 s=-1] SDR: 
[cycle=0040, i=66, cols=:0 s=-1] SDR: 
[cycle=0040, i=67, cols=:0 s=-1] SDR: 
[cycle=0040, i=68, cols=:0 s=-1] SDR: 
[cycle=0040, i=69, cols=:0 s=-1] SDR: 

Cycle 43

Activating

[cycle=0043, i=42, cols=:41 s=100] SDR: 622, 631, 637, 658, 682, 691, 697, 700, 707, 723, 745, 776, 780, 785, 803, 827, 828, 830, 843, 844, 875, 876, 880, 891, 897, 923, 925, 926, 931, 941, 944, 956, 962, 974, 1007, 1027, 1032, 1038, 1072, 1082, 1105, 
[cycle=0043, i=43, cols=:41 s=100] SDR: 656, 658, 691, 700, 707, 723, 745, 760, 775, 776, 780, 783, 801, 803, 830, 843, 844, 853, 861, 875, 891, 896, 897, 915, 926, 931, 937, 941, 956, 975, 995, 1007, 1027, 1032, 1038, 1050, 1072, 1074, 1105, 1115, 1124, 
[cycle=0043, i=44, cols=:41 s=100] SDR: 658, 691, 700, 707, 713, 723, 760, 776, 780, 783, 830, 843, 844, 853, 861, 878, 886, 889, 891, 895, 896, 897, 902, 922, 926, 941, 975, 981, 995, 1004, 1007, 1016, 1027, 1032, 1057, 1072, 1074, 1105, 1115, 1124, 1136, 
[cycle=0043, i=45, cols=:41 s=100] SDR: 676, 687, 691, 754, 760, 769, 776, 780, 783, 812, 826, 830, 844, 853, 857, 861, 865, 878, 882, 886, 889, 891, 896, 897, 899, 941, 956, 960, 961, 962, 975, 988, 1007, 1027, 1031, 1045, 1057, 1132, 1136, 1139, 1171, 
[cycle=0043, i=46, cols=:41 s=100] SDR: 691, 779, 780, 796, 810, 812, 826, 830, 853, 861, 871, 878, 882, 883, 889, 915, 937, 938, 941, 960, 963, 965, 974, 975, 988, 995, 1007, 1019, 1027, 1029, 1031, 1041, 1042, 1044, 1045, 1057, 1070, 1085, 1139, 1156, 1171, 
[cycle=0043, i=47, cols=:41 s=100] SDR: 746, 772, 780, 783, 800, 810, 811, 812, 830, 853, 889, 896, 915, 919, 934, 937, 941, 960, 965, 975, 980, 986, 988, 995, 999, 1007, 1019, 1021, 1027, 1029, 1031, 1040, 1041, 1042, 1070, 1085, 1089, 1091, 1139, 1145, 1207, 
[cycle=0043, i=48, cols=:41 s=100] SDR: 772, 779, 780, 800, 806, 810, 811, 812, 818, 831, 841, 843, 857, 864, 896, 915, 919, 934, 937, 960, 962, 965, 975, 980, 986, 988, 995, 1007, 1021, 1025, 1026, 1029, 1030, 1036, 1041, 1087, 1089, 1139, 1155, 1169, 1207, 
[cycle=0043, i=49, cols=:41 s=100] SDR: 772, 779, 800, 806, 810, 811, 812, 823, 841, 844, 863, 864, 878, 883, 888, 896, 921, 922, 934, 937, 960, 987, 988, 1002, 1007, 1021, 1026, 1029, 1036, 1040, 1047, 1080, 1088, 1089, 1092, 1112, 1139, 1169, 1207, 1219, 1258, 
[cycle=0043, i=50, cols=:41 s=100] SDR: 779, 800, 806, 810, 811, 812, 823, 835, 858, 863, 864, 872, 883, 888, 934, 937, 940, 943, 966, 1014, 1026, 1029, 1040, 1044, 1067, 1075, 1080, 1088, 1089, 1092, 1103, 1111, 1112, 1117, 1122, 1139, 1155, 1163, 1169, 1258, 1262, 
[cycle=0043, i=51, cols=:41 s=100] SDR: 800, 806, 810, 811, 823, 835, 841, 858, 863, 864, 870, 901, 914, 934, 937, 940, 943, 966, 976, 987, 1014, 1025, 1026, 1029, 1040, 1051, 1067, 1075, 1088, 1089, 1112, 1117, 1139, 1155, 1163, 1169, 1219, 1250, 1258, 1262, 1275, 
[cycle=0043, i=52, cols=:41 s=100] SDR: 810, 811, 835, 858, 863, 864, 870, 901, 906, 919, 924, 934, 937, 940, 943, 966, 987, 996, 1002, 1005, 1014, 1015, 1024, 1025, 1075, 1077, 1088, 1089, 1090, 1103, 1112, 1117, 1128, 1139, 1153, 1155, 1258, 1262, 1263, 1275, 1284, 
[cycle=0043, i=53, cols=:41 s=100] SDR: 845, 858, 863, 864, 870, 879, 901, 904, 906, 919, 924, 934, 943, 966, 977, 978, 987, 996, 1014, 1015, 1024, 1025, 1034, 1047, 1067, 1075, 1077, 1088, 1090, 1103, 1112, 1139, 1155, 1158, 1162, 1196, 1199, 1211, 1240, 1262, 1301, 
[cycle=0043, i=54, cols=:41 s=100] SDR: 845, 858, 863, 864, 870, 879, 901, 906, 919, 933, 943, 966, 987, 996, 1014, 1017, 1025, 1033, 1034, 1063, 1064, 1066, 1067, 1069, 1077, 1088, 1089, 1090, 1092, 1093, 1103, 1112, 1128, 1140, 1146, 1196, 1211, 1262, 1268, 1313, 1327, 
[cycle=0043, i=55, cols=:41 s=100] SDR: 876, 879, 901, 924, 933, 936, 943, 945, 957, 978, 981, 987, 996, 1001, 1014, 1017, 1033, 1034, 1044, 1061, 1063, 1064, 1066, 1067, 1069, 1077, 1090, 1100, 1131, 1145, 1146, 1160, 1175, 1190, 1196, 1199, 1211, 1215, 1240, 1268, 1356, 
[cycle=0043, i=56, cols=:41 s=100] SDR: 901, 904, 936, 946, 953, 957, 961, 966, 987, 991, 1001, 1005, 1017, 1027, 1028, 1033, 1034, 1043, 1061, 1063, 1069, 1072, 1077, 1131, 1135, 1145, 1146, 1156, 1160, 1175, 1190, 1196, 1199, 1211, 1240, 1241, 1262, 1305, 1337, 1356, 1375, 
[cycle=0043, i=57, cols=:41 s=100] SDR: 901, 924, 936, 946, 953, 957, 960, 961, 973, 1001, 1017, 1028, 1033, 1034, 1047, 1063, 1069, 1077, 1090, 1100, 1111, 1130, 1131, 1135, 1145, 1146, 1150, 1156, 1160, 1175, 1178, 1190, 1191, 1196, 1211, 1240, 1241, 1262, 1333, 1356, 1389, 
[cycle=0043, i=58, cols=:42 s=100] SDR: 901, 936, 946, 953, 957, 961, 973, 981, 991, 997, 1002, 1014, 1017, 1025, 1034, 1047, 1069, 1100, 1111, 1128, 1131, 1145, 1146, 1150, 1156, 1160, 1175, 1184, 1190, 1196, 1211, 1232, 1238, 1239, 1240, 1241, 1309, 1333, 1344, 1389, 1403, 1431, 
[cycle=0043, i=59, cols=:41 s=100] SDR: 946, 957, 963, 975, 981, 982, 991, 997, 1002, 1008, 1017, 1020, 1022, 1047, 1068, 1069, 1100, 1115, 1128, 1131, 1143, 1145, 1146, 1150, 1156, 1161, 1165, 1180, 1184, 1190, 1196, 1215, 1220, 1226, 1239, 1241, 1333, 1344, 1403, 1431, 1432, 
[cycle=0043, i=60, cols=:41 s=100] SDR: 963, 975, 981, 985, 991, 997, 1000, 1017, 1022, 1047, 1048, 1054, 1069, 1107, 1115, 1131, 1132, 1145, 1150, 1161, 1165, 1167, 1168, 1176, 1180, 1184, 1190, 1215, 1233, 1239, 1273, 1300, 1333, 1364, 1375, 1391, 1403, 1418, 1431, 1434, 1456, 
[cycle=0043, i=61, cols=:41 s=100] SDR: 985, 997, 1000, 1006, 1022, 1039, 1044, 1047, 1048, 1060, 1115, 1122, 1132, 1145, 1150, 1151, 1161, 1167, 1168, 1174, 1179, 1180, 1200, 1204, 1215, 1225, 1238, 1239, 1254, 1273, 1281, 1295, 1318, 1333, 1364, 1373, 1418, 1456, 1466, 1471, 1491, 
[cycle=0043, i=62, cols=:41 s=100] SDR: 997, 1000, 1006, 1022, 1034, 1039, 1047, 1048, 1050, 1060, 1083, 1084, 1118, 1119, 1121, 1131, 1136, 1145, 1154, 1161, 1167, 1174, 1175, 1180, 1181, 1185, 1204, 1225, 1237, 1238, 1259, 1267, 1290, 1318, 1320, 1333, 1418, 1456, 1471, 1483, 1491, 
[cycle=0043, i=63, cols=:41 s=100] SDR: 1034, 1039, 1048, 1050, 1071, 1083, 1084, 1088, 1107, 1121, 1128, 1136, 1161, 1167, 1174, 1181, 1198, 1200, 1204, 1207, 1220, 1237, 1239, 1263, 1267, 1277, 1286, 1290, 1292, 1293, 1333, 1336, 1364, 1373, 1418, 1435, 1456, 1483, 1491, 1499, 1512, 
[cycle=0043, i=64, cols=:41 s=100] SDR: 1039, 1048, 1050, 1071, 1083, 1084, 1088, 1120, 1121, 1128, 1154, 1161, 1167, 1174, 1179, 1181, 1198, 1200, 1204, 1207, 1220, 1237, 1239, 1263, 1264, 1267, 1277, 1286, 1290, 1318, 1322, 1333, 1336, 1357, 1374, 1418, 1456, 1483, 1491, 1517, 1525, 
[cycle=0043, i=65, cols=:42 s=100] SDR: 1039, 1048, 1084, 1088, 1102, 1120, 1154, 1161, 1164, 1167, 1170, 1181, 1197, 1200, 1203, 1204, 1216, 1237, 1239, 1265, 1277, 1286, 1290, 1299, 1301, 1318, 1320, 1322, 1333, 1334, 1336, 1340, 1374, 1420, 1481, 1483, 1491, 1502, 1512, 1517, 1521, 1563, 
[cycle=0043, i=66, cols=:41 s=100] SDR: 1084, 1102, 1119, 1120, 1146, 1154, 1159, 1161, 1164, 1167, 1179, 1181, 1197, 1200, 1216, 1217, 1264, 1265, 1267, 1276, 1277, 1286, 1290, 1299, 1301, 1320, 1322, 1326, 1333, 1334, 1357, 1420, 1491, 1502, 1510, 1512, 1517, 1521, 1557, 1559, 1563, 
[cycle=0043, i=67, cols=:41 s=100] SDR: 1102, 1103, 1119, 1120, 1142, 1146, 1149, 1159, 1164, 1167, 1175, 1178, 1179, 1181, 1197, 1200, 1216, 1217, 1225, 1237, 1263, 1265, 1267, 1274, 1276, 1277, 1299, 1301, 1320, 1357, 1376, 1411, 1420, 1462, 1483, 1502, 1509, 1521, 1559, 1576, 1594, 
[cycle=0043, i=68, cols=:41 s=100] SDR: 1120, 1130, 1146, 1148, 1182, 1186, 1187, 1197, 1200, 1216, 1223, 1237, 1249, 1251, 1260, 1263, 1276, 1277, 1280, 1285, 1290, 1293, 1299, 1301, 1318, 1320, 1326, 1357, 1361, 1365, 1370, 1376, 1409, 1411, 1462, 1482, 1521, 1557, 1559, 1576, 1594, 
[cycle=0043, i=69, cols=:41 s=100] SDR: 1146, 1161, 1175, 1177, 1182, 1186, 1187, 1197, 1200, 1216, 1223, 1237, 1245, 1253, 1263, 1272, 1280, 1285, 1293, 1301, 1304, 1318, 1320, 1346, 1357, 1361, 1365, 1366, 1369, 1370, 1409, 1411, 1415, 1420, 1509, 1536, 1559, 1563, 1576, 1594, 1621, 
[cycle=0043, i=70, cols=:41 s=100] SDR: 1173, 1182, 1186, 1197, 1200, 1218, 1236, 1245, 1249, 1253, 1277, 1280, 1285, 1300, 1301, 1318, 1346, 1357, 1361, 1365, 1368, 1370, 1376, 1379, 1384, 1393, 1409, 1411, 1420, 1423, 1424, 1428, 1432, 1442, 1445, 1463, 1506, 1559, 1572, 1601, 1621, 
[cycle=0043, i=71, cols=:41 s=100] SDR: 1182, 1186, 1197, 1200, 1218, 1236, 1241, 1245, 1249, 1253, 1277, 1285, 1300, 1320, 1346, 1357, 1361, 1365, 1368, 1370, 1376, 1379, 1384, 1392, 1393, 1409, 1411, 1415, 1420, 1423, 1424, 1428, 1432, 1436, 1442, 1445, 1470, 1484, 1559, 1572, 1631, 
[cycle=0043, i=72, cols=:41 s=100] SDR: 1182, 1216, 1218, 1236, 1240, 1241, 1245, 1250, 1253, 1277, 1280, 1285, 1300, 1304, 1310, 1342, 1346, 1357, 1361, 1363, 1365, 1368, 1370, 1375, 1376, 1384, 1392, 1393, 1398, 1402, 1409, 1424, 1432, 1445, 1484, 1511, 1514, 1572, 1588, 1617, 1638, 
[cycle=0043, i=73, cols=:41 s=100] SDR: 1218, 1236, 1240, 1241, 1245, 1249, 1250, 1252, 1253, 1266, 1272, 1277, 1286, 1300, 1304, 1310, 1346, 1357, 1363, 1365, 1368, 1369, 1370, 1384, 1409, 1424, 1428, 1432, 1443, 1445, 1476, 1511, 1534, 1552, 1572, 1588, 1617, 1638, 1684, 1686, 1694, 
[cycle=0043, i=74, cols=:41 s=100] SDR: 1236, 1245, 1249, 1253, 1256, 1257, 1272, 1277, 1279, 1286, 1293, 1300, 1304, 1306, 1310, 1318, 1337, 1357, 1363, 1365, 1369, 1370, 1372, 1384, 1409, 1413, 1428, 1459, 1476, 1480, 1511, 1513, 1532, 1542, 1552, 1601, 1617, 1638, 1684, 1694, 1725, 
[cycle=0043, i=75, cols=:41 s=100] SDR: 1257, 1272, 1277, 1279, 1280, 1283, 1286, 1293, 1304, 1310, 1321, 1357, 1363, 1365, 1370, 1380, 1384, 1401, 1402, 1409, 1413, 1422, 1423, 1428, 1429, 1459, 1470, 1472, 1476, 1478, 1514, 1532, 1534, 1552, 1617, 1638, 1658, 1684, 1693, 1694, 1725, 
[cycle=0043, i=76, cols=:41 s=100] SDR: 1272, 1277, 1279, 1280, 1283, 1286, 1304, 1321, 1336, 1357, 1365, 1373, 1384, 1401, 1402, 1406, 1409, 1413, 1417, 1420, 1423, 1424, 1425, 1459, 1468, 1472, 1476, 1478, 1486, 1513, 1514, 1532, 1534, 1562, 1616, 1634, 1658, 1694, 1725, 1748, 1752, 
[cycle=0043, i=77, cols=:41 s=100] SDR: 1277, 1279, 1280, 1283, 1304, 1313, 1321, 1336, 1341, 1360, 1365, 1380, 1384, 1401, 1402, 1406, 1409, 1413, 1417, 1420, 1423, 1459, 1468, 1472, 1476, 1478, 1486, 1513, 1514, 1532, 1562, 1607, 1616, 1634, 1658, 1694, 1725, 1748, 1752, 1767, 1770, 
[cycle=0043, i=78, cols=:41 s=100] SDR: 1312, 1341, 1347, 1360, 1365, 1367, 1380, 1382, 1384, 1387, 1401, 1402, 1405, 1406, 1409, 1413, 1456, 1459, 1466, 1468, 1470, 1471, 1472, 1476, 1484, 1486, 1490, 1502, 1507, 1514, 1562, 1567, 1607, 1634, 1658, 1683, 1694, 1725, 1735, 1736, 1748, 
[cycle=0043, i=79, cols=:41 s=100] SDR: 1321, 1336, 1341, 1347, 1360, 1365, 1367, 1380, 1384, 1387, 1388, 1401, 1406, 1413, 1428, 1439, 1452, 1466, 1471, 1472, 1476, 1482, 1484, 1485, 1486, 1516, 1571, 1580, 1588, 1607, 1626, 1634, 1646, 1658, 1683, 1725, 1736, 1767, 1771, 1773, 1795, 
[cycle=0043, i=80, cols=:41 s=100] SDR: 1347, 1359, 1360, 1388, 1395, 1396, 1401, 1406, 1411, 1413, 1414, 1420, 1425, 1439, 1466, 1470, 1471, 1476, 1482, 1486, 1504, 1515, 1516, 1538, 1545, 1546, 1562, 1567, 1571, 1580, 1612, 1626, 1634, 1679, 1680, 1683, 1735, 1767, 1771, 1773, 1813, 
[cycle=0043, i=81, cols=:41 s=100] SDR: 1347, 1359, 1360, 1376, 1388, 1395, 1396, 1401, 1406, 1408, 1413, 1414, 1415, 1420, 1425, 1439, 1459, 1464, 1476, 1482, 1484, 1486, 1489, 1504, 1516, 1535, 1567, 1571, 1578, 1579, 1589, 1634, 1663, 1679, 1680, 1683, 1741, 1759, 1773, 1798, 1813, 
[cycle=0043, i=82, cols=:41 s=100] SDR: 1388, 1401, 1406, 1407, 1408, 1413, 1414, 1415, 1420, 1439, 1459, 1464, 1484, 1486, 1489, 1497, 1504, 1516, 1522, 1530, 1535, 1538, 1553, 1571, 1578, 1579, 1589, 1614, 1627, 1634, 1663, 1679, 1680, 1698, 1708, 1759, 1787, 1798, 1813, 1827, 1836, 
[cycle=0043, i=83, cols=:41 s=100] SDR: 1401, 1406, 1413, 1415, 1420, 1438, 1439, 1454, 1475, 1484, 1486, 1489, 1497, 1504, 1515, 1518, 1522, 1526, 1535, 1537, 1538, 1549, 1572, 1578, 1579, 1589, 1597, 1614, 1616, 1618, 1663, 1683, 1705, 1706, 1708, 1787, 1813, 1827, 1838, 1842, 1854, 
[cycle=0043, i=84, cols=:41 s=100] SDR: 1406, 1413, 1415, 1438, 1439, 1454, 1475, 1484, 1489, 1491, 1497, 1517, 1518, 1522, 1526, 1535, 1537, 1538, 1544, 1572, 1578, 1579, 1589, 1592, 1597, 1608, 1614, 1634, 1644, 1648, 1654, 1705, 1706, 1708, 1786, 1813, 1827, 1842, 1875, 1881, 1904, 
[cycle=0043, i=85, cols=:41 s=100] SDR: 1439, 1465, 1475, 1484, 1491, 1498, 1517, 1518, 1530, 1535, 1537, 1538, 1549, 1553, 1555, 1560, 1572, 1578, 1589, 1597, 1608, 1614, 1621, 1622, 1634, 1644, 1648, 1654, 1665, 1705, 1706, 1709, 1756, 1769, 1786, 1813, 1827, 1842, 1857, 1904, 1915, 
[cycle=0043, i=86, cols=:41 s=100] SDR: 1472, 1478, 1484, 1491, 1498, 1517, 1522, 1530, 1535, 1537, 1538, 1545, 1552, 1560, 1561, 1566, 1569, 1572, 1577, 1589, 1597, 1622, 1644, 1648, 1654, 1656, 1665, 1686, 1705, 1709, 1717, 1723, 1726, 1756, 1805, 1829, 1842, 1854, 1857, 1881, 1946, 
[cycle=0043, i=87, cols=:41 s=100] SDR: 1491, 1498, 1517, 1518, 1522, 1530, 1535, 1537, 1560, 1561, 1569, 1577, 1586, 1589, 1592, 1597, 1614, 1644, 1648, 1654, 1660, 1665, 1686, 1690, 1695, 1698, 1705, 1709, 1717, 1723, 1726, 1741, 1744, 1753, 1756, 1805, 1813, 1881, 1933, 1946, 1956, 
[cycle=0043, i=88, cols=:41 s=100] SDR: 1491, 1505, 1517, 1531, 1535, 1552, 1560, 1569, 1586, 1589, 1592, 1597, 1608, 1609, 1611, 1614, 1619, 1630, 1632, 1639, 1644, 1648, 1654, 1656, 1660, 1665, 1675, 1695, 1705, 1717, 1744, 1753, 1756, 1789, 1805, 1815, 1829, 1892, 1933, 1956, 1959, 
[cycle=0043, i=89, cols=:41 s=100] SDR: 1531, 1535, 1539, 1550, 1560, 1562, 1570, 1589, 1614, 1630, 1632, 1636, 1644, 1648, 1654, 1665, 1672, 1674, 1675, 1686, 1692, 1695, 1705, 1709, 1717, 1744, 1753, 1756, 1779, 1784, 1789, 1797, 1803, 1805, 1811, 1815, 1822, 1865, 1919, 1956, 1959, 
[cycle=0043, i=90, cols=:41 s=100] SDR: 5, 1550, 1560, 1562, 1599, 1604, 1614, 1623, 1630, 1636, 1640, 1648, 1651, 1654, 1672, 1674, 1675, 1683, 1695, 1709, 1717, 1720, 1744, 1755, 1756, 1772, 1779, 1784, 1797, 1803, 1805, 1815, 1819, 1831, 1853, 1865, 1919, 1949, 1959, 1962, 2005, 
[cycle=0043, i=91, cols=:41 s=100] SDR: 5, 8, 20, 1550, 1560, 1576, 1577, 1599, 1604, 1611, 1623, 1630, 1634, 1636, 1648, 1651, 1654, 1672, 1674, 1686, 1695, 1698, 1709, 1717, 1728, 1744, 1746, 1755, 1756, 1772, 1784, 1805, 1819, 1831, 1853, 1865, 1919, 1949, 1956, 1959, 2045, 
[cycle=0043, i=92, cols=:41 s=100] SDR: 4, 5, 8, 20, 1577, 1599, 1611, 1619, 1636, 1648, 1656, 1661, 1673, 1674, 1686, 1695, 1698, 1715, 1744, 1755, 1771, 1772, 1779, 1784, 1797, 1805, 1819, 1820, 1831, 1850, 1853, 1863, 1864, 1865, 1872, 1881, 1884, 1919, 1949, 1959, 2040, 
[cycle=0043, i=93, cols=:41 s=100] SDR: 0, 5, 8, 10, 20, 38, 1602, 1611, 1657, 1661, 1672, 1673, 1680, 1693, 1695, 1698, 1711, 1713, 1715, 1728, 1744, 1755, 1784, 1793, 1797, 1802, 1818, 1820, 1824, 1831, 1837, 1853, 1865, 1872, 1881, 1919, 1937, 1945, 1949, 1956, 2045, 
[cycle=0043, i=94, cols=:41 s=100] SDR: 0, 8, 20, 36, 38, 44, 47, 53, 76, 1626, 1634, 1642, 1651, 1657, 1661, 1668, 1670, 1673, 1674, 1680, 1693, 1695, 1701, 1728, 1743, 1744, 1772, 1784, 1791, 1794, 1802, 1809, 1824, 1831, 1841, 1853, 1864, 1865, 1919, 1948, 2009, 
[cycle=0043, i=95, cols=:41 s=100] SDR: 0, 8, 20, 36, 38, 47, 53, 56, 59, 75, 76, 81, 1626, 1641, 1642, 1657, 1661, 1670, 1673, 1674, 1680, 1695, 1701, 1728, 1743, 1744, 1758, 1759, 1772, 1780, 1784, 1795, 1802, 1824, 1853, 1874, 1919, 1937, 1948, 1997, 2025, 
[cycle=0043, i=96, cols=:41 s=100] SDR: 8, 14, 18, 36, 38, 47, 53, 56, 59, 75, 76, 81, 88, 92, 95, 105, 1661, 1668, 1670, 1674, 1695, 1701, 1707, 1722, 1732, 1758, 1795, 1802, 1824, 1853, 1874, 1879, 1889, 1919, 1925, 1937, 1948, 1997, 2009, 2025, 2045, 
[cycle=0043, i=97, cols=:41 s=100] SDR: 7, 8, 14, 17, 18, 36, 38, 47, 53, 56, 59, 75, 76, 81, 85, 88, 92, 95, 105, 1670, 1674, 1695, 1701, 1707, 1722, 1744, 1758, 1770, 1771, 1780, 1802, 1824, 1853, 1869, 1874, 1889, 1919, 1948, 1997, 2009, 2025, 
[cycle=0043, i=98, cols=:41 s=100] SDR: 7, 36, 47, 53, 56, 59, 75, 76, 78, 81, 85, 88, 89, 92, 95, 105, 107, 121, 1697, 1701, 1722, 1723, 1734, 1736, 1758, 1764, 1770, 1780, 1784, 1785, 1793, 1802, 1853, 1869, 1948, 1951, 1952, 1984, 1997, 2009, 2025, 
[cycle=0043, i=99, cols=:41 s=100] SDR: 14, 36, 47, 53, 56, 57, 70, 75, 76, 78, 81, 85, 88, 89, 92, 95, 105, 108, 121, 1701, 1722, 1731, 1758, 1762, 1764, 1770, 1772, 1780, 1782, 1783, 1790, 1793, 1795, 1802, 1853, 1862, 1869, 1937, 1952, 1984, 1997, 

This issue happens in the LocalInhibition algorithm in the following method:

public virtual void BoostColsWithLowOverlap(Connections c)
{
    // Get columns whose overlap duty cycle is below the required minimum.
    var weakColumns = c.Memory.Get1DIndexes().Where(i => c.HtmConfig.OverlapDutyCycles[i] < c.HtmConfig.MinOverlapDutyCycles[i]).ToArray();

    for (int i = 0; i < weakColumns.Length; i++)
    {
        Column col = c.GetColumn(weakColumns[i]);

        // Raise all permanences of the column's proximal (receptive field) pool by
        // SynPermBelowStimulusInc, so that the weak column gets a better chance to become active.
        Pool pool = col.ProximalDendrite.RFPool;
        double[] perm = pool.GetSparsePermanences();
        ArrayUtils.RaiseValuesBy(c.HtmConfig.SynPermBelowStimulusInc, perm);
        int[] indexes = pool.GetSparsePotential();

        col.UpdatePermanencesForColumnSparse(c.HtmConfig, perm, indexes, true);
        //UpdatePermanencesForColumnSparse(c, perm, col, indexes, true);
    }
}

ML22/23-7 Implement Unit Tests for Adapt Segments

After researching the existing code for the project, our team is facing an issue finding the NuGet packages. We have forked and cloned neocortexapi, but the nuget folder is empty, which is causing us problems. Also, the NeocortexEntities.dll file is missing from the NeocortexEntities folder.

Implementation Examples for SDR Representation

Hello Professor,
We are currently working on Implementation Examples for SDR Representation. As mentioned in the description, we started updating the existing MD file and are working on it. So far we have represented the data as a bitmap and are trying to perform the further steps. Could you please elaborate on what exactly you expect from us for the next steps?

Here is the link of our repo: https://github.com/Pranil-Ghadi/neocortexapi_Team_Bug/tree/master/source/Samples/WorkingwithSDR/WorkingWithSDR

Project Review - Investigation of the HTM Layer stability

Hello Professor,

The deadline for my individual project was on 17.09.2021. I have already made my submission via e-mail, but for your reference, below you will find all the work related to this project.

  1. GitHub link to my Source Code >> Here
  2. GitHub link to my Statistical analysis and Results >> Here
  3. GitHub link to a short documentation (MD file) on the project >> Here
  4. GitHub link to my final report >> Here

Kindly review the project and provide me with your valuable feedback.

Regards,
Sahana

Review of work in progress

#169

Hi Professor

I have created pull request #169 for your review. Please have a look and let me know if I am headed in the right direction.

VideoLearning Project Review

Dear Mr. Dobric,
Please set up a call with me this week or next.
I am working on the Video Learning project in the branch SequenceLearning_ToanTruong.

Overview:
The project uses Temporal Memory to learn the frames of each video sequentially and then tries to predict the video/frame sequence when a frame is entered by the user. The output can be viewed as a video.

During training, each frame has its own unique framekey in the format label_videoName_index, e.g. circle_vd1_1.

There are 2 tests: Run1 and Run2.
What they have in common:
The training set consists of labeled folders which contain videos.
Both start by reading the videos and learning with the HomeostaticPlasticityController until the stable state is reached.
A test picture (drawn in Paint) is used after the learning.

Differences:
Run1:
Learning with SP+TM.
The HTMClassifier learns a framekey corresponding to the active cells computed from the InputBitArray of a frame.

The test ends after the accuracy for the whole training set reads >90% in 40 consecutive readings, or after the maximum number of cycles is reached.
The best accuracy is 90%, varying from 54% to 90%; sometimes it gets stuck at one accuracy and has to be run again or wait for the maximum cycle.

After learning, the user can drag a picture into the console to start testing.

Each frame is predicted with its own framekey; the predicted framekey is then used as input to predict the next framekey until there are no more predicted cells in the computed output (a sketch of this chained prediction follows after this Run1 description).

Current result:
The output videos are shorter than the originals because at some frames there are no predicted cells.
Overlapping also happens: a circle frame whose circle overlaps a triangle in another frame of the training set is predicted as a triangle.
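
The chained prediction of Run1 can be sketched as follows. The HTM and classifier calls are abstracted behind a delegate because their exact signatures are not shown in this issue; only the framekey format and the "predict until there are no predicted cells" logic come from the description above.

using System;
using System.Collections.Generic;

// Sketch of the Run1 prediction chain; not existing project code.
public static class Run1Sketch
{
    // Framekey format from the description: label_videoName_index, e.g. "circle_vd1_1".
    public static string MakeFrameKey(string label, string videoName, int index)
        => $"{label}_{videoName}_{index}";

    // predictNext returns the next framekey, or null when there are no more predicted cells.
    public static List<string> PredictSequence(string startFrameKey, Func<string, string> predictNext, int maxFrames = 100)
    {
        var result = new List<string> { startFrameKey };
        string current = startFrameKey;
        while (result.Count < maxFrames)
        {
            string next = predictNext(current);
            if (next == null)   // no predicted cells -> the output video ends early
                break;
            result.Add(next);
            current = next;
        }
        return result;
    }
}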

Run2:
Learning with SP+TM.
The logic is adapted from SequenceLearning_DUY.
The HTMClassifier learns a key created from the previous frames, e.g. the key
circle_vd1_27-circle_vd1_28-circle_vd1_29-circle_vd1_0-circle_vd1_1
corresponding to the active cells computed from the InputBitArray of frame circle_vd1_1.

The test ends after reaching an accuracy of 90% or more 100 times.
The best accuracy is 93.333%, varying from 77% to 93%; sometimes it gets stuck at one accuracy and has to be run again or wait for the maximum cycle.

After learning, the user can drag a picture into the console to start testing.

Current result:
The output video has the same length as the training videos.
The output video loops and is bound to the training videos; it can only output the frame sequence of one video in the training set.

RESULT FILE (the videos cannot be displayed in the issue):
link google drive

A library for converting a video into a sequence of InputBitArrays and vice versa using OpenCV was developed for the test.
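
A minimal sketch of such a conversion is shown below. The issue only states that OpenCV is used; the OpenCvSharp binding, the thresholding, and the 10x10 target size are assumptions for illustration.

using System.Collections.Generic;
using OpenCvSharp;   // assumption: OpenCvSharp is the OpenCV binding used

// Sketch of the video -> InputBitArray conversion: read each frame, convert it to
// grayscale, scale it down to 10x10 and threshold it into a bool[] of 100 bits.
public static class VideoToBitArrays
{
    public static List<bool[]> Convert(string videoPath, int width = 10, int height = 10, byte threshold = 128)
    {
        var frames = new List<bool[]>();
        using var capture = new VideoCapture(videoPath);
        using var frame = new Mat();

        while (capture.Read(frame) && !frame.Empty())
        {
            using var gray = new Mat();
            Cv2.CvtColor(frame, gray, ColorConversionCodes.BGR2GRAY);

            using var small = new Mat();
            Cv2.Resize(gray, small, new Size(width, height));

            var bits = new bool[width * height];
            for (int r = 0; r < height; r++)
                for (int c = 0; c < width; c++)
                    bits[r * width + c] = small.At<byte>(r, c) > threshold;

            frames.Add(bits);
        }
        return frames;
    }
}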

Help

Assert.AreEqual(4, mem.HtmConfig.PotentialRadius);

Hello Professor, Could you please explain this command?

confirmSPConstruction2
Duration: 39.8 sec

Message:
Assert.AreEqual failed. Expected:<6>. Actual:<5>.

Stack Trace:
SpatialPoolerTest.confirmSPConstruction2() line 202

System.NullReferenceException

Hello Professor

I was trying to create an instance of the Column class. The property ConnectedInputBits is shown as "Column.ConnectedInputBits throws an exception of the type System.NullReferenceException".

Can you please suggest to me a solution to overcome the issue?

Regards
Mounika

building nuget package of Image Encoder

1. Package Information

I have added the information required for publishing in lines 7 - 27:

<IsPackable>true</IsPackable>

The information appears in the NuGet package manager as follows (screenshot omitted).

2. Testing

A local publish test was also done via a ConsoleApp project. The program runs fine with the library (screenshots omitted).

3. Warning

ImageEncoderLib has dependencies on both NeoCortexApi (for the EncoderBase inheritance) and ImageBinarizer (to binarize images).
This leads to a very large library. For example, when testing with ConsoleApp1, the program after build (the additional size from images is omitted) reaches a size of 61.4 MB (screenshot omitted).

When this ImageEncoder nupkg is imported, it also imports NeoCortexApi and ImageBinarizer. Topologically speaking, the nupkg is like another version of NeoCortexApi that has an extension for working with images (screenshot omitted).

@ddobric, @sabinbajracharya, @Noath2302
I have described the current state of the package; we can discuss it here.

Unable to create a new branch and push it using GitHub

Group: CodeBreakers
Thoan, I am trying to create a branch and push it to the repository to start the work. I want to first run the old code and then start changing it where needed. But while pushing, I get a 403 error. I have searched for it, and it can occur because of a permission problem or a repository SSH issue, but I am pushing using the HTTPS URL. Can you please check whether any permission is required for me?

The error message is showing this:
Pushing MigrationOfVideoLearningProject_CodeBreakers
Error encountered while pushing branch to the remote repository: Git failed with a fatal error.
Git failed with a fatal error.
unable to access 'https://github.com/ddobric/neocortexapi.git/': The requested URL returned error: 403

Failed to push the branch to the remote repository. See the Output window for more details.

Mashnunul Huq

Project Review -> Investigation of Spatial Similarity in Spatial Pooler

Hello Professor @ddobric ,

I hope you are doing well. My project submission is due on 19.10.2021. I have already submitted my project via email; I invite you to kindly go through my work and give me your valuable feedback.

- GitHub link to Code: https://github.com/JayashreeRegoti/neocortexapi/blob/JayashreeRegoti/NeoCortexApi/NeoCortexApi.Experiments/SpatialSimilarityExperiment.cs

- GitHub link to input images:
https://github.com/JayashreeRegoti/neocortexapi/tree/JayashreeRegoti/NeoCortexApi/NeoCortexApi.Experiments/TestData/SpatialSimilarityExperiment

- GitHub link to the steps to perform the experiment (MD file): https://github.com/JayashreeRegoti/neocortexapi/blob/JayashreeRegoti/NeoCortexApi/NeoCortexApi.Experiments/README.md

- GitHub link to find input vectors, comparison table for spatial similarity:
https://github.com/JayashreeRegoti/neocortexapi/tree/JayashreeRegoti/NeoCortexApi/NeoCortexApi.Experiments/SpatialSimilarityIndividualProjectJayashreeRegoti/Input%20And%20Output%20Values

- GitHub link for the final comparison table and graphical representation:
https://github.com/JayashreeRegoti/neocortexapi/blob/JayashreeRegoti/NeoCortexApi/NeoCortexApi.Experiments/SpatialSimilarityIndividualProjectJayashreeRegoti/Input%20And%20Output%20Values/potentialIndipendent.xlsx

- GitHub link to find the experiment documentation:
https://github.com/JayashreeRegoti/neocortexapi/blob/JayashreeRegoti/NeoCortexApi/NeoCortexApi.Experiments/SpatialSimilarityIndividualProjectJayashreeRegoti/documentation/Jayashree_Regoti_1324798_IndividualProject.pdf

Kindly review the final submission of my project and let me know if any information is needed.

Thanks and regards,

Jayashree Regoti(1324798) @ddobric

Problem with the .NET Version

Screenshot (2024-01-17, 17:59):
The current .NET SDK does not support targeting .NET 8.0. Either target .NET 7.0 or lower, or use a version of the .NET SDK that supports .NET 8.0. (NETSDK1045) (NeocortexApiLLMSample)
