thild/NET (forked from okozelsk/NET)

Reservoir computing library for .NET. Enables ESN, LSM and hybrid RNNs using analog and spiking neurons working together.

Reservoir Computing for .NET (RCNet)

Reservoir Computing conceptual view
The aim of this project is to make reservoir computing methods easy to use and available for the .NET platform without any other dependencies. The two main reservoir computing methods are the Echo State Network (ESN) and the Liquid State Machine (LSM). RCNet supports both. Moreover, since ESN and LSM are based on similar general principles, RCNet allows the design of complex "hybrid" recurrent reservoirs consisting of spiking and analog neurons synaptically linked together. Mutual cooperation of hidden neurons having stateless analog and stateful spiking activation functions is enabled by a specific implementation of the hidden neuron. The hidden neuron is not stateless and can fire spikes even when a stateless analog activation is used. "Analog spikes" are based on a defined firing event depending on the current and previous values of the stateless activation. The hidden neuron also provides a standardized set of predictors regardless of the activation function used. According to preliminary results, it is no longer true that an ESN cannot generalize and separate the input signal well enough to perform excellent classification. On the contrary, it now appears that the use of pure analog activations (like TanH) and the simple classical ESN reservoir design could be an "unexpected" competitor to spiking LSM reservoirs.
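The "analog spike" idea can be sketched as follows. This is a minimal, hypothetical firing rule, not RCNet's exact implementation: the function name, the threshold parameter and the choice of tanh are illustrative assumptions. A stateless analog activation is computed each step, and a spike is emitted when the activation rises above its previous value by more than a threshold.

```python
import math

def analog_spike_step(x, prev_activation, threshold=0.0):
    """One step of a hypothetical hidden neuron with a stateless
    analog activation (tanh here) that still emits binary spikes.
    A spike fires when the activation rises above its previous
    value by more than `threshold` (illustrative rule only)."""
    activation = math.tanh(x)
    spike = 1 if (activation - prev_activation) > threshold else 0
    return activation, spike

a, s = analog_spike_step(1.0, 0.0)   # rising activation -> spike fires
```

Because the neuron keeps the previous activation as state, even a purely analog activation can drive spiking synapses, which is what makes mixed analog/spiking reservoirs possible.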
The main component of RCNet is called the "State Machine" and has to be instantiated through its settings class. The "State Machine" is serializable, so it is easy to instantiate and train it and then use it as a real-time loadable component in a solution.
The source code is written in C# 7.3 (.NET Framework 4.7.2). More detailed documentation will be posted here as soon as the current stage of wild changes is over.
I welcome questions, ideas and suggestions for improvements, usage experiences, bug alerts, constructive comments, etc. Please use my email address oldrich.kozelsky@email.cz to contact me.

State Machine demo application

The main functionality and possibilities of the State Machine are demonstrated in a simple demo application. The application has no startup parameters; when started, it shows the menu.
Note that, where necessary, the examples use the Examples sub-folder relative to the location of the executable DemoConsoleApp.exe.

Performance demonstration (1. menu choice)

The application performs a sequence of defined tasks. Tasks are defined in the SMDemoSettings.xml file, where each task is defined in the xml element "case", so you can easily insert a new task or tune an existing one by simply modifying the xml content. SMDemoSettings.xml has to be located in the SM sub-folder relative to the location of the executable DemoConsoleApp.exe.
SMDemoSettings.xml currently also includes several classification problems from the UEA & UCR Time Series Classification Repository (Anthony Bagnall, Jason Lines, William Vickers and Eamonn Keogh, www.timeseriesclassification.com). The State Machine usually achieves results very similar to the best classification algorithms referenced on that site.

Classification Results Comparison

Dataset          State Machine Accuracy  Best Ref. Accuracy  Best Ref. Algorithm
CricketX         80.77%                  81.4%               COTE
Worms            83.12%                  73.49%              BOSS
BeetleFly        100%                    94.85%              BOSS
BirdChicken      100%                    98.4%               BOSS
ProximalPhalanx  88.29%                  88.09%              ST
Yoga             91.13%                  90.99%              BOSS

Code examples (2. menu choice)

This very simple machine learning example shows how to train the Feed Forward Network component to solve Boolean algebra. The Feed Forward network is part of the State Machine's readout layer, but this example shows that it can also be used as a standalone component.

Code examples (3. menu choice)

The example shows how to manually set up a State Machine configuration from scratch, then how to train the State Machine and verify its performance.

Code examples (4, ... menu choices)

Several examples show how to use the State Machine Designer component to set up a simple State Machine configuration, then how to train the State Machine and verify its performance.

Data format for the demo application

Input data is by default located in the Data sub-folder relative to the location of the executable DemoConsoleApp.exe. Data is expected in csv format, and the data delimiter can be a tab, semicolon or comma character.

  • Continuous feeding regime requires a standard csv format, where the first line contains the names of the data fields and each following line contains the data. Here is an example
  • Patterned feeding regime requires a specific logical csv format without column names (header). Each data line contains values of steady (optional) and repetitive pattern features followed by the expected output values at the end. Values of repetitive pattern features can be organized in two ways: grouped [v1(t1),v2(t1),v1(t2),v2(t2),v1(t3),v2(t3)] or sequential [v1(t1),v1(t2),v1(t3),v2(t1),v2(t2),v2(t3)]. Here is an example
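The two orderings of repetitive pattern features can be made concrete with a small reshaping sketch. The function name and signature are illustrative, not part of RCNet; both orderings recover the same timestep-by-feature matrix.

```python
def pattern_to_matrix(values, num_features, order="grouped"):
    """Reshape a flat list of repetitive-pattern values into a
    (timestep x feature) matrix. 'grouped' means values are
    interleaved per timestep [v1(t1), v2(t1), v1(t2), ...];
    'sequential' means each feature occupies a contiguous run
    [v1(t1), v1(t2), ..., v2(t1), v2(t2), ...]."""
    steps = len(values) // num_features
    if order == "grouped":
        return [values[t * num_features:(t + 1) * num_features]
                for t in range(steps)]
    # sequential: feature f starts at offset f * steps
    return [[values[f * steps + t] for f in range(num_features)]
            for t in range(steps)]

flat_grouped = [1, 10, 2, 20, 3, 30]     # v1,v2 interleaved per timestep
flat_sequential = [1, 2, 3, 10, 20, 30]  # v1 run followed by v2 run
```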

Components overview

Reservoir Computing conceptual view
(listed in logical order from basic to composite and complex)

Math

Component Description
BasicStat Provides basic statistics of given data (averages, sum of squares, standard deviation, etc.)
WeightedAvg Computes weighted average of given value/weight data pairs
MovingDataWindow Implements moving data window and offers computation of weighted average of recent part of given data
ODENumSolver Implements ordinary differential equations (ODE) numerical solver supporting Euler and RK4 methods
Vector Implements vector of double values supporting basic mathematical operations
Matrix Implements matrix of double values supporting basic mathematical operations. Contains a built-in Power Iteration method for quick estimation of the largest eigenvalue
EVD Full eigenvalue and eigenvector decomposition of a square matrix
SVD Singular value decomposition of a matrix
QRD QR decomposition of a matrix
LUD LU decomposition of a square matrix
ParamSeeker Implements an error driven iterative search for the best value of a given parameter
HurstExpEstim Implements Hurst exponent estimator using Rescaled range analysis. It can be used to evaluate the level of data randomness
"RandomValue" Supports Uniform, Gaussian, Exponential and Gamma distributions. Here is extension code
Others Set of small additional helper components like PhysUnit, Interval, Bitwise, Combinatorics, Discrete,...
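The Power Iteration method mentioned for the Matrix component is simple enough to sketch. This is the generic algorithm, not RCNet's Matrix API: repeated multiplication and normalization converge to the dominant eigenvalue magnitude, which is exactly what is needed to rescale reservoir weights to a target spectral radius.

```python
def power_iteration_dominant_eig(matrix, iterations=200):
    """Estimate the magnitude of the largest eigenvalue of a square
    matrix (given as a list of row lists) by power iteration."""
    n = len(matrix)
    v = [1.0] * n
    eig = 0.0
    for _ in range(iterations):
        # multiply: w = matrix * v
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        # normalize by the largest component; its size approximates |lambda_max|
        norm = max(abs(x) for x in w)
        eig = norm
        v = [x / norm for x in w]
    return eig

est = power_iteration_dominant_eig([[2.0, 0.0], [0.0, 1.0]])  # dominant eigenvalue is 2
```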

XML handling

Component Description
DocValidator Helper class for xml document loading and validation

Data generators

Component Description
PulseGenerator Generates constant pulses having a specified average period. Pulse leaks follow a specified random distribution or can be constant
MackeyGlassGenerator Generates Mackey-Glass chaotic signal
RandomGenerator Generates random signal following specified distribution
SinusoidalGenerator Generates sinusoidal signal
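The Mackey-Glass generator produces the classic chaotic benchmark series. The sketch below uses the standard textbook parameterization with unit-step Euler integration; the parameter names are the conventional ones and are not necessarily those RCNet exposes.

```python
def mackey_glass(length, beta=0.2, gamma=0.1, n=10, tau=17, x0=1.2):
    """Generate a Mackey-Glass chaotic series from
    dx/dt = beta * x(t - tau) / (1 + x(t - tau)^n) - gamma * x(t),
    integrated with a simple unit-step Euler scheme."""
    history = [x0] * (tau + 1)  # constant history for t <= 0
    out = []
    for _ in range(length):
        x, x_tau = history[-1], history[-1 - tau]
        x_next = x + beta * x_tau / (1.0 + x_tau ** n) - gamma * x
        history.append(x_next)
        out.append(x_next)
    return out

series = mackey_glass(500)
```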

Data Filtering

Component Description
BinFeatureFilter Binary (0/1) feature filter
EnumFeatureFilter Enumeration (1..N) feature filter
RealFeatureFilter Real number feature filter supporting standardization and range reserve for handling of unseen data in the future

Chainable Data Transformations

Component Description
CDivTransformer Provides "constant divided by an input field value" transformation
DiffTransformer Transforms input field value as a difference between current value and a past value
DivTransformer Divides the value of the first input field by the value of the second input field
ExpTransformer Specified base powered by an input field value
LinearTransformer Two input fields linear transformation (aX + bY)
LogTransformer Transforms input field value to its logarithm of specified base
MulTransformer Multiplies the value of the first input field by the value of the second input field
MWStatTransformer Keeps stats of the input field's recent values and provides statistical features as transformed values (Sum, NegSum, PosSum, SumOfSquares, Min, Max, Mid, Span, ArithAvg, MeanSquare, RootMeanSquare, Variance, StdDev, SpanDev)
PowerTransformer Transforms input field value to value^exponent
YeoJohnsonTransformer Applies Yeo-Johnson transformation to input field value. See the wiki pages.
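The Yeo-Johnson transformation has a short closed form worth spelling out. This is the textbook single-value definition; RCNet's parameterization may differ in details.

```python
import math

def yeo_johnson(x, lam):
    """Yeo-Johnson power transformation of a single value:
    x >= 0, lam != 0:  ((x + 1)^lam - 1) / lam
    x >= 0, lam == 0:  ln(x + 1)
    x <  0, lam != 2:  -((-x + 1)^(2 - lam) - 1) / (2 - lam)
    x <  0, lam == 2:  -ln(-x + 1)"""
    if x >= 0:
        if abs(lam) < 1e-12:
            return math.log1p(x)
        return ((x + 1.0) ** lam - 1.0) / lam
    if abs(lam - 2.0) < 1e-12:
        return -math.log1p(-x)
    return -(((-x + 1.0) ** (2.0 - lam)) - 1.0) / (2.0 - lam)
```

Unlike the Box-Cox transformation, Yeo-Johnson is defined for negative inputs as well, which is why it suits raw input fields. Note that lam = 1 is the identity on both branches.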

Data holding

Component Description
SimpleQueue Implements a quick and simple FIFO queue (template). Supports access to enqueued elements so it can also be used as a "sliding window"
DelimitedStringValues Helper encoder and decoder of data line in csv format
CsvDataHolder Provides simple loading and saving of csv data
VectorBundle Bundle of input data vectors and corresponding desired output vectors (1:1). Supports upload from csv file
InputPattern Input pattern supporting signal detection, unification and resampling features
ResultBundle Bundle of input, computed and desired output vectors (1:1:1)
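The "FIFO queue as sliding window" idea from SimpleQueue can be illustrated with a bounded queue that exposes its elements. The function below is an illustrative sketch (a moving average over the last N values), not RCNet code.

```python
from collections import deque

def moving_stats(values, window):
    """Feed values through a bounded FIFO queue (the sliding window)
    and report the average of the window contents after each step."""
    q = deque(maxlen=window)  # oldest element drops out automatically
    averages = []
    for v in values:
        q.append(v)
        averages.append(sum(q) / len(q))
    return averages

avgs = moving_stats([1, 2, 3, 4, 5], window=3)  # -> [1.0, 1.5, 2.0, 3.0, 4.0]
```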

Analog activation functions (stateless)

See the wiki pages.

Component Description
BentIdentity Bent identity activation function
SQNL Square nonlinearity activation function
Elliot Elliot activation function (aka Softsign)
Gaussian Gaussian activation function
Identity Identity activation function (aka Linear)
ISRU ISRU (Inverse Square Root Unit) activation function
LeakyReLU Leaky ReLU (Leaky Rectified Linear Unit) activation function
Sigmoid Sigmoid activation function
Sinc Sinc activation function
Sinusoid Sinusoid activation function
SoftExponential Soft exponential activation function
SoftPlus Soft Plus activation function
TanH TanH activation function
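A few of the less common analog activations above have simple closed forms. These are the standard definitions from the literature (TanH and Sigmoid are well known and omitted); parameter defaults are illustrative.

```python
import math

def elliot(x, slope=1.0):
    """Elliot / Softsign: x / (1 + |x|), a cheap sigmoid-like curve."""
    xs = x * slope
    return xs / (1.0 + abs(xs))

def bent_identity(x):
    """Bent identity: (sqrt(x^2 + 1) - 1) / 2 + x."""
    return (math.sqrt(x * x + 1.0) - 1.0) / 2.0 + x

def isru(x, alpha=1.0):
    """ISRU (Inverse Square Root Unit): x / sqrt(1 + alpha * x^2),
    bounded in (-1/sqrt(alpha), 1/sqrt(alpha))."""
    return x / math.sqrt(1.0 + alpha * x * x)
```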

Spiking activation functions (stateful)

See the wiki pages.

Component Description
SimpleIF Simple Integrate and Fire activation function
LeakyIF Leaky Integrate and Fire activation function
ExpIF Exponential Integrate and Fire activation function
AdExpIF Adaptive Exponential Integrate and Fire activation function
IzhikevichIF Izhikevich Integrate and Fire activation function (model "one fits all")
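The stateful character of these activations is easiest to see in the Leaky Integrate-and-Fire model. The sketch below shows the generic LIF dynamics with illustrative parameter values; it is not RCNet's exact implementation.

```python
def leaky_if(input_current, steps, tau=10.0, v_rest=0.0,
             v_threshold=1.0, v_reset=0.0, dt=1.0):
    """Leaky Integrate-and-Fire: the membrane potential decays toward
    rest, integrates the input current, and emits a spike (then resets)
    whenever it crosses the firing threshold."""
    v = v_rest
    spikes = []
    for _ in range(steps):
        dv = (-(v - v_rest) + input_current) / tau
        v += dv * dt
        if v >= v_threshold:
            spikes.append(1)
            v = v_reset   # the state carried between steps is what
        else:             # makes the activation "stateful"
            spikes.append(0)
    return spikes

spike_train = leaky_if(input_current=1.5, steps=100)  # regular spiking
```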

Non-recurrent networks and trainers

Component Description
FeedForwardNetwork Implements the feed forward network supporting multiple hidden layers
RPropTrainer Resilient propagation (iRPROP+) trainer of the feed forward network
QRDRegrTrainer Implements the linear regression (QR decomposition) trainer of the feed forward network. This is a special-case trainer for an FF network having no hidden layers and the Identity output activation function
RidgeRegrTrainer Implements the ridge linear regression trainer of the feed forward network. This is a special-case trainer for an FF network having no hidden layers and the Identity output activation function
ElasticRegrTrainer Implements the elastic net trainer of the feed forward network. This is a special-case trainer for an FF network having no hidden layers and the Identity output activation function
ParallelPerceptron Implements the parallel perceptron network
PDeltaRuleTrainer P-Delta rule trainer of the parallel perceptron network
TrainedNetwork Encapsulates a trained non-recurrent (Feed forward or Parallel perceptron) network and related error statistics.
TrainedNetworkBuilder Builds a single trained (Feed forward or Parallel perceptron) network. Performs training epochs and lets the user evaluate the network.
TrainedNetworkCluster Encapsulates a set of trained non-recurrent networks (a cluster of TrainedNetwork instances) and related error statistics. Offers weighted cluster prediction and also exposes all inner members' sub-predictions.
TrainedNetworkClusterBuilder Builds a cluster of trained networks based on the x-fold cross-validation approach. Each fold can have an associated number of various networks.
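The ridge-regression special case (linear readout, no hidden layers, Identity output) has a closed-form solution. The one-feature sketch below shows the core idea behind such a trainer; the function is illustrative and not RCNet's API.

```python
def ridge_fit_1d(xs, ys, l2=0.1):
    """Closed-form ridge regression for a single input feature with no
    intercept: w = sum(x*y) / (sum(x^2) + lambda). The L2 penalty lambda
    shrinks the weight toward zero, trading bias for stability."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + l2)

w = ridge_fit_1d([1.0, 2.0, 3.0], [2.0, 4.0, 6.0], l2=0.0)  # exact fit: w = 2
```

The multi-feature version solves (X^T X + lambda I) w = X^T y, which is why such trainers only apply when there are no hidden layers and the output activation is Identity.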

State Machine components

Component Description
Synapse Computes dynamically weighted signal from source to target neuron. It supports short-term plasticity and signal delay.
InputEncoder Encodes external input for processing in the reservoirs. Supports a set of various real-time chainable input transformations as additional computed input fields. Provides analog and spiking input neurons.
AnalogInputNeuron Input neuron providing analog signal.
SpikingInputNeuron Input neuron providing spiking signal.
HiddenNeuron Supports both analog and spiking activation functions and can produce an analog signal and/or spikes (the neuron is able to fire spikes even when a stateless analog activation is used). Supports the Retainment property of analog activation (leaky integrator). Supports a set of different predictors.
ReservoirInstance Provides a recurrent network supporting analog and spiking neurons working directly together. Supports SpectralRadius (for weights of analog neurons), homogeneous excitability of spiking neurons, multiple 3D pools of neurons, and pool-to-pool connections. It can work as an Echo State Network reservoir, a Liquid State Machine reservoir or a mixed reservoir
NeuralPreprocessor Provides data preprocessing to predictors. Supports multiple internal reservoirs. Supports virtual input data associated with predefined signal generators and transformers. Supports two input feeding regimes: Continuous and Patterned
ReadoutUnit Readout unit does the Forecast or Classification and encapsulates TrainedNetworkCluster.
ReadoutLayer Implements independent readout layer consisting of trained readout units.
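The analog side of ReservoirInstance follows the classic leaky-integrator ESN update, which is compact enough to sketch. This is the conceptual recurrence only; RCNet adds pools, synapse dynamics, delays and spiking neurons on top of it, and all names below are illustrative.

```python
import math

def reservoir_step(state, inp, w_in, w_res, leak=0.5):
    """One leaky-integrator ESN reservoir update:
    s'_i = (1 - leak) * s_i + leak * tanh(W_in[i] . u + W_res[i] . s).
    W_res is typically scaled so its spectral radius is below 1,
    which gives the reservoir its fading 'echo state' memory."""
    n = len(state)
    new_state = []
    for i in range(n):
        pre = sum(w_in[i][k] * inp[k] for k in range(len(inp)))
        pre += sum(w_res[i][j] * state[j] for j in range(n))
        new_state.append((1.0 - leak) * state[i] + leak * math.tanh(pre))
    return new_state

state = reservoir_step([0.0, 0.0], [1.0],
                       w_in=[[0.5], [0.3]],
                       w_res=[[0.0, 0.1], [0.1, 0.0]])
```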

State Machine

The main component, StateMachine, encapsulates the independent NeuralPreprocessor and ReadoutLayer components into a single component and adds support for routing specific predictors and input fields to specific readout units. It also allows bypassing the NeuralPreprocessor and using input data directly as predictors for the readout layer.

Setup

Each component that makes up StateMachine (including StateMachine itself) has its own related settings class providing the configuration required by the component's constructor.
Each settings class can be instantiated manually from scratch or from an xml element encapsulating all parameters. RCNetTypes.xsd defines all xml elements used in the settings classes' constructors.
Each settings class also implements the GetXml method, so a configuration can be built from scratch in code and the initialization xml element can be exported by calling GetXml (and stored for later use). Using the xml constructors is generally preferable because the initialization xml can be edited without the need to modify the source code of a manual setup.
To make things easier, RCNet also implements the helper component StateMachineDesigner for easier setup of simple ESN and LSM StateMachine configurations from code (see the examples in the demo application).
