Bright Wire is a machine learning library for .NET with GPU support (via CUDA).
Bright Wire runs "out of the box" for CPU-based computation on .NET 4.6 and above. For GPU-based computation, you will need to install the NVIDIA CUDA Toolkit 7.5 (and have a Kepler or better NVIDIA GPU).
To enable higher-performance CPU-based computation, Bright Wire also supports the Intel Math Kernel Library (MKL) via the Numerics.Net Wrapper.
- Getting Started
- Classification Overview
- Building a Language Model
- Recognising Handwritten Digits (MNIST)
- Sentiment Analysis
- Text Clustering
- Recurrent Neural Networks
To install the standard version (no CUDA support, any CPU) use:

```
Install-Package BrightWire.Net4
```
To add CUDA support (x64 only) use:

```
Install-Package BrightWire.CUDA.Net4.x64
```
Note: when using the CUDA version, make sure that the `/LinearAlgebra/cuda/kernel.ptx` file is copied to the output directory (Properties / Copy to Output Directory).
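If you prefer to configure this in the project file rather than the Properties pane, a sketch of the equivalent MSBuild entry looks like the following (the element names are standard MSBuild; adjust the `Include` path to match where the file sits in your project):

```xml
<ItemGroup>
  <!-- Copy the CUDA kernel alongside the build output -->
  <None Include="LinearAlgebra\cuda\kernel.ptx">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```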
It is quite likely that your GPU supports a different CUDA compute capability than the precompiled `kernel.ptx` in this repository. You can look up your card's capability level here; it is a number such as 3.0 or 3.5, which you use when specifying the `compute_XX` and `sm_XX` parameters.
If you get an `ErrorNoBinaryForGPU` exception, you will need to recompile the kernel. The instructions are here.
Example command for the NVIDIA GeForce GTX 770M (compute capability 3.0):

```
nvcc kernel.cu -use_fast_math -ptx -m 64 -arch compute_30 -code sm_30 -o kernel.ptx
```
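To check which architecture a compiled `kernel.ptx` actually targets, you can read its `.target` directive; PTX is a plain-text format, and the directive names the SM architecture the file was compiled for (the path below assumes the repository layout mentioned above):

```shell
# Print the SM architecture a PTX file was compiled for,
# e.g. ".target sm_30" for compute capability 3.0
grep -m 1 '^\.target' LinearAlgebra/cuda/kernel.ptx
```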
You can use Bright Wire on Mono (tested with 4.6.2 on Fedora 25) out of the box; no additional setup is needed.
Bright Wire can also work with CUDA on Mono. When you build your solution, you will need to extract the `ConfigForLinux.zip` archive from here into your output path. That way, CUDA will look for the `libcuda` shared object on Linux rather than for `nvcuda`. You can even run on an Optimus-enabled laptop (tested with a GTX 770M and Bumblebee) with `optirun mono [binary_name]`.
Another issue you may encounter is the `protobuf` library complaining that it already references the `NETStandard` library. The NuGet version bundled with Mono is a bit old, so try the latest NuGet binary from their website; that way, all the libraries get pulled correctly.
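As a sketch (the solution file name is illustrative), you can fetch the standalone NuGet client from the official distribution URL and run it under Mono to restore packages with the up-to-date client:

```shell
# Fetch the latest standalone NuGet client
wget https://dist.nuget.org/win-x86-commandline/latest/nuget.exe
# Restore the solution's packages with the newer client
mono nuget.exe restore MySolution.sln
```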
- Feed Forward, Convolutional, Recurrent and Bidirectional network architectures
- Minibatch training
- L2, Dropout and DropConnect regularisation
- ReLU, Leaky ReLU, Sigmoid and Tanh activation functions
- Gaussian, Xavier and Identity weight initialisation
- Cross Entropy, Quadratic and RMSE cost functions
- Momentum, NesterovMomentum, Adagrad, RMSprop and Adam gradient descent optimisations
- Naive Bayes
- Multinomial Bayes
- Multivariate Bernoulli
- Markov Models
- K Means clustering
- Hierarchical clustering
- Non Negative Matrix Factorisation
- Random Projection
- Regression
- Logistic Regression
- Multinomial Logistic Regression
- Decision Trees
- Random Forest
- Stacking
- K Nearest Neighbour classification
- In-memory and file based data processing
- ManagedCuda (optional)
- MathNet.Numerics
- Protobuf-net