public NetworkDouble(IEnumerable<IHiddenLayer<double>> hiddenLayers, IOutputLayer<double> outputLayer)
{
    this._hiddenLayers = new List<IHiddenLayer<double>>();
    this._hiddenLayers.AddRange(hiddenLayers);
    this._outputLayer = outputLayer;
}
protected IOutputLayer CreateOutputLayer(LayerConfig outputLayerConfig, int sparseFeatureSize, int denseFeatureSize)
{
    IOutputLayer outputLayer;
    switch (outputLayerConfig.LayerType)
    {
        case LayerType.SampledSoftmax:
            Logger.WriteLine("Create sampled softmax layer as output layer.");
            outputLayer = new SampledSoftmaxLayer(outputLayerConfig as SampledSoftmaxLayerConfig);
            // Sampled softmax does not use sparse features, so the sparse size is 0.
            outputLayer.InitializeWeights(0, denseFeatureSize);
            break;
        case LayerType.Softmax:
            Logger.WriteLine("Create softmax layer as output layer.");
            outputLayer = new SoftmaxLayer(outputLayerConfig as SoftmaxLayerConfig);
            outputLayer.InitializeWeights(sparseFeatureSize, denseFeatureSize);
            break;
        case LayerType.Simple:
            Logger.WriteLine("Create simple layer as output layer.");
            outputLayer = new SimpleLayer(outputLayerConfig as SimpleLayerConfig);
            outputLayer.InitializeWeights(sparseFeatureSize, denseFeatureSize);
            break;
        default:
            // Fail fast on unsupported layer types instead of dereferencing a null layer below.
            throw new ArgumentException($"Unsupported output layer type: {outputLayerConfig.LayerType}");
    }

    outputLayer.LabelShortList = new List<int>();
    return outputLayer;
}
public override void CreateNetwork(List<LayerConfig> hiddenLayersConfig, LayerConfig outputLayerConfig, DataSet<T> trainingSet, Config featurizer)
{
    var forwardHiddenLayers = CreateLayers(hiddenLayersConfig);
    var backwardHiddenLayers = CreateLayers(hiddenLayersConfig);

    for (var i = 0; i < hiddenLayersConfig.Count; i++)
    {
        ILayer forwardLayer = forwardHiddenLayers[i];
        ILayer backwardLayer = backwardHiddenLayers[i];

        // The first hidden layer reads the training set's dense features; deeper layers
        // read the concatenated forward and backward outputs of the previous layer,
        // hence the factor of 2.
        var denseFeatureSize = trainingSet.DenseFeatureSize;
        if (i > 0)
        {
            denseFeatureSize = forwardHiddenLayers[i - 1].LayerSize * 2;
        }

        forwardLayer.InitializeWeights(trainingSet.SparseFeatureSize, denseFeatureSize);
        backwardLayer.InitializeWeights(trainingSet.SparseFeatureSize, denseFeatureSize);

        forwardLayer.SetRunningMode(RunningMode.Training);
        backwardLayer.SetRunningMode(RunningMode.Training);

        Logger.WriteLine($"Create hidden layer {i}: size = {forwardLayer.LayerSize}, sparse feature size = {forwardLayer.SparseFeatureSize}, dense feature size = {forwardLayer.DenseFeatureSize}");
    }

    outputLayerConfig.LayerSize = trainingSet.TagSize;
    IOutputLayer outputLayer = CreateOutputLayer(outputLayerConfig, trainingSet.SparseFeatureSize, forwardHiddenLayers[forwardHiddenLayers.Count - 1].LayerSize * 2);
    outputLayer.SetRunningMode(RunningMode.Training);

    Logger.WriteLine($"Create a bi-directional recurrent neural network with {forwardHiddenLayers.Count} hidden layers. Forward and backward layers are concatenated.");

    InitCache(forwardHiddenLayers, backwardHiddenLayers, outputLayer);
}
public void OutputLayer_WithSortOrder_InitializesProperty()
{
    // Arrange
    _layer = new OutputLayer(5);

    // Assert
    Assert.AreEqual(5, _layer.SortOrder);
}
public void Initialize()
{
    _network = new DFFNeuralNetwork(_inputLayerNeuronCount, _hiddenLayersCount, _hiddenLayerNeuronCount, _outputLayerNeuronCount);
    _inputLayer = _network.Layers.OfType<IInputLayer>().First();
    _outputLayer = _network.Layers.OfType<IOutputLayer>().First();
    _hiddenLayer = _network.Layers.OfType<IHiddenLayer>().First();
    _trainingIterations = new List<INetworkTrainingIteration>();
}
public NetworkTopology()
{
    hiddenLayers = null;
    inputLayer = null;
    outputLayer = null;
    preProcessor = null;
    postProcessor = null;
    TrainingPreProcessor = null;
    TrainingAlgorithm = null;
}
public void OutputLayer_WithSortOrderAndNeurons_InitializesProperty()
{
    // Arrange
    var outputNeuron = new OutputNeuron();
    _layer = new OutputLayer(5, new List<IOutputNeuron> { outputNeuron });

    // Assert
    Assert.AreEqual(1, _layer.Neurons.Count());
    Assert.AreSame(outputNeuron, _layer.Neurons.First());
}
protected void InitCache(List<ILayer> s_forwardRNN, List<ILayer> s_backwardRNN, IOutputLayer outputLayer)
{
    forwardHiddenLayers = s_forwardRNN;
    backwardHiddenLayers = s_backwardRNN;

    // Initialize output layer
    OutputLayer = outputLayer;

    forwardCellList = new List<Neuron[]>();
    backwardCellList = new List<Neuron[]>();
    for (int i = 0; i < numOfLayers; i++)
    {
        var forwardCells = new Neuron[MaxSeqLength];
        var backwardCells = new Neuron[MaxSeqLength];
        for (int j = 0; j < MaxSeqLength; j++)
        {
            // Allocate the neuron type that matches the layer type at this depth.
            if (forwardHiddenLayers[i] is DropoutLayer)
            {
                forwardCells[j] = new DropoutNeuron(forwardHiddenLayers[i].LayerSize);
                backwardCells[j] = new DropoutNeuron(forwardHiddenLayers[i].LayerSize);
            }
            else if (forwardHiddenLayers[i] is LSTMLayer)
            {
                forwardCells[j] = new LSTMNeuron(forwardHiddenLayers[i].LayerSize);
                backwardCells[j] = new LSTMNeuron(forwardHiddenLayers[i].LayerSize);
            }
            else
            {
                forwardCells[j] = new Neuron(forwardHiddenLayers[i].LayerSize);
                backwardCells[j] = new Neuron(forwardHiddenLayers[i].LayerSize);
            }
        }

        forwardCellList.Add(forwardCells);
        backwardCellList.Add(backwardCells);
    }

    OutputCells = new Neuron[MaxSeqLength];
    for (var i = 0; i < MaxSeqLength; i++)
    {
        OutputCells[i] = new Neuron(OutputLayer.LayerSize);
    }

    InitLayersOutputCache();
}
public void Initialize()
{
    _layer = new OutputLayer(1);
}