freire-guido/malbec

This repository was archived by the owner on Aug 6, 2021. It is now read-only.

"Machine Learning Backpropagation Execution" is a simple machine learning library for JS
malbec

NOTE: This is just a practice project, usability and user-friendliness are not a priority right now.

'malbec' is a JavaScript package intended to make forward- and back-propagation algorithms easier to run in my machine learning projects. It also works great as a procrastination tool for when I don't feel like studying or doing homework :P

Structure

malbec has one main class, "NNetwork", which is basically an array of "NLayers" with some syntactic sugar. Every NLayer has weights and biases that are randomly initialized (this, and much more, can be changed through parameters) and produces an output array through matrix math.

Creating networks

malbec's main functionality is automatically wiring NLayer inputs and outputs together, which simplifies the NNetwork generation process.

create([size, activation], [size, activation], [size, activation])

The create function receives each layer as a list specifying its number of neurons (size) and its activation function (activation).

var network = malbec.create([5], [4, 'relu'], [2, 'tanh']);
//Creates a neural net with three layers: 5 inputs, 4 hidden neurons and 2 outputs.
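To illustrate the automatic wiring described above, here is a sketch of how layer shapes could be chained: each layer after the first takes its input size from the previous layer's size. This is an assumed mechanic for illustration, not malbec's actual implementation; `layerShapes` is a hypothetical helper.

```javascript
// Hypothetical sketch: derive each layer's (inputs, outputs, activation)
// from a create()-style argument list. Not malbec's actual code.
function layerShapes(...specs) {
  const shapes = [];
  for (let i = 1; i < specs.length; i++) {
    const [size, activation] = specs[i];
    shapes.push({
      inputs: specs[i - 1][0], // input size comes from the previous layer
      outputs: size,
      activation: activation || 'none',
    });
  }
  return shapes;
}

const shapes = layerShapes([5], [4, 'relu'], [2, 'tanh']);
// → [{inputs: 5, outputs: 4, activation: 'relu'},
//    {inputs: 4, outputs: 2, activation: 'tanh'}]
```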

There are currently four activation options: 'relu', 'tanh', 'sigmoid', and none (identity).
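For reference, here is a minimal sketch of what these four options typically compute; malbec's internal implementations may differ.

```javascript
// Standard definitions of the four activation options (illustrative).
const activations = {
  relu: x => Math.max(0, x),            // clamps negatives to zero
  tanh: x => Math.tanh(x),              // squashes into (-1, 1)
  sigmoid: x => 1 / (1 + Math.exp(-x)), // squashes into (0, 1)
  none: x => x,                         // identity: no activation applied
};

console.log(activations.relu(-2));   // 0
console.log(activations.sigmoid(0)); // 0.5
```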

Forward Propagation

Any NNetwork object can produce an output in one line, using the command:

NNetwork.forward(input)

let input = [Math.random(), Math.random(), Math.random(), Math.random(), Math.random()];
let outputs = network.forward(input); // Produces outputs from a list of five random numbers

The size of the input needs to be compatible with the first layer; missing or undefined elements will be treated as zeroes.
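The matrix math behind a single layer's forward pass can be sketched as follows. This is illustrative, not malbec's actual code, and `layerForward` is a hypothetical name; each output is activation(Σ weight·input + bias), with undefined inputs treated as zero as described above.

```javascript
// Illustrative single-layer forward pass: weights is a matrix with one
// row per output neuron, biases has one entry per output neuron.
function layerForward(weights, biases, input, activation = x => x) {
  return weights.map((row, j) => {
    let sum = biases[j];
    for (let i = 0; i < row.length; i++) {
      const x = input[i] === undefined ? 0 : input[i]; // missing => zero
      sum += row[i] * x;
    }
    return activation(sum);
  });
}

// 2 outputs from 3 inputs; the third input is missing and counts as 0.
const out = layerForward([[1, 0, 2], [0, 1, 0]], [0.5, -0.5], [1, 2]);
console.log(out); // [1.5, 1.5]
```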

Back Propagation

coming soon!

Additional Functionality

Encoding and Decoding

NLayers can be encoded into a 'genome', returning their weights and biases as a readable array. This can be used for mutation, crossover and transfer learning.

NLayer.genome()

var genome = network[1].genome(); // Stores weights and biases as an array

These genome objects can be read by other NLayers, setting their parameters to those in the genome.

NLayer.genome = genome

network1[0].genome = network2[0].genome(); // Copies network2's first layer into network1's first layer
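One way such an encoding could work is flattening the weight matrix row by row and appending the biases. This layout is an assumption for illustration; malbec's actual genome format may differ, and `encodeGenome`/`decodeGenome` are hypothetical names.

```javascript
// Hypothetical flat-array genome: weights row-by-row, then biases.
function encodeGenome(weights, biases) {
  return [...weights.flat(), ...biases];
}

// Reverse the encoding, given the layer's shape.
function decodeGenome(genome, numOutputs, numInputs) {
  const weights = [];
  for (let j = 0; j < numOutputs; j++) {
    weights.push(genome.slice(j * numInputs, (j + 1) * numInputs));
  }
  const biases = genome.slice(numOutputs * numInputs);
  return { weights, biases };
}

const g = encodeGenome([[1, 2], [3, 4]], [5, 6]);
console.log(g); // [1, 2, 3, 4, 5, 6]
const { weights, biases } = decodeGenome(g, 2, 2);
```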

Crossover

Multiple NNetworks can be "crossed": their weights and biases are spliced at random intervals, returning a new child network. The NNetworks must have, at minimum, the same layer structure (same number of neurons/outputs per layer). The child network follows the first NNetwork's structure.

crossover(...networks)

var childNet = malbec.crossover(network2, network1);
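At the genome level, splicing at random intervals could look like the sketch below. The splice mechanics here (a fixed switch probability per gene) are assumed for illustration and may not match malbec's exact behavior; `crossoverGenomes` is a hypothetical name.

```javascript
// Sketch of genome crossover: copy genes from one parent, switching to
// the other parent at random splice points. Genomes must be equal length.
function crossoverGenomes(a, b) {
  const child = [];
  let source = a; // start from the first parent (matches the child
                  // following the first network's structure)
  for (let i = 0; i < a.length; i++) {
    if (Math.random() < 0.3) source = source === a ? b : a; // splice point
    child.push(source[i]);
  }
  return child;
}

const child = crossoverGenomes([1, 1, 1, 1], [2, 2, 2, 2]);
// Every gene in the child comes from one of the two parents.
```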

Mutate

Changes the weights or biases of one or more NNetworks to random values according to the given chance. The chance should be a value between 0 and 1, where 1 represents a 100% probability of each gene mutating and 0 represents 0%.

mutate(chance, ...networks)

malbec.mutate(0.5, network1, network2)
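At the genome level, the per-gene mutation described above can be sketched as follows. The replacement distribution (a fresh uniform random value) is an assumption; `mutateGenome` is a hypothetical name.

```javascript
// Sketch of mutation: each gene is replaced with a new random value
// with the given probability (0 = never, 1 = always).
function mutateGenome(chance, genome) {
  return genome.map(gene => (Math.random() < chance ? Math.random() : gene));
}

const unchanged = mutateGenome(0, [0.1, 0.2, 0.3]); // chance 0: identical
const allNew = mutateGenome(1, [0.1, 0.2, 0.3]);    // chance 1: every gene replaced
```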
