
GPU JavaScript Library for Machine Learning

This library grew out of investigating how to avoid falling back to the CPU when performing GPU neural network inference and training across the network's layers.

The solution is to use an adjacency matrix, which makes the connections between neurons easy to look up on the GPU. Adjacency matrices are more commonly used in GPU-based real-time graph systems, but I have not seen any library applying them to neural networks on the GPU.

Using an adjacency matrix imposes a limit of 4096 neurons, but in exchange the leak to the CPU is avoided in the middle layers: the CPU read for inference is performed only on the last (output) layer. For full WebGL applications that need a neural network, this technique allows the entire network to execute on the GPU.
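To make the idea concrete, here is a minimal CPU-side sketch of how a feed-forward network can be encoded in a single adjacency matrix so that one propagation kernel serves every layer. All function names are illustrative, not gbrain's actual API, and the weight initialization is an assumption.

```javascript
// Sketch (CPU-side, illustration only): encode a feed-forward network as one
// adjacency matrix, as gbrain does on the GPU. Names are NOT gbrain's API.
function makeAdjacency(layerSizes) {
  const n = layerSizes.reduce((a, b) => a + b, 0); // total neurons (<= 4096 in gbrain)
  const adj = Array.from({ length: n }, () => new Float32Array(n));
  let offset = 0;
  for (let l = 0; l < layerSizes.length - 1; l++) {
    const fromStart = offset;
    const toStart = offset + layerSizes[l];
    for (let i = 0; i < layerSizes[l]; i++)
      for (let j = 0; j < layerSizes[l + 1]; j++)
        adj[fromStart + i][toStart + j] = Math.random() * 0.1; // small random weight (assumed init)
    offset += layerSizes[l];
  }
  return adj;
}

// One propagation step: every neuron reads its inputs straight from the
// matrix, so a GPU kernel can advance layer by layer without any CPU readback
// until the output layer is finally read.
function propagate(adj, activations) {
  const n = adj.length;
  const next = new Float32Array(n);
  for (let j = 0; j < n; j++) {
    let sum = 0;
    for (let i = 0; i < n; i++) sum += activations[i] * adj[i][j];
    next[j] = sum > 0 ? sum : 0.01 * sum; // leaky ReLU, see "How it works"
  }
  return next;
}
```

On the GPU the same lookup is a texture fetch per matrix cell, which is why the relation between any two neurons stays cheap to query.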

Batches are executed the same way, allowing the injection of 7 direct experiences at the same time per tick for training, or at least 7 direct inferences at the same time per tick and WebGL context.

Each single network can handle up to 7 states (training or test). If you want more than 7 agents, set the agentsCount option and the network will be cloned, scaling up in multiples of 7 (at this moment only for regression). When training, agentsCount must not be greater than 7.
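The arithmetic behind the 7-state limit can be sketched as small helpers. These functions are illustrative only, not part of gbrain's API; only the agentsCount limit and the multiples-of-7 cloning come from the text above.

```javascript
// Sketch: split agent states into batches of at most 7, matching the
// per-tick limit described above. Helper names are illustrative, not gbrain API.
function batchAgents(states, batchSize = 7) {
  const batches = [];
  for (let i = 0; i < states.length; i += batchSize)
    batches.push(states.slice(i, i + batchSize));
  return batches;
}

// A network cloned "in multiples of 7" needs this many clones to cover
// agentsCount agents:
function clonesNeeded(agentsCount, batchSize = 7) {
  return Math.ceil(agentsCount / batchSize);
}
```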

The reinforcement learning class and the plot system come from Karpathy's RL module (ConvNetJS).

- Basic Linear Regression

- ConvNetJS Reinforcement Learning demo integration (with a WebASM version)

- Reinforcement Learning + convolution

- MNIST Classification

How it works

There is no need for element-wise matrices, nor to send information back to the CPU for every layer's result.

During backpropagation, the weight data is updated directly in the adjacency matrix.
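A minimal sketch of that in-place update, assuming a plain gradient-descent step (the actual update rule and learning rate are not specified in this README):

```javascript
// Sketch: SGD-style weight update written directly into the adjacency matrix,
// so in the real implementation weights never leave GPU memory.
// The update rule and the default learning rate are assumptions.
function updateWeights(adj, grads, lr = 0.01) {
  for (let i = 0; i < adj.length; i++)
    for (let j = 0; j < adj[i].length; j++)
      adj[i][j] -= lr * grads[i][j]; // in-place: the matrix IS the weight store
  return adj;
}
```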

The activation function is computed inside the neuron itself, avoiding the need to propagate values through a separate activation layer, so we gain better performance.

At this moment only the leaky-ReLU activation function is implemented.


```shell
npm install gbrain
```

```html
<script src="../../dist/gbrain/Graph.class.js"></script>
<script src="../../dist/gbrain/KERNEL_DIR.class.js"></script>
<script src="../../dist/gbrain/KERNEL_ADJMATRIX_UPDATE.class.js"></script>
<script src="../../dist/gbrain/VFP_NODE.class.js"></script>
<script src="../../dist/gbrain/VFP_NODEPICKDRAG.class.js"></script>
<script src="../../dist/gbrain/ProccessImg.class.js"></script>
<script src="../../dist/gbrain/gbrain.js"></script>
<script src="../../dist/gbrain/gbrain-rl.js"></script>
<script src="../../dist/gbrain/gbrain.min.js"></script>
```


I was able to learn about this algorithm thanks to Andrew Ng's Machine Learning course, Karpathy's ConvNetJS, Matt Mazur's paper, Prakash Jay's tutorial, Miguel Ángel Lobato, and the users who shared information on the internet. Thanks.
