KUnet

KUnet.jl is the beginnings of a deep learning package for Julia, with an emphasis on conciseness, clarity, and easy extensibility. It started as a challenge: how many lines of readable code are sufficient to express deep learning algorithms, given the right language? A secondary concern was efficiency: being able to run the same code on the GPU with minimal trouble. The latest version is less than 1000 lines of code and supports:

  • backprop in feedforward nets with convolution, pooling, and inner product layers, with or without bias
  • relu, tanh, and sigmoid activations
  • softmax and quadratic loss
  • optimization with sgd, momentum, nesterov, and adagrad
  • dropout and L1/L2 regularization
  • both CPU and GPU, with Float32/Float64 arrays of 1-5 dimensions

Its speed is competitive with Caffe (here is a benchmark), and I think recurrent and Boltzmann nets can be added without too much effort.
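To give a concrete feel for the scale and style of code involved, below is a minimal plain-Julia sketch of the core computation: one inner-product layer with relu activation and quadratic loss, updated by a single sgd step. This illustrates the kind of algorithm described above; it is not KUnet's actual API, and every name in it is hypothetical.

```julia
# Sketch of one inner-product layer with relu, quadratic loss, and sgd.
# Hypothetical illustration only; not KUnet's API.

relu(x)  = max.(x, 0)
drelu(x) = float.(x .> 0)        # derivative of relu, elementwise

# Forward pass: y = relu(w*x .+ b)
forw(w, b, x) = relu(w * x .+ b)

function sgd_step!(w, b, x, ytrue; lr=0.01)
    z  = w * x .+ b              # pre-activation
    y  = relu(z)                 # layer output
    dy = y .- ytrue              # gradient of quadratic loss 0.5*norm(y-ytrue)^2
    dz = dy .* drelu(z)          # backprop through relu
    w .-= lr .* (dz * x')        # gradient w.r.t. weights
    b .-= lr .* vec(sum(dz, dims=2))  # gradient w.r.t. bias
    return w, b
end

# Toy usage with random data (10 outputs, 784 inputs, 100 instances):
w, b = randn(Float32, 10, 784), zeros(Float32, 10)
x, ytrue = randn(Float32, 784, 100), rand(Float32, 10, 100)
sgd_step!(w, b, x, ytrue)
```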

You can send me suggestions for improvement (both in coding style and new functionality) using issues or pull requests on GitHub.

I tried to make the code generic across the CPU and GPU, and close to how we think of these algorithms mathematically. Getting the same code working on both the GPU and the CPU in Julia proved to be a bit challenging, and showed that a more standard treatment of CPU and GPU arrays, as well as a standard syntax for in-place operations, would be welcome additions to the language. I'd like to thank Tim Holy (CUDArt), Nick Henderson (CUBLAS), and Simon Byrne (InplaceOps) for their generous help.
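As an illustration of what "generic" means here (a sketch under assumptions, not KUnet's actual source): Julia's multiple dispatch lets an update rule be written once against an abstract array interface, with array-type-specific methods supplying the in-place primitives. The names `update!` and the GPU remarks below are hypothetical.

```julia
# Sketch of CPU/GPU-generic style (hypothetical, not KUnet's source).
# The update rule is written once; only the in-place primitive axpy!
# dispatches on the concrete array type.

using LinearAlgebra: axpy!   # axpy!(a, x, y) computes y .+= a .* x

# Generic sgd update: w .-= lr .* dw, for any array type with an axpy! method.
update!(w::AbstractArray, dw::AbstractArray; lr=0.01) = axpy!(-lr, dw, w)

# Works on plain CPU arrays:
w  = randn(Float32, 10, 784)
dw = randn(Float32, 10, 784)
update!(w, dw)

# For GPU arrays one would add (or inherit) a method dispatching on the GPU
# array type, e.g. with the CUDArt/CUBLAS packages of the time, calling the
# corresponding BLAS axpy on device memory. The training loop itself never
# needs to know which array type it received.
```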
