A Distributed Deep Learning framework running on Apache Spark

What is dllib?

dllib is a simple distributed deep learning framework running on Apache Spark. It aims to enable users and developers to run deep learning algorithms easily on Apache Spark.

Designed for developers

dllib is designed to make implementing new optimizers or network layers easy for developers. The codebase is kept simple and easy to read.

Time saver

You can start using dllib with a single command line if you already have a Spark cluster. The package can be installed via Spark Packages.

Easy to customise

The entire codebase is written in Scala, so you can extend the algorithms with any JVM language.
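As an illustration of this extension pattern, a custom layer might look like the sketch below. Note that the `Layer` trait and method names here are hypothetical examples, not dllib's actual API; they only show how a JVM-language class can plug new behavior into a Scala framework.

```scala
// Hypothetical sketch: the Layer trait below is illustrative only,
// and may differ from dllib's real interfaces.
trait Layer {
  // Maps an input vector to an output vector.
  def forward(input: Array[Double]): Array[Double]
}

// A custom ReLU activation layer implemented against the hypothetical trait.
class ReLULayer extends Layer {
  override def forward(input: Array[Double]): Array[Double] =
    input.map(x => math.max(0.0, x))
}

object Demo extends App {
  val layer = new ReLULayer
  println(layer.forward(Array(-1.0, 2.0, -3.0, 4.0)).mkString(","))
  // prints "0.0,2.0,0.0,4.0"
}
```

Because any JVM language can implement a Scala trait, the same layer could be written in Java or Kotlin as well.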

Get Started

Spark Packages

Installing via Spark Packages

        $ ./bin/spark-shell --packages Lewuathe:dllib:0.0.8


dllib is distributed under the Apache License v2. See LICENSE for more detail.


dllib will help you understand deep learning algorithms on Spark.
Feel free to get in touch if you have any questions or suggestions.

I'm a software engineer focused on distributed systems such as Hadoop and Spark.

I am a contributor to the Apache Hadoop and Spark projects, and I am interested in deep learning. We can and should combine these two fields to research and develop new algorithms on huge, realistic datasets. That is why I developed dllib.

Kai Sasaki
Software Engineer
