Carin Meier / Jan 31 2019
Remix of Clojure by Nextjournal

Clojure MXNet Introduction

Remix this to get started with Clojure MXNet

{:deps
 {org.clojure/clojure {:mvn/version "1.10.0-beta8"}
  org.apache.mxnet.contrib.clojure/clojure-mxnet-linux-cpu  {:mvn/version "1.3.1"}
  org.clojure/tools.deps.alpha
  {:git/url "https://github.com/clojure/tools.deps.alpha.git"
   :sha "f6c080bd0049211021ea59e516d1785b08302515"}}}
deps.edn

Apache MXNet (incubating) is a flexible and efficient deep learning library. It also has a Clojure API that we can use.

We are going to be using the pre-built jar for MXNet. This means we need to load in a few dependencies. If you are running locally on your own system, please check the README for the correct dependencies for your setup.

sudo apt-get update || true
sudo apt-get install -y build-essential
sudo apt-get install -y software-properties-common
sudo apt-get install -y libatlas-base-dev 
sudo apt-get install -y libopenblas-dev
sudo apt-get install -y libcurl3
sudo add-apt-repository ppa:timsc/opencv-3.4
sudo apt-get update
sudo apt-get install -y libopencv-imgcodecs3.4 

1. NDArray

The NDArray API contains tensor operations

(require '[org.apache.clojure-mxnet.ndarray :as ndarray])

You can create new NDArrays with a few different functions. You can initialize an NDArray of all zeros with a given shape vector:

(ndarray/zeros [3 3])
#object[org.apache.mxnet.NDArray 0x19211295 "org.apache.mxnet.NDArray@5085876b"]

You can see the contents of the NDArray with ->vec

(-> (ndarray/zeros [3 3]) 
    (ndarray/->vec))
[0 0 0 0 0 0 0 0 0]

You can also see the shape of any NDArray with shape-vec

(-> (ndarray/zeros [3 3]) 
    (ndarray/shape-vec))
[3 3]

You can create an NDArray of ones instead of zeros

(-> (ndarray/ones [3 3]) 
    (ndarray/->vec))
[1 1 1 1 1 1 1 1 1]

You can also specify your own values with array

(-> (ndarray/array [1 2 3 4 5 6] [2 3]) ;;; shape is 2x3
    (ndarray/->vec))
[1 2 3 4 5 6]

You can also do operations on NDArrays, such as addition:

(let [a (ndarray/ones [1 5])
      b (ndarray/ones [1 5])]
  (-> (ndarray/+ a b)
      (ndarray/->vec)))
[2 2 2 2 2]

Or a dot product:

(let [array1 (ndarray/array [1 2] [1 2])
      array2 (ndarray/array [3 4] [2 1])]
  (ndarray/->vec (ndarray/dot array1 array2)))
[11]
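Element-wise multiplication and reshaping follow the same pattern. The snippet below is a quick sketch; the names ndarray/* and ndarray/reshape are assumed from the NDArray API docs:

(let [a (ndarray/array [1 2 3 4] [2 2])
      b (ndarray/array [2 2 2 2] [2 2])]
  ;; element-wise multiplication of two 2x2 NDArrays
  (ndarray/->vec (ndarray/* a b)))
;; => [2.0 4.0 6.0 8.0]

;; reshape a 2x3 NDArray into a 3x2 one and check its shape
(-> (ndarray/array [1 2 3 4 5 6] [2 3])
    (ndarray/reshape [3 2])
    (ndarray/shape-vec))
;; => [3 2]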

There are many more operations; please check the API docs for the rest.

2. Symbolic API

The Symbolic API provides a way to configure computation graphs. You can define these either at the level of whole neural-network layers or as fine-grained operations.

(require '[org.apache.clojure-mxnet.symbol :as sym])

A neural network might look something like this:

(as-> (sym/variable "data") data
      (sym/fully-connected "fc1" {:data data :num-hidden 128})
      (sym/activation "act1" {:data data :act-type "relu"})
      (sym/fully-connected "fc2" {:data data :num-hidden 64})
      (sym/softmax-output "out" {:data data}))
#object[org.apache.mxnet.Symbol 0x9074cdb "org.apache.mxnet.Symbol@9074cdb"]
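You can also ask a network which arguments (inputs, weights, biases, and labels) it expects. This is a sketch that assumes sym/list-arguments, mirroring listArguments in the underlying Symbol API:

(as-> (sym/variable "data") data
      (sym/fully-connected "fc1" {:data data :num-hidden 128})
      (sym/softmax-output "out" {:data data})
      (sym/list-arguments data))
;; => something like ("data" "fc1_weight" "fc1_bias" "out_label")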

Or you can build your own computation graph, for example to add two inputs together:

(let [a (sym/variable "a")
      b (sym/variable "b")]
  (sym/+ a b))
#object[org.apache.mxnet.Symbol 0x41684de4 "org.apache.mxnet.Symbol@41684de4"]

To execute symbols, we first need to define the data they should run on. We do this with bind, which takes a map of free variable names to NDArrays (and optionally a device context) and returns an executor. The executor provides a forward function for evaluation and an outputs function for retrieving the results.

(require '[org.apache.clojure-mxnet.executor :as executor])
(let [a (sym/variable "a")
      b (sym/variable "b")
      c (sym/+ a b)
      ex (sym/bind c {"a" (ndarray/ones [2 2])
                      "b" (ndarray/ones [2 2])})]
  (-> (executor/forward ex)
      (executor/outputs)
      (first)
      (ndarray/->vec)))
[2 2 2 2]

3. Module API

The Module API provides an intermediate- and high-level interface for performing computation with neural networks in MXNet. To showcase it, we can train a classifier on the classic MNIST dataset.

First we need to download the MNIST training data set:

wget http://data.mxnet.io/mxnet/data/mnist.zip
unzip mnist.zip

Now we can take this training data and load it with the help of the MXNet IO package.

(require '[org.apache.clojure-mxnet.io :as mx-io])

First the training data, which consists of images of handwritten digits.

(def train-data (mx-io/mnist-iter {:image "train-images-idx3-ubyte"
                                   :label "train-labels-idx1-ubyte"
                                   :label-name "softmax_label"
                                   :input-shape [784]
                                   :batch-size 10
                                   :shuffle true
                                   :flat true
                                   :silent false
                                   :seed 10}))
user/train-data

We also add the test data. After the model has been trained, it will be tested and scored on images it hasn't seen before.

(def test-data (mx-io/mnist-iter {:image "t10k-images-idx3-ubyte"
                                  :label "t10k-labels-idx1-ubyte"
                                  :input-shape [784]
                                  :batch-size 10
                                  :flat true
                                  :silent false}))
user/test-data
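
Both iterators describe the shapes of the data and labels they will feed to the network. You can inspect them with something like the following sketch, which assumes provide-data and provide-label from the io namespace:

;; inspect the data and label shapes the training iterator provides (sketch)
(mx-io/provide-data train-data)
;; => e.g. [{:name "data" :shape [10 784]}]
(mx-io/provide-label train-data)
;; => e.g. [{:name "softmax_label" :shape [10]}]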

We need to require the module API

(require '[org.apache.clojure-mxnet.module :as m])

Then we will define our model as a network of symbol layers and call fit, the high-level training function, with our data.

(defn get-symbol []
  (as-> (sym/variable "data") data
    (sym/fully-connected "fc1" {:data data :num-hidden 128})
    (sym/activation "relu1" {:data data :act-type "relu"})
    (sym/fully-connected "fc2" {:data data :num-hidden 64})
    (sym/activation "relu2" {:data data :act-type "relu"})
    (sym/fully-connected "fc3" {:data data :num-hidden 10})
    (sym/softmax-output "softmax" {:data data})))
user/get-symbol

It will print the training accuracy as it goes along:

(def my-module (m/module (get-symbol)))
user/my-module
(m/fit my-module {:train-data train-data 
                  :eval-data test-data
                  :num-epoch 5})
#object[org.apache.mxnet.module.Module 0x2d733bfa "org.apache.mxnet.module.Module@2d733bfa"]
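
After training, you can also evaluate the module on the test set. The following sketch assumes the eval-metric namespace and m/score, as used in the library's own MNIST example:

(require '[org.apache.clojure-mxnet.eval-metric :as eval-metric])

;; score the trained module against the held-out test data (sketch)
(m/score my-module {:eval-data test-data
                    :eval-metric (eval-metric/accuracy)})
;; => the metric name and value, e.g. ["accuracy" 0.96]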

4. Wrap Up

This is a brief tour of MXNet. There is much more. Please check out the docs and examples.

You might also want to check out these cool Nextjournal articles on MXNet GANs and on setting up MXNet with a GPU.

If you need help, just shout out in the Clojurians Slack channel #mxnet, or open an issue at https://github.com/apache/incubator-mxnet/issues.