Hot off the press from AWS - deep learning library for Java
Haven't looked at it deeply yet - but looks like another avenue for us to leverage
From the contributor list - a lot of MXNet AWS folks involved
It looks like this might be the official announcement: https://towardsdatascience.com/introducing-deep-java-library-djl-9de98de8c6ca
;; deps.edn
{:deps {org.apache.logging.log4j/log4j-slf4j-impl {:mvn/version "2.12.1"}
        ai.djl.mxnet/mxnet-model-zoo {:mvn/version "0.2.0"}
        ai.djl.mxnet/mxnet-native-mkl$osx-x86_64 {:mvn/version "1.6.0-a"}}}
;; example
(ns clj-djl.core
  (:require [clojure.java.io :as io])
  (:import (ai.djl.modality.cv.util BufferedImageUtils)
           (ai.djl.mxnet.zoo MxModelZoo)
           (ai.djl.training.util ProgressBar)
           (ai.djl.modality.cv ImageVisualization)
           (javax.imageio ImageIO)))
(defn example []
  (let [img (BufferedImageUtils/fromUrl "https://raw.githubusercontent.com/dmlc/web-data/master/gluoncv/pose/soccer.png")]
    ;; ZooModel and Predictor both implement AutoCloseable
    (with-open [model (.loadModel (MxModelZoo/SSD) (ProgressBar.))
                predictor (.newPredictor model)]
      (let [predict-result (.predict predictor img)]
        ;; draw the detected objects onto the image and save it
        (ImageVisualization/drawBoundingBoxes img predict-result)
        (ImageIO/write img "png" (io/file "ssd.png"))))))
I got a picture :)
the Java API interops nicely
learned how to do a Maven classifier in deps.edn too :)
Predictor implements AutoCloseable too, so it needs closing - I forgot that at first (the example above now closes it with with-open)
not meaning to start a wrapper, but put the above example into a repo: https://github.com/viesti/clj-djl/blob/master/src/clj_djl/core.clj
the Java API might not need a wrapper at all, since interop works nicely
Great! Thanks for blazing the way in trying it out :) I plan to check it out too in the near future
It seems great. (Wished it was more functional though) xD
More tinkering (was looking at MXNet tutorials: https://mxnet.apache.org/api/python/docs/tutorials/getting-started/crash-course/1-ndarray.html):
clj-djl.core> (do (import (ai.djl.ndarray NDManager))
                  (import (ai.djl.ndarray.types Shape))
                  (with-open [nd-manager (NDManager/newBaseManager)]
                    (println (.randomUniform nd-manager 1 -1 (Shape. [2 2])))))
#object[ai.djl.mxnet.engine.MxNDArray 0x7b62d17b ND: (2, 2) cpu(0) float32
[[ 0.1527, -0.2471],
[-0.2918, 0.2312],
]
]
seems that djl has a JNA wrapper around MXNet
which is quite interesting, they have a tool to generate JNA mappings from the MXNet C header file: https://github.com/awslabs/djl/tree/master/mxnet/jnarator
I wonder how this compares to the hand-written JNI in the Scala/Java bindings of MXNet: https://github.com/apache/incubator-mxnet/blob/master/scala-package/native/src/main/native/org_apache_mxnet_native_c_api.cc
this generator tool reminds me of SWIG (http://www.swig.org/), although that generates straight JNI from C headers
adopting new features from MXNet might be faster via a generator
interesting - @chris441 brought up JNA for TVM / MXNet a bit ago https://discuss.tvm.ai/t/clojure-bindings-for-tvm/1127
This:
> We would also like to see more java integrations to DMLC projects that allow easier access to multiple languages; ones that force unnecessary dependencies on the scala compiler also force dependencies on the root jvm runtime (mxnet, we are looking at you).
The low-level mappings are great - but there is a lot of other work built on the higher-level inference / training / dataset stuff too
At least in the main MXNet library
That said - it definitely is a nice accessible point of integration
definitely could be useful for new stuff
Please feel free to keep experimenting and pursue any good avenues. I don't have a lot of time just now to dive in. Probably won't have any until after the holidays :)