Tutorials inbound? 🙂
soonish
@mkvlr I've just checked and GPU is not advertised as being included in the free plan. Is it just available on a case-by-case basis until it becomes too popular to remain viable, or do you plan to offer it publicly? In any case, it is an awesome feature, especially for testing and demonstrations. I would be interested in porting a few tutorials that demonstrate how to do GPU stuff in Clojure (not immediately, due to work overload, but in a month or two). Would that be viable? It only makes sense if a large(ish) number of people are able to try it, but I'm afraid it would be a drag on your resources.
Do you think libpython-clj is production ready?
I don't see why not - the facial-rec link above shows you how to put it together with a conda/Docker container
The surface area of the interop library itself is pretty small
after that - it's a matter of whether you think the Python libs themselves are prod ready 🙂
But - you would be on the bleeding edge - which implies the chance of blood
Depends on your risk tolerance
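For a concrete sense of how small that surface is, the whole workflow looks something like this - a minimal sketch, assuming libpython-clj is on the classpath and numpy is installed in the active Python environment:

(require '[libpython-clj.python :as py]
         '[libpython-clj.require :refer [require-python]])

;; start the embedded python (auto-detects the active environment)
(py/initialize!)

;; import a python module as a clojure namespace
(require-python '[numpy :as np])

;; call python functions like ordinary clojure functions
(np/mean (np/array [1 2 3 4]))                 ;=> 2.5
(np/dot (np/array [1 2 3]) (np/array [4 5 6])) ;=> 32

Beyond require-python and a handful of helpers in libpython-clj.python (call-attr, get-attr, ->jvm and friends), there isn't much more API to learn.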
I understand, I think Python envs are a real mess compared with Clojure
mainly about dependencies
Yes but it is a well known mess
I couldn't use CUDA libs with Python yet because conda/python3.6/python3.7 ... make me very confused
besides using it on the Windows OS
I'm not arguing with you 🙂 - it's another option in the tradeoff matrix that you have to weigh when deciding how to build things
@fabrao - I think using the GPU from Windows adds another major layer of complexity. The pathway through GPU Docker with Ubuntu is at least well known and tested, and then when you deploy things you are at least not changing operating systems.

In my opinion, the major simplification to all of this is Docker. Without it you are fighting a lot of environment issues, from CUDA driver versions to Python versions to plenty of other things, and those will make it really hard to move forward. So Docker is your friend here, but GPU Docker and Windows... not a good mix from my experience. GPU Docker and Linux is doable, but it also has some caveats. Once that is working, layering GPU-enabled conda, MXNet, the JVM, potentially Neanderthal, and Clojure should also be within reach. Hmmm. Maybe that should be a GitHub project right there.

So - if you want GPU, you are going to have environment battles whether you use Python or Clojure. Windows makes these things much harder. Docker is designed specifically to solve the environment problem. Docker with GPU is a bit more involved than Docker without GPU, but it does work fine. Do you intend to test/deploy on Azure?
I intend to deploy it for desktop-based running.
How do you work with CV with CUDA for local processing?
Will Docker see the GPU in this environment?
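One quick way to answer that from inside a running container is to ask a CUDA-aware library whether it can see a device - a sketch, assuming libpython-clj plus a CUDA build of PyTorch are installed in the image (any CUDA-enabled framework would do):

(require '[libpython-clj.require :refer [require-python]])

;; torch.cuda reports what the runtime can actually see
(require-python '[torch.cuda :as cuda])

(cuda/is_available)      ;=> true only if the container can see the GPU
(cuda/device_count)      ;=> number of visible devices
(cuda/get_device_name 0) ;=> the card's model name

If is_available comes back false inside Docker, it is usually the NVIDIA container runtime / driver wiring on the host rather than anything on the Clojure side.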
Are there any docs on how to use GPU Docker containers with Clojure/libpython-clj or Neanderthal? I always get into trouble just installing the CUDA DevKit on Ubuntu.
I just got the Huggingface GPT2 text generation working 🙂 https://gist.github.com/gigasquid/f276693bf3519a98afd4ed722edf55ec
Just putting it out here for now - will put together a blog post to go over it in more detail
(generate-text "Clojure is a dynamic, general purpose programming language, combining the approachability and interactive" 20)
;=> "Clojure is a dynamic, general purpose programming language, combining the approachability and interactive. It is a language that is easy to learn and use, and is easy to use for anyone"