@blueberry there may be a bug when creating large vectors: (neanderthal-native/dv (take 100000000 (range)))
I get GC out of memory... I know Clojure hanging onto the head of lazy sequences can cause this kind of behaviour. I'm using a MacBook Pro with 8 GB of RAM.
@adebesin works on my machines like a charm, and fast too. This is what I tried: (dv (range 100000000))
@adebesin please check your JVM settings. It may be that the JVM is limited in memory and is not allowed to access all 8 GB. The only constraint in Neanderthal is the one all Java buffers and arrays have: roughly 2 billion elements (2^31 - 1) per raw structure.
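As a side note, a quick REPL check of the heap ceiling (plain Java interop, nothing Neanderthal-specific; a minimal sketch):

```clojure
;; Maximum heap the JVM will attempt to use, in bytes.
;; Direct (off-heap) buffers are capped separately by
;; -XX:MaxDirectMemorySize, which by default equals the max heap.
(.maxMemory (Runtime/getRuntime))
;; e.g. => 2147483648 would mean the heap is capped at 2 GB
```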
@blueberry It worked for me too, but if I run it multiple times (4) I get: OutOfMemoryError Direct buffer memory java.nio.Bits.reserveMemory (Bits.java:658)
That must have been what was happening.
I tried switching IDEs from IntelliJ with Cursive to Emacs, and it seemed to help a little since IntelliJ hogs a lot of resources, but I still managed to run out of memory.
I'll have a look at the JVM settings.
P.S. it was fast when it worked 🙂
@adebesin Ahaaaaaa. But that's another thing. See, these use direct memory, which means that GC DOES clean it up, but only when it collects the vector object itself. If you create lots of huge vectors, you will fill up the direct memory before GC even starts cleaning (because you only ever created a handful of on-heap objects), and there is no way in the JVM to force a GC run. That's why you have to help it a bit by calling release when you do not need that vector any more, or use the with-release macro. But creating and destroying lots of 1 GB vectors may not be a great idea anyway. Think about reusing that structure.
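A minimal sketch of the two approaches blueberry describes, assuming the standard uncomplicate.commons.core and uncomplicate.neanderthal.native namespaces (the sum call is just a stand-in computation):

```clojure
(require '[uncomplicate.commons.core :refer [with-release release]]
         '[uncomplicate.neanderthal.native :refer [dv]]
         '[uncomplicate.neanderthal.core :refer [sum]])

;; with-release frees the direct buffer as soon as the body finishes,
;; so repeated evaluations never pile up unreclaimed ~800 MB buffers.
(with-release [x (dv (range 100000000))]
  (sum x))

;; Equivalent manual form: call release yourself when you're done.
(let [x (dv (range 100000000))]
  (try
    (sum x)
    (finally (release x))))
```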
@adebesin these options may be also helpful: "-XX:MaxDirectMemorySize=8g" "-XX:+UseLargePages"
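For reference, one way to wire those flags in would be Leiningen's :jvm-opts; a hedged sketch, with the project name and dependency versions below purely illustrative:

```clojure
;; project.clj -- :jvm-opts passes flags to the JVM Leiningen launches.
(defproject my-app "0.1.0-SNAPSHOT"
  :dependencies [[org.clojure/clojure "1.9.0"]
                 [uncomplicate/neanderthal "0.18.0"]]
  :jvm-opts ["-XX:MaxDirectMemorySize=8g"
             "-XX:+UseLargePages"])
```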
aaah got it 🙂 it was just that I was trying stuff in the REPL; I wasn't necessarily creating lots of large vectors. Anyway, I'll take your advice, thanks. I'll see if the JVM settings work.