LevelDB is ready to rumble: https://github.com/alekcz/konserve-leveldb
did you do it from scratch?
Not entirely. I reused a lot of the code from the other konserve backends I worked on.
it did take a while to get the linear storage layout sorted though
And RocksDB https://github.com/alekcz/konserve-rocksdb
really cool. I want to get into backends soonish as well. I want to write a cassandra-konserve sometime...
Happy to chat about how I approach it when you're ready to get started
will do, thanks!
You mean Reeeeady to REPL! 🙂
I am trying to use datahike server, and I can get the main to run, but I can't create a single transaction
I basically get stuck at the coercion every time.
Would you have an example?
I get a java.lang.AssertionError.
Moreover, I think the uberjar from the repository does not hold the latest dependencies and breaks.
So I made it work, but only with application/edn. Would it be possible to make it work with JSON?
Hi @neo2551 thanks a lot for testing it!
In case it is not mentioned on the repo it is still in early stages.
So it is strange: the datahike server works as expected if I use application/edn, but as soon as I send application/json, I get a bad entity error:
[{:age 35, :name David} {:age 20, :name Myriam}] 20-09-23 19:34:40 amd-3700x ERROR [datahike.db:1054] - Bad entity value 35 at [nil nil :age 35 nil] , value does not match schema definition. Must be conform to: (= (class %) java.lang.Long) {:error :transact/schema, :value 35, :attribute :age, :schema {:db/unique :db.unique/identity, :db/valueType :db.type/long, :db/cardinality :db.cardinality/one, :db/ident :age}}
@neo2551 which client are you using to send the json reqs?
I was using python, with the requests module.
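For reference, here is a minimal Python sketch of sending an EDN transaction instead of JSON. The `/transact` route, the `:tx-data` payload shape, and the tiny EDN serializer are assumptions based on this conversation, not the documented datahike-server API:

```python
# Sketch: build an EDN transaction string from Python data and POST it.
# Assumes a local datahike-server accepting application/edn on /transact;
# the route and payload shape are guesses, not official documentation.

def to_edn(value):
    """Naively render Python data as EDN (covers only the cases used here)."""
    if isinstance(value, bool):
        return "true" if value else "false"
    if isinstance(value, int):
        return str(value)
    if isinstance(value, str):
        # Convention: strings starting with ':' are Clojure keywords.
        return value if value.startswith(":") else f'"{value}"'
    if isinstance(value, dict):
        pairs = " ".join(f"{to_edn(k)} {to_edn(v)}" for k, v in value.items())
        return "{" + pairs + "}"
    if isinstance(value, list):
        return "[" + " ".join(to_edn(v) for v in value) + "]"
    raise TypeError(f"unsupported type: {type(value)}")

tx = [{":name": "David", ":age": 35}, {":name": "Myriam", ":age": 20}]
payload = "{:tx-data " + to_edn(tx) + "}"
# requests.post("http://localhost:3000/transact",
#               data=payload, headers={"Content-Type": "application/edn"})
```

Sending the body as a pre-rendered EDN string with `Content-Type: application/edn` sidesteps the JSON coercion problem, since the server never has to guess types like Long vs. Integer.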
We will be working on JSON support very soon
No problem, I feel honored to help you 🙂
and I think there is some kind of problem with JSON right now. I have not investigated it further, but some types or namespaces are not being translated correctly at the moment.
I also had a problem with the JDBC backend: I had to add the dependency manually in datahike-server.core, otherwise it would not be found.
I will continue to study Datahike; it seems my team is going to try to use it in a polyglot setting as well (Python and R).
That's our aim for the remote server, so people from other environments may use Datahike. For me the question for JSON is about how to represent datalog in it and how to handle namespaces and keywords.
I would enforce ":namespace/name" in JSON keys.
I don't really know how this would interact with Clojure though.
Maybe we could use transit in other languages?
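To illustrate the ":namespace/name" idea, here is a hedged Python sketch of how a client could carry Clojure keywords through plain JSON. This is just the convention proposed above, not anything Datahike actually implements:

```python
import json

# Convention sketch: Clojure keywords travel as JSON strings ":namespace/name".
# Purely illustrative; not part of Datahike's actual JSON handling.

def encode_keys(obj):
    """Recursively turn dict keys into ':ns/name'-style keyword strings."""
    if isinstance(obj, dict):
        return {(k if k.startswith(":") else ":" + k): encode_keys(v)
                for k, v in obj.items()}
    if isinstance(obj, list):
        return [encode_keys(v) for v in obj]
    return obj

schema = {"db/ident": "age", "db/valueType": "db.type/long"}
wire = json.dumps(encode_keys(schema), sort_keys=True)
# wire == '{":db/ident": "age", ":db/valueType": "db.type/long"}'
```

The server side would then need the inverse mapping (":" prefix back to keyword), which is exactly where transit could save everyone the trouble, since it encodes keywords natively.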
Thanks a lot for your work.
Great David. That is very good to hear. We are working a lot on improving Datahike and Datahike server is a very important project for us.
If you can describe your problem with JDBC a bit more in detail I could investigate it some more. But we will do some more testing with JDBC as well of course.
I tried to use Datahike server with a JDBC backend (if you could migrate to deps.edn, that would be nice as well), and the uberjar would not start because datahike-jdbc.core was not required (because it was AOT compiled, I guess?).
I solved it by requiring Datahike-jdbc in main.
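For anyone hitting the same issue, the workaround amounts to an explicit require in the server's ns form, roughly like this (the exact namespace layout is assumed from this conversation, not from the datahike-server source):

```clojure
(ns datahike-server.core
  (:require [datahike.api :as d]
            ;; Workaround: require the JDBC backend explicitly so its
            ;; store registration runs in the AOT-compiled uberjar.
            [datahike-jdbc.core]))
```

Without the explicit require, nothing in the AOT-compiled entry point loads the backend namespace, so its side-effecting registration never happens.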
Actually, we are migrating to clj tools with deps.edn as part of the CLJS support that is in the works.
🙂