@ronb you mentioned that your version of DataScript is not backwards compatible. What are the breaking changes?
@kiemdoder
— Attributes are stored as integers; (d/datom 0 :attr :val) is not allowed anymore
— Queries return attributes as integers, not keywords
— You need to define the attribute order in the schema: {:attr {:db/order 0}}
— Transaction IDs are created from 2^24 - 1 downwards
— New range for entity IDs: 0 to 2^24
— New range for valid transaction IDs: 2^24 to (2^24) - 2^20
Every attribute is represented as an ID (similar to Datomic). All attributes need a :db/order key in the schema:
{:attr {:db/order 0} :attr2 {:db/order 1}}
More Details: https://github.com/tonsky/datascript/pull/263
Will do some benchmarking and better docs. @tonsky Thanks for the reply on the pull request. I will reply later
I'd say, semantically, the biggest setbacks for existing users would be:
— lack of a (datom) call with a keyword attribute
— unexpected integers in query results (although I believe impl can be changed to return keywords)
— the need to specify all attributes in the schema (:db/order could probably be calculated, but it requires all attributes to be specified explicitly)
@tonsky I completely agree. Right now the backwards incompatibilities are too big. If I address the issues you mentioned, would you be willing to merge the pull request, or do you prefer a separate project?
most incompatibilities could be addressed with code in db/-search and (transact-add), but the changes to how the (datom) constructor works would remain and would probably confuse users
I don't see how we can work around requiring that every attribute must be specified in the schema. So far it's not required, and I believe most users rely on that
And then there's the JS story: JS lacks 64-bit integers, so that schema wouldn't work as well there :(
it feels like a JVM-only, separate project with no backwards-compatibility guarantees would be best
Yeah, it would require additional logic in transaction handling to add new attributes on demand, and it's a bit messy 😄
I use 64-bit floats to guarantee compatibility with JS environments. All tests pass on both the JVM and JS as far as I can see
No problem. I will close the PR later. Thanks for the feedback
I was under the impression that JS bit operations only work on 32-bit ints? https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Bitwise_Operators
You can cheat if you do the following: (/ num (js/Math.pow 2 32)) -> gives access to the upper part of float numbers in the lower 32-bit region
for changing the upper part you can do (+ (* upper (js/Math.pow 2 32)) lower-bits)
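In plain JS, the trick above looks something like this — a sketch only, with helper names that are mine, not from the PR; it assumes non-negative integers below 2^53, where doubles are still exact:

```javascript
// Split a float-stored integer into 32-bit halves via division/modulo,
// the plain-JS equivalent of the ClojureScript snippets above.
// Helper names are illustrative, not DataScript code.
const TWO_32 = Math.pow(2, 32);

// Upper bits: integer division by 2^32 (exact for values below 2^53).
function upperBits(num) {
  return Math.floor(num / TWO_32);
}

// Lower 32 bits: modulo 2^32 ((num >>> 0) would also work here).
function lowerBits(num) {
  return num % TWO_32;
}

// Recombine: upper * 2^32 + lower.
function combine(upper, lower) {
  return upper * TWO_32 + lower;
}

const n = 0x123 * TWO_32 + 0x456; // a value that needs more than 32 bits
console.log(upperBits(n));        // 291  (0x123)
console.log(lowerBits(n));        // 1110 (0x456)
console.log(combine(291, 1110) === n); // true
```

Note that the bitwise operators themselves still truncate to 32 bits, which is exactly why the division/multiplication route is needed for the high half.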
isn’t it like SUPER slow?
as far as I can see this conforms to the IEEE floating-point spec
Will test the performance in the evening, but I compared this with bit-shift operators and they have similar performance. Looks like JS engines optimize (/ num some-binary-exponent) into (bit-shift-right ...)
^^ maybe helpful?
btw maybe instead of trying to fit everything into one 64-bit value, it might be easier to fit everything into two 32-bit values? memory consumption would be the same
that's what goog.math.Long
and therefore ^^ does
No, I mean, there’s no reason to pretend it should look like single 64-bit at any point
depends what you want to do
what @ronb is doing
yeah
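The two-32-bit-value idea could look something like this — a sketch of the hi/lo representation goog.math.Long uses internally, not actual DataScript code, with illustrative names:

```javascript
// A "64-bit" key kept as two 32-bit halves, the way goog.math.Long
// stores hi/lo fields. Names are assumptions for illustration only.
function makeKey(hi, lo) {
  // >>> 0 forces both halves into unsigned 32-bit range
  return { hi: hi >>> 0, lo: lo >>> 0 };
}

// Lexicographic compare: high half first, then low half. This gives
// the same ordering as comparing the full 64-bit values would.
function compareKeys(a, b) {
  if (a.hi !== b.hi) return a.hi < b.hi ? -1 : 1;
  if (a.lo !== b.lo) return a.lo < b.lo ? -1 : 1;
  return 0;
}

const k1 = makeKey(1, 0);             // "2^32"
const k2 = makeKey(0, 0xffffffff);    // "2^32 - 1"
console.log(compareKeys(k1, k2));     // 1 — higher high half wins
```

With this layout there is never a point where the two halves have to be glued back into one number, which is what "no reason to pretend it should look like a single 64-bit" amounts to.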
@thedavidmeister Looks interesting, thanks for the link. I looked into a goog.math.Long-based solution, but if I understand correctly how JS engines work, it would use twice the memory of a single double
although maybe I'm mistaken and engines actually detect that you only use 32 bits of the number and optimize the rest away
this benchmark compares a bit-shift operation with division and shows a 30% performance advantage for the shift-based operation, which is not significant enough to be detectable in my use case, as performance is largely memory-bound
just tested memory usage of 32-bit numbers and values that go beyond 32 bits: they both use the same amount of memory
@ronb uploaded a file: https://clojurians.slack.com/files/U0HCV4QMQ/FAU456KMJ/memory_consumption_of_js_integers.js