datahike

https://datahike.io/, Join the conversation at https://discord.com/invite/kEBzMvb, history for this channel is available at https://clojurians.zulipchat.com/#narrow/stream/180378-slack-archive/topic/datahike
magra 2020-02-13T16:41:25.097100Z

Hi! If I define a schema of, e.g., 130 datoms and then people add data to Datahike, the data will start at :db/id 131. If I later extend (not change, just extend) the schema, the new attributes will get :db/ids of, e.g., 2400. If people then use one of the newly defined attributes on an entity with an id below 2400, such a database will not be re-importable once exported. If I follow the suggestions in the docs on GitHub, I export and re-import, loading the new schema first; but in that case the new schema will need more than 130 datoms. How do I solve this without breaking all the refs of the data users entered? Should the schema perhaps start at a different number than 1, so that it doesn't compete with the ids of the data users enter? Please tell me I am missing something.

magra 2020-02-13T17:23:47.098500Z

Or: does the load have to be in the same order as the :db/ids? Would it be enough to pull the schema datoms to the front of the export file without having to change the :db/id numbers?
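The reordering idea in this question can be illustrated with a small sketch. This is not Datahike's export format or API (Datahike itself is Clojure); it is a hypothetical Python model of an export file as a list of (entity-id, attribute, value) datoms, showing a stable partition that moves schema datoms to the front while leaving every :db/id number untouched.

```python
# Illustrative sketch only, not the Datahike export format.
# Each datom is modeled as a tuple (e, a, v); we treat any attribute
# in the ":db/" namespace (e.g. :db/ident) as a schema datom.

datoms = [
    (1, ":db/ident", ":user/name"),          # original schema, ids 1..130
    (131, ":user/name", "alice"),            # user data starts after the schema
    (2400, ":db/ident", ":user/email"),      # schema extension added later
    (132, ":user/email", "alice@example.com"),  # data referencing the new attribute
]

def is_schema_datom(datom):
    # A datom defines schema if its attribute lives in the :db namespace.
    return datom[1].startswith(":db/")

# Stable partition: schema datoms first, data datoms second,
# relative order within each group preserved, ids unchanged.
reordered = sorted(datoms, key=lambda d: not is_schema_datom(d))

for datom in reordered:
    print(datom)
```

If an importer only requires that an attribute's schema datom appear before any data datom that uses it (rather than strictly ascending :db/ids), this kind of reordering would make the export loadable again without renumbering, which is exactly what the question is asking.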