datahike

https://datahike.io/, Join the conversation at https://discord.com/invite/kEBzMvb, history for this channel is available at https://clojurians.zulipchat.com/#narrow/stream/180378-slack-archive/topic/datahike
levitanong 2021-02-06T14:48:11.023200Z

Hi all! I'm getting a strange error in my production database of my toy app:

Error during transaction java.lang.IllegalArgumentException: No implementation of method: :-resolve-chan of protocol: #'hitchhiker.tree.node/IAddress found for class: clojure.lang.PersistentArrayMap
I've done a datahike.migrate/export-db and I can't see any particularly strange data in it. 😮 This only started happening after a couple of days of usage, so I'm guessing it has something to do with a corrupted data entry.
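For context, the export mentioned here (and the corresponding import) looks roughly like this in datahike 0.3.x. This is a sketch; the exact argument shapes of `export-db`/`import-db` have changed between versions, so check them against the release you run:

```clojure
(require '[datahike.api :as d]
         '[datahike.migrate :as m])

;; Connect to an existing database (file backend used here as a
;; stand-in for the postgres store discussed in this thread).
(def conn (d/connect {:store {:backend :file :path "/tmp/example-db"}}))

;; Dump all datoms of the current database value to an EDN file.
;; Assumed signature: (export-db db path) — verify for your version.
(m/export-db @conn "/tmp/export.edn")

;; Later, replay the exported datoms into a fresh database.
(m/import-db conn "/tmp/export.edn")
```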

timo 2021-02-07T08:41:53.025100Z

Hi @levitanong. Thanks for reporting! May I ask which version and what backend you are using?

levitanong 2021-02-07T08:58:41.025300Z

@timok hello! I'm on 0.3.3, on the postgres backend. Not using the one that's on GitHub, though. I had to fork it so that it uses the new datahike config format or some such. (That might also be something for you guys to take a look at, btw.)
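The "new datahike config format" referenced here is the nested map introduced around datahike 0.3.0, replacing the older URI-style configuration. A postgres store configured that way looks roughly like this; the exact keys are an assumption based on that release and may differ for your backend version:

```clojure
;; Old (pre-0.3.0) configs were flat URIs, e.g.
;; "datahike:pg://user:password@localhost:5432/mydb".
;; The newer format is a nested map (sketch; verify the keys
;; against your datahike-postgres version):
{:store {:backend  :pg
         :host     "localhost"
         :port     5432
         :username "user"
         :password "password"
         :path     "/mydb"}}
```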

timo 2021-02-07T09:03:57.025500Z

Ok, that's a bummer. We are running integration tests against our datahike-postgres-backend.

timo 2021-02-07T09:06:52.025700Z

I did not find a fork of datahike-postgres under your name, so it is hard for me to figure out what the problem might be right now. Can you show your backend? We will soon have to ask everyone using the postgres backend to switch to the jdbc backend, because that is the one we will continue developing. Maybe you want to switch already?

levitanong 2021-02-07T09:11:17.025900Z

Ah, I got lazy, so I just copied the entire core.clj file. 😆 this is the jdbc backend, yes? https://github.com/replikativ/datahike-jdbc
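For comparison, a datahike-jdbc store config looks roughly like the sketch below. The key names (`:dbtype`, `:dbname`, etc.) follow jdbc conventions and are an assumption here; check them against the datahike-jdbc README for the version you use:

```clojure
;; datahike-jdbc carries jdbc-style connection keys in the store
;; map (sketch only; verify against the datahike-jdbc README):
{:store {:backend  :jdbc
         :dbtype   "postgresql"
         :host     "localhost"
         :user     "user"
         :password "password"
         :dbname   "mydb"}}
```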

timo 2021-02-07T09:11:38.026200Z

correct

levitanong 2021-02-07T09:12:16.026400Z

I'll give it a try and see if the issue persists 🙂

timo 2021-02-07T09:12:54.026600Z

I am looking forward to hearing about your experience. I have not tried it myself yet.

timo 2021-02-07T09:14:06.026800Z

It might be necessary to do the import/export for that

levitanong 2021-02-07T11:10:36.029100Z

Haha, it appears I'll have to do the import/export. I just tried it and the data seems to have disappeared. Or at least, it is inaccessible. 😆 Good thing I had already made an export.

timo 2021-02-07T11:53:13.029700Z

ok, should I have warned you explicitly to make an export before? Is it now working with jdbc?

levitanong 2021-02-07T12:51:36.030300Z

@timok I think the warning would've been nice, but I do remember reading something about exporting when upgrading versions. So that's mostly my fault for not keeping that top of mind 😅 But also yay me for exporting beforehand anyway, so no harm done! As for whether or not it's working: I suspect what actually fixed the issue is that switching over to the postgres-jdbc lib reset the db altogether, so whatever problematic data was in there got removed. Whatever corruption there was likely lived in the db itself, but was somehow being ignored when exporting, which is why the exported data was fine.

levitanong 2021-02-07T12:53:05.030500Z

I'll update this thread if the problem arises again in the coming days!

timo 2021-02-07T12:56:47.030700Z

That's great! I am glad it worked out (for now). Thanks for telling me.

1🙏
levitanong 2021-02-07T13:59:10.030900Z

Thanks for the help! 😄

levitanong 2021-02-15T02:56:19.040200Z

Oh dear, it happened again 😞 This happened after I restarted my server following an entity reference error: Error during transaction clojure.lang.ExceptionInfo: Nothing found for entity id REDACTED {:error :entity-id/missing, :entity-id REDACTED} I'm not sure if the two are connected or coincidental.

Error during transaction java.lang.IllegalArgumentException: No implementation of method: :-resolve-chan of protocol: #'hitchhiker.tree.node/IAddress found for class: clojure.lang.PersistentArrayMap

timo 2021-02-15T08:46:41.040400Z

@levitanong Would you open an issue on github and describe what you did there? We are happy to try reproducing it and fixing any bug that comes up.

levitanong 2021-02-15T08:50:33.040600Z

Will do!

levitanong 2021-02-15T12:05:17.040800Z

Issue filed here, with an account of the first instance included. https://github.com/replikativ/datahike/issues/285

1👍
Jean 2021-02-06T17:18:25.024800Z

Hi guys! I'm testing datahike with datahike-jdbc on postgres. I found transact a bit slow. Is that normal?

timo 2021-02-07T10:32:54.027Z

Hi @jean.boudet11, slower than what?

Jean 2021-02-07T11:24:19.029300Z

Hi @timok, I found transact a bit slow (5s) for transacting one item. I tested datahike with file storage and got the same time for one item.

timo 2021-02-07T11:52:13.029500Z

I assume you are saying that you are comparing the file-store with a jdbc-store. I think the file-store is faster than jdbc most of the time, but I would have to forward this to colleagues here on the channel. @whilo @konrad.kuehne @j.massa

Jean 2021-02-07T12:00:11.029900Z

My test with file storage took the same time as jdbc (for one item in transact).

timo 2021-02-08T15:18:30.036400Z

Hi @jean.boudet11. I saw you opened an issue. May I ask you to provide some more details on your problem? We need some steps to reproduce it. Usually we don't see these kinds of lags when transacting. Maybe you can run a benchmark: https://cljdoc.org/d/io.replikativ/datahike/0.3.3/doc/contributing-to-datahike#starting-the-benchmark

Jean 2021-02-08T15:21:18.036700Z

Hi @timok, yeah, I created an issue to keep a record of the problem. Ok, I'm going to run the benchmark on my computer.

timo 2021-02-08T15:22:59.036900Z

run it with TIMBRE_LEVEL=':info' clj -M:benchmark

timo 2021-02-08T15:23:07.037100Z

better to not have all the logs

timo 2021-02-08T15:23:15.037300Z

will adjust that in the docs

timo 2021-02-08T15:25:00.037500Z

https://gist.github.com/TimoKramer/bfd16a506e4950bda7b742b362e9e6f2 that's my benchmark from a few seconds ago, on my machine while it was running a lot of other stuff

Jean 2021-02-08T16:22:02.037700Z

Humm... I have this error when I run the benchmark