datascript

Immutable database and Datalog query engine for Clojure, ClojureScript and JS
Filipe Silva 2019-11-11T20:39:49.023400Z

heya, sorry if this is a silly question (I am not very familiar with datascript), but are there good examples of how to use datascript and firebase together? I can't find any promising google results for it so I imagine it's likely an unadvisable thing to do

Filipe Silva 2019-11-12T09:18:35.027300Z

awesome, thank you

Filipe Silva 2019-11-12T09:18:47.027500Z

I see how the 10mb limit is a problem though...

Filipe Silva 2019-11-12T09:22:37.027700Z

firestore has a 1mb document limit even, so it wouldn't be better for that setup

Filipe Silva 2019-11-12T09:23:02.027900Z

did you consider/try doing something more granular than a full db serialization?

Filipe Silva 2019-11-12T09:23:21.028100Z

like a conversion layer into firebase documents

jbrown 2019-11-12T19:35:10.034700Z

We've considered doing something like making each entity a key in firebase and caching the entities on the client so we only have to load the entities that have changed, but we would still have to deserialize the entities from local storage and transact them into a db, which I haven't tested but I think takes longer than deserializing a whole db and calling conn-from-db. Maybe this approach would be faster on slow connections, but I'm not sure. In other words, I haven't figured out an architecture that would 100% work better than storing the whole db. And if the goal is just to have a db larger than 10mb, we can use the firebase storage api to store the db, since we don't actually need the functionality of realtime database for the datascript db store.
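
(A minimal sketch of the whole-db round trip described above, in ClojureScript. The namespace, function names, and the serialization format here, plain [e a v tx] vectors in an EDN map, are made up for illustration and assume all attribute values are EDN-readable; datascript-transit or the #datascript/DB reader tag are other common ways to serialize a db, but the conn-from-db call at the end is the same either way.)

```
(ns example.snapshot
  (:require [datascript.core :as d]
            [cljs.reader :as reader]))

(defn db->str
  "Serialize a db value as an EDN string of plain [e a v tx] vectors."
  [db]
  (pr-str {:schema (:schema db)
           :datoms (mapv (fn [dm] [(:e dm) (:a dm) (:v dm) (:tx dm)])
                         (d/datoms db :eavt))}))

(defn str->conn
  "Rebuild a connection from a serialized snapshot via conn-from-db."
  [s]
  (let [{:keys [schema datoms]} (reader/read-string s)]
    (d/conn-from-db
     (d/init-db (map (fn [[e a v tx]] (d/datom e a v tx)) datoms)
                schema))))
```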

Filipe Silva 2019-11-12T19:36:15.034900Z

gotcha

Filipe Silva 2019-11-12T19:36:35.035100Z

thanks for being so responsive and providing such a great sample

jbrown 2019-11-12T19:36:56.035300Z

No problem, I've been wanting to write this up for awhile now 🙂

Filipe Silva 2019-11-12T19:37:02.035500Z

I'll see if I can get my head around it and maybe use something similar

Filipe Silva 2019-11-12T19:37:46.035700Z

btw I find roamresearch quite interesting! I've had something in the same problem space in my head for a while

Filipe Silva 2019-11-12T19:38:05.035900Z

so it's really cool to see how roamresearch and other tools are trying to improve note taking

Filipe Silva 2019-11-12T19:39:23.036100Z

worth mentioning that the Pixel phones (maybe others, not sure) have two cool things that help with the data input: you can talk to the recorder app and it'll transcribe, and you can take pictures of notes and it'll try to OCR the text, handwriting included

Filipe Silva 2019-11-12T19:39:30.036300Z

both of these work offline

jbrown 2019-11-12T19:50:04.036500Z

Glad to see other people interested in this space as well! That's pretty cool, we've thought of integrating with http://otter.ai (really great transcribing app), but didn't know phones came with things like that (I have an old galaxy phone haha)

jbrown 2019-11-12T19:50:40.036700Z

I'm curious what you're looking to use datascript for

Filipe Silva 2019-11-12T20:08:01.037100Z

a similar thing to what you guys have, but instead of focusing on the research field I was focusing on software engineers who wanted to do deep work

Filipe Silva 2019-11-12T20:09:14.037300Z

was looking at supplementing deep work with graph-based note taking with full-text search

Filipe Silva 2019-11-12T20:09:48.037500Z

a big driver was to have a persistent context between work sessions

Filipe Silva 2019-11-12T20:10:11.037700Z

and I wanted something firebase/datascript based because of the implicit transaction history

Filipe Silva 2019-11-12T20:11:25.037900Z

I think allowing users to replay how the last work session actually happened in the system would help get people back in the right context and mindset

jbrown 2019-11-13T05:18:34.041Z

It's awesome to meet someone with such a cross section of interests as ours 🙂. We originally wanted to use the transaction history to replay actions on a database (both to replicate actions that led to a bug and to show the user), but it hasn't really worked in practice with our setup because the db/ids can get out of sync between clients (we mostly have this solved now by using a unique identifier for every entity instead of the db/id, but now the logs are all messed up and can't be replayed)

❤️ 1
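
(A sketch of the stable-id-per-entity idea mentioned above, assuming a made-up :entity/uid attribute and :note/title; this is not Roam's actual schema. With a :db.unique/identity attribute, clients address entities by lookup ref, so locally diverging db/ids stop mattering.)

```
;; assumes (:require [datascript.core :as d])
(def schema {:entity/uid {:db/unique :db.unique/identity}})

(def conn (d/create-conn schema))

;; every new entity gets a globally unique, sortable id at creation time
(d/transact! conn [{:entity/uid (d/squuid)
                    :note/title "First note"}])

;; later transactions (possibly from another client) address the entity
;; by lookup ref instead of by its local db/id
(defn rename-note! [conn uid title]
  (d/transact! conn [{:db/id [:entity/uid uid]
                      :note/title title}]))
```
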
Filipe Silva 2019-11-13T10:46:58.041300Z

yeah log changes often break replays in a lot of systems 😞

Filipe Silva 2019-11-11T20:43:24.024Z

I see https://github.com/mrmcc3/datascript-firebase-sync but, as the commit message says, it mostly looks like a code dump, so I'm not sure how well done it is

2019-11-11T20:58:18.025Z

@filipematossilva ^ I believe @conaw did something like this.

jbrown 2019-11-11T21:31:38.025100Z

Hey I work with @conaw, we use datascript and firebase together in production at http://roamresearch.com. It certainly has its limitations, but it works well for a small/medium size datascript database. We store the datascript database serialized in firebase, load that into memory and listen/push transactions to a log in firebase so it will sync between connected clients. Can post a modified version of our code later tonight.
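
(A very rough sketch of the listen/push loop described above, using the Firebase JS SDK through interop. The "tx-log" path, the ::remote marker, and fb-db, a Realtime Database instance passed in from elsewhere, are all made up for illustration; this is not the code jbrown posted. A real setup would also tag log entries with a client id so a client can skip entries it pushed itself, and it runs straight into the db/id drift problem discussed elsewhere in the thread.)

```
(ns example.sync
  (:require [datascript.core :as d]
            [cljs.reader :as reader]))

(defn datom->tx [dm]
  [(if (:added dm) :db/add :db/retract) (:e dm) (:a dm) (:v dm)])

(defn start-sync! [conn fb-db]
  (let [log-ref (.ref fb-db "tx-log")]
    ;; push local transactions to the shared log
    (d/listen! conn ::push
               (fn [{:keys [tx-data tx-meta]}]
                 (when-not (::remote tx-meta)
                   (.push log-ref (pr-str (mapv datom->tx tx-data))))))
    ;; apply transactions arriving from other clients
    (.on log-ref "child_added"
         (fn [snap]
           (d/transact! conn (reader/read-string (.val snap)) {::remote true})))))
```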

Filipe Silva 2019-11-11T21:37:03.025300Z

Oh wow that's a lot more than I was expecting when I asked!

Filipe Silva 2019-11-11T21:37:24.025500Z

If you can that'd be great! Thank you!

Filipe Silva 2019-11-11T21:38:00.025700Z

Do you use real-time database or firestore btw?

Filipe Silva 2019-11-11T21:38:52.026500Z

Jbrown just followed up on that, super helpful!

jbrown 2019-11-11T22:20:47.026600Z

real-time database. The values are limited to 10mb though, so we are going to switch to using storage for the database snapshot and realtime database for the tx log. It's been really nice to prototype our app with this setup, but we are running into problems with the databases getting too big (taking too long to download and read-string), so we'll have to switch to datomic/datahike/crux soon. Also, if you are planning on having multiple clients editing the same datascript database, you'll have to transact a uid with each entity because transactions aren't guaranteed to come in order and db/ids can get out of sync between the clients.
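
(A sketch of the snapshot-in-Storage split mentioned above, reusing db->str / str->conn from the earlier sketch; `storage` is assumed to be a firebase Storage instance and "snapshots/db.edn" is a made-up path, not Roam's setup.)

```
;; assumes db->str / str->conn from the snapshot sketch above
(defn save-snapshot! [storage conn]
  (.putString (.ref storage "snapshots/db.edn")
              (db->str @conn)))

(defn load-snapshot!
  "Fetch the snapshot and call cb with a fresh conn built from it."
  [storage cb]
  (-> (.getDownloadURL (.ref storage "snapshots/db.edn"))
      (.then (fn [url] (js/fetch url)))
      (.then (fn [resp] (.text resp)))
      (.then (fn [s] (cb (str->conn s))))))
```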