Ask questions on the official Q&A site at https://ask.datomic.com!
joshkh 2020-09-17T07:01:32.481900Z

the example function `inc-attr` at the top of this thread is actually a transaction function taken from the Datomic documentation. and according to the same documentation, transaction functions must be pure. so perhaps `inc-attr` is pure (in this context) because transaction functions run "inside" Datomic?

favila 2020-09-17T07:25:36.482100Z

I think “pure” is used loosely here to mean “I may execute this function multiple times while holding a db lock at the head of a queue of other transactions, and you are ok with the consequences of whatever you do in there”
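(for reference, a minimal sketch of what the `inc-attr` transaction function from the Datomic docs does — it reads the current value inside the transaction and asserts the incremented value; exact doc wording may differ:)

```clojure
(require '[datomic.api :as d])

;; Transaction function: increments a cardinality-one attribute,
;; treating a missing value as 0. It runs serially inside the
;; transactor against the in-transaction db value, which is why
;; this read-then-write is safe from races.
(defn inc-attr
  [db entity attr amount]
  (let [m (d/pull db [:db/id attr] entity)]
    [[:db/add (:db/id m) attr (+ (get m attr 0) amount)]]))
```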

Stefan 2020-09-17T09:04:30.484200Z

Hi all, I’m hoping that this is not a FAQ that I missed somehow, but this is what I’d like to accomplish: for my unit tests, I’d like to be able to pass a simple Clojure hashmap into Datomic query functions, instead of a real Datomic connection, so that I can test my queries without actually round-tripping to a database. Is there something out there to do this? Or am I on a wrong track here?

Stefan 2020-09-18T07:36:50.000600Z

@val_waeselynck Yeah we’re already experimenting with that, and maybe it’s good enough. But if those tests take 1 second each because of setup/teardown of Datomic databases, that’s too long for me. For unit tests, I prefer to keep things as lean as possible.

val_waeselynck 2020-09-18T13:11:39.003300Z

@stefan.van.den.oord forking solves that problem

val_waeselynck 2020-09-18T13:13:11.003500Z

Put a populated db value in some Var, and then create a Datomock connection from it in each test, there's virtually no overhead to this
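something like this sketch (`schema-tx` / `fixture-tx` stand in for your own data; `dm/mock-conn` is Datomock's function for forking a connection off a db value):

```clojure
(require '[datomic.api :as d]
         '[datomock.core :as dm])

;; Build the populated db value once, at namespace load time.
(defonce base-db
  (let [uri (str "datomic:mem://" (gensym))]
    (d/create-database uri)
    (let [conn (d/connect uri)]
      @(d/transact conn schema-tx)   ; your schema
      @(d/transact conn fixture-tx)  ; your test data
      (d/db conn))))

;; Each test gets a fresh, isolated connection forked from base-db;
;; transactions against it never touch the shared value.
(defn fresh-conn []
  (dm/mock-conn base-db))
```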

Stefan 2020-09-18T13:23:33.003700Z

Sounds good, will definitely try, thanks! 🙂

pithyless 2020-09-17T09:36:24.484300Z

You can actually pass in a vector of datom tuples as your DB, and the query will unify against them. But that's probably not what you're looking for. Why not just create an in-memory datomic connection? Something like:

(str "datomic:mem://" (gensym))
You may also be interested in a tool like https://github.com/vvvvalvalval/datomock for setting up and reusing more complex test data scenarios
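for example, with the peer library `q` will unify `:where` clauses against any collection of tuples, so a literal vector can stand in for a db in simple tests (the tuples here are made-up data):

```clojure
(require '[datomic.api :as d])

(d/q '[:find ?name
       :where [?e :person/name ?name]]
     [[1 :person/name "Alice"]
      [2 :person/name "Bob"]])
;; => #{["Alice"] ["Bob"]}
```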

pithyless 2020-09-17T09:38:26.484700Z

I suppose none of this is relevant if you're using Datomic Cloud. Seems to be a primary driver for releasing https://docs.datomic.com/cloud/dev-local.html


note that passing datom tuples only works for the peer library, not for the client library, iirc.

Stefan 2020-09-17T10:10:21.485100Z

@pithyless We’re using on-prem, so… I’ve indeed found datomock, which is a nice concept, but then you still need to specify both a schema and the test data; I was hoping for something even simpler 😉

joshkh 2020-09-17T11:32:47.485300Z


joshkh 2020-09-17T11:33:48.485500Z

thanks for sharing your thoughts, it's always insightful to pick other people's brains.

souenzzo 2020-09-17T15:08:55.487700Z

I'm running datomic on-prem on DynamoDB. Can I generate an "AWS Event"¹ on every transaction? ¹ An AWS Event is something that I can plug into Lambda/SNS/SQS


We use the transaction report queue to push data into a kinesis stream, then run lambdas on those events


Triggering side effects on DynamoDB writes is likely not what you want, since Datomic writes full blocks to storage (not one datom at a time)
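(the tx-report-queue side of that pipeline is roughly the following sketch — peer library; `put-to-kinesis!` is a hypothetical function standing in for the AWS SDK call:)

```clojure
(require '[datomic.api :as d])

;; d/tx-report-queue returns a java.util.concurrent.BlockingQueue of
;; transaction reports; .take blocks until the next transaction lands.
(defn pump-tx-reports! [conn put-to-kinesis!]
  (let [queue (d/tx-report-queue conn)]
    (loop []
      (let [{:keys [db-after tx-data]} (.take queue)]
        ;; ship this transaction's datoms to the stream,
        ;; tagged with the basis t for ordering/resumption
        (put-to-kinesis! {:t       (d/basis-t db-after)
                          :tx-data (map seq tx-data)})
        (recur)))))
```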

souenzzo 2020-09-21T15:12:30.040700Z

@bhurlow when running on multiple/scaled instances, how do you manage the tx-report-queue?


We run a single, global process which just subscribes to the queue and pushes events to kinesis


other Datomic traffic is scaled horizontally but doesn't invoke the queue


Kinesis -> Lambda integration works reasonably well


one bonus is you can do one queue to many lambda consumers

souenzzo 2020-09-21T15:14:36.041700Z

@bhurlow can you share which instance size you use for this report-queue?


subscribing to the tx report queue and putting into lambda is not a very intensive process


t3.large would be fine imo

souenzzo 2020-09-21T15:19:04.042700Z

thanks @bhurlow

joshkh 2020-09-21T15:28:31.042900Z

hmm, i don't suppose you know if something like the "transaction report queue" is available on Datomic Cloud, do you? i have often needed exactly what souenzzo mentioned, but instead settled for querying/polling the transaction log on a timer


I'm not sure about cloud, have only used the above in on-prem


I'd assume it's inside the system but possibly not exposed

val_waeselynck 2020-09-21T15:37:31.043500Z

Clients don't have a txReportQueue indeed. Polling the Log is usually fine IMO (and having a machine dedicated solely to pushing events seems wasteful, and it's also fragile as it creates a SPoF).

souenzzo 2020-09-21T15:38:55.043700Z

I work with datomic cloud and datomic on-prem (on different products). IMHO, datomic on-prem is still way easier and more flexible than cloud. Cloud has too many limitations. You can't edit IAM, for example, and if you do, you break any future updates.

joshkh 2020-09-21T15:40:07.044Z

thanks guys

val_waeselynck 2020-09-21T15:42:25.045900Z

One interesting construction might be using AWS Step Functions + Lambda for polling the Datomic Log into Kinesis, using the Step Functions state to keep track of where you are in consuming the Log
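a sketch of the polling step that construction implies (client API; `last-seen-t` would come from the Step Functions state, and `push-to-kinesis!` is a hypothetical stand-in for the Kinesis put):

```clojure
(require '[datomic.client.api :as d])

;; d/tx-range yields maps with :t and :data; starting from
;; (inc last-seen-t) avoids re-emitting the last transaction we saw.
;; Returns the new high-water mark to store back in the state machine.
(defn poll-log-once [conn last-seen-t push-to-kinesis!]
  (reduce (fn [_ {:keys [t data]}]
            (push-to-kinesis! {:t t :tx-data (map seq data)})
            t)
          last-seen-t
          (d/tx-range conn {:start (inc last-seen-t)})))
```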

val_waeselynck 2020-09-17T17:48:00.488500Z

@stefan.van.den.oord I find this use case strange. Wouldn't you have more confidence in your tests if they ran in an environment more similar to production?

val_waeselynck 2020-09-17T17:49:15.488800Z

I personally find it hugely advantageous to have a full-featured implementation of Datomic in-memory, I would recommend embracing it

donyorm 2020-09-17T21:00:55.489400Z

Looking to try and figure out how to handle sessions/authentication with ions, is there a best practice for that in ions? https://forum.datomic.com/t/best-way-to-handle-session-in-ions/1630

kenny 2020-09-17T23:21:44.000500Z

Just confirming, it's okay to pass a db created with datomic.client.api/db to datomic.client.api.async/q, correct?