The example function `inc-attr` at the top of this thread is actually a transactor function taken from the Datomic documentation, and according to that same documentation, they must be pure. So perhaps `inc-attr` is pure (in this context) because transactor functions run "inside" of Datomic?
I think “pure” is used loosely here to mean “I may execute this function multiple times while holding a db lock at the head of a queue of other transactions, and you are ok with the consequences of whatever you do in there”
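For context, here is a paraphrased sketch of the kind of transaction function being discussed; the Datomic docs' `inc-attr` atomically increments an attribute, though the exact version at the top of the thread may differ:

```clojure
;; Paraphrase of the Datomic documentation's inc-attr transaction function;
;; the thread's version may differ slightly.
(require '[datomic.api :as d])

(defn inc-attr
  "Transaction function: add `delta` to the value of attribute `a` on entity `e`.
  Runs on the transactor against a consistent db value, which is why it
  must be free of external side effects."
  [db e a delta]
  (let [orig (get (d/entity db e) a 0)]
    [[:db/add e a (+ orig delta)]]))
```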
Hi all, I’m hoping that this is not a FAQ that I missed somehow, but this is what I’d like to accomplish: for my unit tests, I’d like to be able to pass a simple Clojure hashmap into Datomic query functions, instead of a real Datomic connection, so that I can test my queries without actually round-tripping to a database. Is there something out there to do this? Or am I on a wrong track here?
@val_waeselynck Yeah we’re already experimenting with that, and maybe it’s good enough. But if those tests take 1 second each because of setup/teardown of Datomic databases, that’s too long for me. For unit tests, I prefer to keep things as lean as possible.
@stefan.van.den.oord forking solves that problem
Put a populated db value in some Var, and then create a Datomock connection from it in each test, there's virtually no overhead to this
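A minimal sketch of that setup, assuming the Peer library and Datomock's `datomock.core/mock-conn`; `my-schema` and `my-fixtures` are placeholders for your own tx-data:

```clojure
;; Sketch assuming vvvvalvalval/datomock and the Datomic Peer library.
(require '[datomic.api :as d]
         '[datomock.core :as dm])

;; One-time, expensive setup: a populated db value held in a Var.
(defonce base-db
  (let [uri  (str "datomic:mem://" (d/squuid))
        _    (d/create-database uri)
        conn (d/connect uri)]
    @(d/transact conn my-schema)    ;; my-schema / my-fixtures: placeholders
    @(d/transact conn my-fixtures)
    (d/db conn)))

;; Per test: a cheap, isolated mock connection forked from the shared db value.
(defn fresh-conn []
  (dm/mock-conn base-db))
```

Each test gets its own forked connection, so writes in one test never leak into another, and no per-test database setup/teardown is needed.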
Sounds good, will definitely try, thanks! 🙂
You can actually pass in a vector of datom tuples as your DB, and the query can unify over them. But that's probably not what you're looking for. Why not just create an in-memory Datomic connection? Something like:
(str "datomic:mem://" (gensym))
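Spelled out a bit, a fresh in-memory connection per test could look like the following (Peer library); the datom-tuples variant mentioned above is also shown, with made-up `:person/name` data:

```clojure
;; Minimal sketch: a throwaway in-memory Datomic database (Peer library).
(require '[datomic.api :as d])

(defn scratch-conn []
  (let [uri (str "datomic:mem://" (gensym))]
    (d/create-database uri)
    (d/connect uri)))

;; Alternatively, queries can unify over a plain collection of datom tuples
;; (no connection or schema needed) -- hypothetical example data:
(d/q '[:find ?name
       :where [?e :person/name ?name]]
     [[1 :person/name "Alice"]
      [2 :person/name "Bob"]])
```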
You may also be interested in a tool like https://github.com/vvvvalvalval/datomock for setting up and reusing more complex test data scenarios. I suppose none of this is relevant if you're using Datomic Cloud, though; that seems to be a primary driver for releasing https://docs.datomic.com/cloud/dev-local.html
Note that passing datom tuples only works with the peer library, not the client library, IIRC.
@pithyless We’re using on-prem, so… I’ve indeed found datomock, which is a nice concept, but then you still need to specify both a schema and the test data; I was hoping for something even simpler 😉
agreed!
thanks for sharing your thoughts, it's always insightful to pick other peoples' brains.
I'm running Datomic on-prem on DynamoDB. Can I generate an "AWS Event"¹ on every transaction? ¹ An AWS Event is something I can plug into Lambda/SNS/SQS
We use the transaction report queue to push data into a kinesis stream, then run lambdas on those events
Triggering side effects on DynamoDB writes is likely not what you want, since Datomic writes full blocks to storage (not one datom at a time)
@bhurlow when running on multiple/scaled instances, how do you manage the tx-report-queue?
We run a single, global process which just subscribes to the queue and pushes events to kinesis
other Datomic traffic is scaled horizontally but doesn't invoke the queue
Kinesis -> Lambda integration works reasonably well
one bonus is you can do one queue to many lambda consumers
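A rough sketch of that single global consumer, assuming the Peer library; `push-to-kinesis!` is a placeholder for the actual Kinesis put call, and error handling is omitted:

```clojure
;; Sketch of a global tx-report-queue forwarder (Peer library).
;; push-to-kinesis! is a hypothetical function; error handling omitted.
(require '[datomic.api :as d])

(defn start-tx-forwarder! [conn push-to-kinesis!]
  (let [queue (d/tx-report-queue conn)]  ;; a BlockingQueue of tx reports
    (doto (Thread.
           (fn []
             (loop []
               ;; .take blocks until the next transaction report arrives;
               ;; each report is a map with :db-before, :db-after, :tx-data
               (let [{:keys [tx-data]} (.take queue)]
                 (push-to-kinesis! tx-data))
               (recur))))
      (.setDaemon true)
      (.start))))
```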
@bhurlow can you share which instance size you use for this report-queue?
Subscribing to the tx-report-queue and pushing events into Lambda is not a very intensive process
t3.large would be fine imo
Thanks @bhurlow
Hmm, I don't suppose you know if something like the transaction report queue is available on Datomic Cloud, do you? I have often needed exactly what souenzzo mentioned, but instead settled for querying/sipping the transaction log on a timer
I'm not sure about cloud, have only used the above in on-prem
I'd assume it's inside the system but possibly not exposed
Clients don't have a txReportQueue indeed. Polling the Log is usually fine IMO (and having a machine dedicated solely to pushing events seems wasteful, and it's also fragile as it creates a SPoF).
I work with Datomic Cloud and Datomic on-prem (on different products). IMHO, Datomic on-prem is still way easier and more flexible than Cloud. Cloud has too many limitations: you can't edit IAM, for example, and if you do edit it, you break any future updates.
Thanks guys
One interesting construction might be using AWS Step Functions + Lambda for polling the Datomic Log into Kinesis, using the Step Functions state to keep track of where you are in consuming the Log
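A hedged sketch of the polling step with the client API, where the resume point (`t`) would live in the Step Functions state rather than in the process:

```clojure
;; Sketch: poll the Datomic Log via the client API, resuming from a stored t.
(require '[datomic.client.api :as d])

(defn poll-log
  "Poll the Log for transactions at or after `start-t`.
  Returns [txs next-t], where next-t is the point to resume from
  on the next invocation (e.g. persisted in Step Functions state)."
  [conn start-t]
  (let [txs (vec (d/tx-range conn {:start start-t}))]
    [txs
     (if (seq txs)
       (inc (:t (peek txs)))  ;; each entry is a map with :t and :data
       start-t)]))
```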
@stefan.van.den.oord I find this use case strange. Wouldn't you have more confidence in your tests if they ran in an environment more similar to production?
I personally find it hugely advantageous to have a full-featured implementation of Datomic in-memory, I would recommend embracing it
Looking to try and figure out how to handle sessions/authentication with ions, is there a best practice for that in ions? https://forum.datomic.com/t/best-way-to-handle-session-in-ions/1630
Just confirming: it's okay to pass a db created with `datomic.client.api/db` to `datomic.client.api.async/q`, correct?