Hi there, I am using Datomic Cloud. I would like to compile the code in my CI pipeline before deploying it, to save time and money. Can anyone tell me how Datomic Cloud invokes the compiler, and if it's reproducible?
Hi @hadilsabbagh18. Are you writing ion code that runs inside a cluster node?
If you compile your code before deploying it to an ion, it will load into the cluster node faster, but I am not sure that will save you a visible amount of time or money.
@stuarthalloway I have deployed code that has had Java compiler errors, which costs time and money. I am just trying to pre-compile the code to make sure that it will pass.
Do you mean Clojure compiler errors? The cluster node does not compile Java for you.
Yes, I mean Clojure compiler errors...
You have some options:
If you are already going to the trouble of running the compiler locally, then you can make a jar with the compiled code instead of with source, and deploy that. Then there is no compilation on the cluster node, and no possibility of (that class of) error.
In that case the cluster node will also start faster after an ion deploy, although the difference may not matter much.
How would I indicate to the Ion deployment that I have already compiled my code into a jar? I can't figure out that part...
Good news: you don't have to.
Jars are jars are jars
ok. So I just declare
:gen-class in my code and compile it?
Have your ion depend on your compiled code as a maven dep.
You definitely do not need :gen-class.
This leads to a two-project structure, where your code is in one project, and your ion has deps on that code and probably just ion-config.edn.
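A minimal sketch of what the ion project's deps.edn could look like in that two-project layout (the coordinates and versions here are made-up placeholders, not real artifacts):

```clojure
;; ion project deps.edn -- hypothetical library name and versions
{:paths ["src" "resources"]          ; resources holds ion-config.edn
 :deps  {com.example/my-compiled-lib {:mvn/version "0.1.0"}   ; AOT-compiled jar from the code project
         com.datomic/ion            {:mvn/version "0.9.50"}}} ; version shown is illustrative
```

The ion project itself then contains little more than ion-config.edn and thin entry-point namespaces.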
Aha! Interesting idea!
I do this all the time. As soon as code is nontrivial I want to use it from more than one ion.
To get the compilation benefit, you still need to do whatever maven/leiningen/boot magic you need to compile all your Clojure code in the code project.
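As one hedged example of that "magic", assuming the code project uses Leiningen (names are placeholders):

```clojure
;; code project's project.clj -- hypothetical group/artifact names
(defproject com.example/my-compiled-lib "0.1.0"
  :dependencies [[org.clojure/clojure "1.10.1"]]
  ;; AOT-compile every namespace so the jar ships .class files, not .clj source
  :aot :all)
```

`lein install` (or `lein deploy` to a shared repository) then makes the compiled jar available for the ion project to consume as a maven dep.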
Can I use maven with tools.deps.alpha?
For some definitions of "use", yes 🙂
This space is evolving https://github.com/clojure/tools.deps.alpha/wiki/Tools#packaging
is the dev-local client compatible with the on-prem client? (ignoring the features that on-prem supports that cloud doesn't)
What's the idiomatic way to model something like a link table but against multiple other entities? in old datalog/prolog you'd do something like
attrName(entity1, other1, other2, other3). assuming entity1, other1, etc are either scalar values or entity ids.
but in datomic's datalog, if vecs are allowed as a value in a datom, you might be able to do something like this
or if not, you could... maybe this is how you'd do it?
[entity1 :attrName [other1, other2, other3]]
[entity1 :attrName1 other1] [entity1 :attrName2 other2] [entity1 :attrName3 other3]
attrName is meant to be something that must join entity1 with 3 other entities, rather than it representing an unordered collection of linked entities, like the
:movie/cast attr in http://learndatalogtoday.org
couldn't the separate :attrName* attrs express the same relation?
yeah. maybe i should have come up with a better concrete example for this...
boughtHouse(buyer, seller, house, notary). maybe? i don't actually know how houses are sold haha
why is this different from having separate ref attributes? each assertion has a different meaning
maybe the orientation of what an entity is can be reversed?
[house-sale-eid :housesale/buyer buyer-eid] [house-sale-eid :housesale/seller seller-eid] [house-sale-eid :housesale/house house-eid] [house-sale-eid :housesale/notary notary-eid]
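To make that concrete, a sketch of the schema those datoms assume (attribute names taken from the example above; cardinalities are an assumption):

```clojure
;; Datomic schema for the house-sale example -- the sale itself is the entity,
;; and each participant is a ref attribute on it
[{:db/ident       :housesale/buyer
  :db/valueType   :db.type/ref
  :db/cardinality :db.cardinality/one}
 {:db/ident       :housesale/seller
  :db/valueType   :db.type/ref
  :db/cardinality :db.cardinality/one}
 {:db/ident       :housesale/house
  :db/valueType   :db.type/ref
  :db/cardinality :db.cardinality/one}
 {:db/ident       :housesale/notary
  :db/valueType   :db.type/ref
  :db/cardinality :db.cardinality/one}]
```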
^^ this is what I would expect
i don't know if it's different - i'm totally new to this and only have a background in dimensional modelling, datavault, and datalog
I think you’re getting at something though. Is it maybe a constraint you’re trying to enforce?
i definitely know that i want some constraints to be enforced, but i don't know what the term means in datomic's context yet 😬
yeah, i guess orienting the entity around the event, and not the buyer or whatever the 'primary subject' of the event is, is how you'd avoid having more than one instance of an entity for a given field
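With event-oriented attributes like the :housesale/* ones above, a query joining all four participants of one sale might look like this (a sketch; it assumes those attributes exist in the schema):

```clojure
;; who bought which house from whom, and which notary handled it
[:find ?buyer ?seller ?house ?notary
 :where
 [?sale :housesale/buyer  ?buyer]
 [?sale :housesale/seller ?seller]
 [?sale :housesale/house  ?house]
 [?sale :housesale/notary ?notary]]
```

Because all four clauses share ?sale, the entities are joined through the single sale event, which plays the role of the n-ary prolog fact attrName(entity1, other1, other2, other3) from earlier in the thread.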