datomic

Ask questions on the official Q&A site at https://ask.datomic.com!
plexus 2021-05-24T08:14:53.040500Z

I'm having trouble trying to query Datomic Analytics / Presto via JDBC. I have a decimal(38,2) field, and it's causing an exception in the Presto JDBC driver.

(def conn (java.sql.DriverManager/getConnection presto-url "." ""))

(let [stmt (.createStatement conn)]
  (.executeQuery stmt "SELECT credit_amount FROM journal_entry_line"))

;;=>
1. Caused by java.lang.IllegalArgumentException
   ParameterKind is [TYPE] but expected [LONG]

TypeSignatureParameter.java:  110  com.facebook.presto.jdbc.internal.common.type.TypeSignatureParameter/getValue
TypeSignatureParameter.java:  122  com.facebook.presto.jdbc.internal.common.type.TypeSignatureParameter/getLongLiteral
           ColumnInfo.java:  194  com.facebook.presto.jdbc.ColumnInfo/setTypeInfo
      PrestoResultSet.java: 1869  com.facebook.presto.jdbc.PrestoResultSet/getColumnInfo
      PrestoResultSet.java:  123  com.facebook.presto.jdbc.PrestoResultSet/<init>
      PrestoStatement.java:  272  com.facebook.presto.jdbc.PrestoStatement/internalExecute
      PrestoStatement.java:  230  com.facebook.presto.jdbc.PrestoStatement/execute
      PrestoStatement.java:   79  com.facebook.presto.jdbc.PrestoStatement/executeQuery
If I `cast(credit_amount AS varchar)` then it works. `getLongLiteral` looks suspicious, since it's a decimal field... I'm not sure whether the issue lies with Presto, Datomic Analytics, or the Presto JDBC driver. So I'm mainly asking: what would be the best place(s) to report this?
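For reference, a minimal sketch of the CAST workaround described above, against the same connection as the first snippet. `workaround-sql` is a hypothetical helper (not part of any API), and the column/table names are taken from the message:

```clojure
;; Build a query that asks Presto to return the decimal column as
;; varchar, so the JDBC driver never has to parse decimal(38,2).
;; `workaround-sql` is an illustrative helper, not a library function.
(defn workaround-sql [col table]
  (str "SELECT CAST(" col " AS varchar) AS " col " FROM " table))

;; Used with the connection from the snippet above:
;; (let [stmt (.createStatement conn)]
;;   (.executeQuery stmt (workaround-sql "credit_amount" "journal_entry_line")))
```

The downside, of course, is that the client then receives strings and has to re-parse the decimals itself.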

plexus 2021-05-24T08:20:23.041600Z

This seems to be the central line in the stack trace: https://github.com/prestodb/presto/blob/2ad67dcf000be86ebc5ff7732bbb9994c8e324a8/presto-jdbc/src/main/java/com/facebook/presto/jdbc/ColumnInfo.java#L194

case "decimal":
    builder.setSigned(true);
    builder.setColumnDisplaySize(type.getParameters().get(0).getLongLiteral().intValue() + 2); // dot and sign
    builder.setPrecision(type.getParameters().get(0).getLongLiteral().intValue());
    builder.setScale(type.getParameters().get(1).getLongLiteral().intValue()); // <-- getLongLiteral -> ParameterKind is [TYPE] but expected [LONG]

futuro 2021-05-24T14:56:12.043100Z

I'm splitting my initial Marketplace master Datomic Cloud stack into a split-stack solo topology. I didn't provide an ApplicationName in my initial setup from the Marketplace (so the System Name is used, as I understand it); should I provide one now?

futuro 2021-05-24T14:56:46.043900Z

Have folks found it beneficial to provide the ApplicationName even when it's the same as the SystemName?

2021-05-24T16:05:22.045400Z

Say I want to connect to the in-memory database of an on-prem datomic peer server. What’s the value to specify for :endpoint? (see https://docs.datomic.com/client-api/datomic.client.api.html#var-client)

2021-05-24T16:14:53.045900Z

Fundamentally, I want to run unit tests against Datomic (on-prem).

futuro 2021-05-24T16:26:33.046500Z

As a docs heads up, there's an empty bullet-point at https://docs.datomic.com/cloud/getting-started/configure-access.html#authorize-gateway

2021-05-24T16:27:20.046800Z

One issue with spinning up an in-memory peer server and connecting to it is that the peer server listens on a TCP port, so we can't really run two instances of the tests in parallel because they collide on the same port.
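One way around the collision, assuming each test run can pass the peer server a port of its choosing, is to ask the OS for a free ephemeral port first. A sketch, not tied to any Datomic API:

```clojure
;; Bind a ServerSocket to port 0, which makes the OS pick a free
;; ephemeral port; record the number, then close the socket so the
;; peer server can bind it. Note the small race window between
;; closing the socket and the peer server reusing the port.
(defn free-port []
  (with-open [sock (java.net.ServerSocket. 0)]
    (.getLocalPort sock)))

;; e.g. launch each test run's peer server with (free-port)
```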

cjsauer 2021-05-24T18:46:54.049200Z

Is it a bad idea to rely on :db/txInstant as the “created at” time for an entity? The instant at which an entity’s :thing/id datom was asserted is a nice natural creation date, but I’m getting the sense that I’m abusing it a bit. For example, I can’t use index-pull to get the “10 latest” things, because that txInstant datom is on a separate entity (the transaction)…
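For what it's worth, the "creation time via txInstant" lookup is a join through the transaction entity, which is exactly why it falls outside a single-attribute index-pull. A sketch of that query as plain data, using the `:thing/id` attribute from the message (the binding names are illustrative):

```clojure
;; "Created at" of an entity = the :db/txInstant of the transaction
;; that asserted its :thing/id datom. This is a two-step join: the
;; instant lives on the tx entity, not on the thing itself, so you
;; can't sort things by it via index-pull over :thing/id alone.
(def created-at-query
  '[:find ?e ?created
    :where
    [?e :thing/id _ ?tx]
    [?tx :db/txInstant ?created]])
```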

joshkh 2021-05-25T08:41:25.050900Z

we tend to explicitly add dates because the historical data is not accessible to our external integrations via Datomic Analytics. also, if you ever replay your tx log from one database to another then the dates of the transactions will differ

favila 2021-05-25T12:27:58.052900Z

@joshkh “if you ever replay your tx log from one database to another then the dates of the transactions will differ” that’s not completely correct. the :db/txInstant assertion is in the tx log, so it will copy over unless you filter it out

favila 2021-05-25T12:29:11.053100Z

The use case for allowing this is back-dating data, but the :db/txInstant of any new transaction must be >= the previous one, so this technique is limited to fresh databases
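A sketch of what back-dating looks like as transaction data: you assert `:db/txInstant` on the transaction entity itself via the reserved `"datomic.tx"` tempid. The `:thing/id` attribute and the date are illustrative; the transactor rejects the transaction if the instant is earlier than the latest existing one, which is the "fresh databases" constraint above.

```clojure
;; Explicitly assert :db/txInstant on the transaction entity
;; ("datomic.tx" tempid) alongside the domain data being imported.
(def backdated-tx
  [[:db/add "datomic.tx" :db/txInstant #inst "2020-01-01"]
   {:thing/id "abc"}])

;; Hypothetical usage with a client-API connection:
;; (d/transact conn {:tx-data backdated-tx})
```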

joshkh 2021-05-25T12:30:39.053600Z

that's news to me, thanks for the correction @favila

2021-05-24T18:50:18.049500Z

If the entity is created and submitted by an external system, it's best to require a creation/event time as an input and to verify that it is at a point in the recent past.

cjsauer 2021-05-24T18:51:33.049700Z

That’s a good point. Or I suppose import jobs are another reason why one shouldn’t overload the :db/txInstant attribute. It’s really more “this is when it entered the system”, whereas creation time is a domain concern.

2021-05-24T18:52:22.049900Z

> wall clock times specified by `:db/txInstant` are imprecise as more than one transaction can be recorded in the same millisecond

2021-05-24T18:53:28.050100Z

you would want to set txInstant on imports, too https://docs.datomic.com/cloud/best.html#set-txinstant-on-imports

2021-05-24T18:54:29.050300Z

even in systems with an RDBMS I like users of a system to provide specific times with their data, and also to record transaction timestamps

cjsauer 2021-05-24T19:01:17.050500Z

What if the entity is created by users? Should I be managing `created-at`/`updated-at` times manually?
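Following the advice above, domain timestamps become ordinary attributes, so they survive tx-log replays and are visible to analytics. A schema sketch; the attribute names are illustrative, not from the thread:

```clojure
;; Domain-level timestamps as plain instant attributes on the entity
;; itself, independent of :db/txInstant.
(def timestamp-schema
  [{:db/ident       :thing/created-at
    :db/valueType   :db.type/instant
    :db/cardinality :db.cardinality/one
    :db/doc         "Domain creation time, supplied with the input."}
   {:db/ident       :thing/updated-at
    :db/valueType   :db.type/instant
    :db/cardinality :db.cardinality/one
    :db/doc         "Domain last-update time, set on each edit."}])
```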

cjsauer 2021-05-24T19:10:01.050700Z

Ah found some good material on the matter: https://vvvvalvalval.github.io/posts/2017-07-08-Datomic-this-is-not-the-history-youre-looking-for.html