Hi @seancorfield, my library Gungnir uses Honeysql. One of the features it provides is allowing transformations of fields before a query is executed. For example if I were to query a user based on email, I can configure my setup to downcase the email string before executing the query. This is useful when emails are used as identifiers, and you want those emails to be downcased in all scenarios.
I currently achieve this by overriding HoneySQL's comparison fmt/fn-handler multimethods. This is of course a terrible hack, but I couldn't figure out how to solve the problem otherwise. Is there any chance V2 would have some sort of hook system to transform arguments before the query is executed?
https://github.com/kwrooijen/gungnir/blob/8815a230fd439a97e1db970ad89b08a323b625db/src/clj/gungnir/query.clj#L295-L363
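For what it's worth, the same effect can also be approximated outside HoneySQL by walking the query map before formatting it. A minimal sketch, assuming a hypothetical per-column transforms map (none of these names are Gungnir or HoneySQL API):

```clojure
(ns example
  (:require [clojure.string :as str]
            [clojure.walk :as walk]))

;; hypothetical registry of per-column transforms
(def transforms {:email str/lower-case})

(defn apply-transforms
  "Walk a HoneySQL query map and rewrite [op column value] clauses
  whose column has a registered transform."
  [query]
  (walk/postwalk
    (fn [x]
      (if (and (vector? x)
               (= 3 (count x))
               (contains? transforms (second x))
               (string? (nth x 2)))
        (update x 2 (transforms (second x)))
        x))
    query))

(apply-transforms {:select [:*] :from [:user]
                   :where [:= :email "Alice@Example.COM"]})
;; => {:select [:*], :from [:user], :where [:= :email "alice@example.com"]}
```

That keeps the hack out of the formatting engine, although it only catches clauses in the literal [op column value] shape.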
@kevin.van.rooijen Interesting approach... I'll have to give that some thought...
The new formatting engine would make that almost impossible to hook into I suspect.
@kevin.van.rooijen brings up an interesting point.
I have encountered a similar/related issue in a couple of ways:
The PostGIS example in the README shows the use of sql/call, and includes this: (sql/call :cast 4326 :integer) to account for Clojure passing in 4326 as a long when Postgres/PostGIS wants an integer.
I’ve been wondering if it might make more sense to convert the value to an int before sending it over to the database….
At a higher level, I make frequent use of next.jdbc.prepare/SettableParameter and next.jdbc.result-set/ReadableColumn to transform data between my code and the database,
but recently I have begun using Malli schemas, and I make use of both schema-defined key and value transformations to convert between database and Clojure representations.
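The Malli direction looks roughly like this for the email case above. A sketch, assuming a hypothetical User schema; the :decode/db property and (mt/transformer {:name :db}) pairing is standard Malli, but the schema itself is made up:

```clojure
(require '[clojure.string :as str]
         '[malli.core :as m]
         '[malli.transform :as mt])

;; hypothetical schema: downcase :email when decoding in the "db" direction
(def User
  [:map
   [:email [:string {:decode/db str/lower-case}]]])

(m/decode User {:email "Alice@Example.COM"} (mt/transformer {:name :db}))
;; => {:email "alice@example.com"}
```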
I’ve been considering whether it might be better to move my transformations out of the innards of next.jdbc and drive them explicitly through (Malli) schema-driven transforms.
Invoking these transforms through some potential hook in HoneySQL is an interesting idea, but I have two reservations: there are a few times when I don’t use HoneySQL to create my query (long story), and HoneySQL is only used to generate the query and doesn’t participate in the response, so I’m not sure HoneySQL is the right place in this chain to initiate/drive (potentially bidirectional) transformations.
The PostGIS stuff no longer needs any sql/call calls in it: https://github.com/seancorfield/honeysql/tree/v2#postgis
If I was working with that stuff, I'd probably write some helper functions that took the PostGIS native values and expanded them to HoneySQL data structures. Remember that HoneySQL has no idea about databases and types: it's purely a DSL for describing SQL statements. Only next.jdbc knows about databases and types.
I think that is what I have been doing, e.g.
(defn ST_MakePoint
  [x y z]
  (sql/call :ST_MakePoint x y z))

(defn ST_Transform
  [g srid]
  (sql/call :ST_Transform g (sql/call :cast srid :integer)))

(defn ST_SetSRID
  [g srid]
  (sql/call :ST_SetSRID g (sql/call :cast srid :integer)))
And then I can do things like:
(-> (ST_MakePoint lon lat alt)
    (ST_SetSRID 4326)
    (ST_Transform srid))
Wondering if I could replace (sql/call :cast srid :integer) in those helpers with (int srid)? If I know it needs to be an int, why send a long to Postgres and ask it to cast?
[:cast srid :integer] is DSL for SQL syntax that gets turned into [" ... CAST( ? AS INTEGER ) ..." 4326], and it's Clojure that evaluates srid to a Long, 4326. I think you're confusing two different sets of abstractions...
Even if you say (int srid), that's going to get evaluated to 4326 before HoneySQL even sees it. So what HoneySQL sees is [:cast 4326 :integer] -- it's already a value by that point.
If it gets converted to an int and sent to Postgres as an int, then there's no need to ask Postgres to cast it to an int. I understand these are two very different things.
I also realize this is not HoneySQL's job; this is in my own helpers....
Feel free to add more detail to https://github.com/seancorfield/honeysql/issues/276 @kevin.van.rooijen
Well, (int 4326) is going to produce a boxed java.lang.Integer by the time it has gone through HoneySQL (I just verified that), based on Clojure's evaluation rules. So what next.jdbc sees is Integer, and that is going to be passed to PostgreSQL JDBC via .setObject unless you override SettableParameter for Integer and make it use a different setter method.
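Such an override is a small protocol extension in next.jdbc. A sketch (extending SettableParameter to Integer is the standard mechanism; whether you actually want a global override for all Integers is a separate question):

```clojure
(require '[next.jdbc.prepare :as p])
(import '(java.sql PreparedStatement))

;; force Integer params through .setInt instead of the default .setObject
(extend-protocol p/SettableParameter
  Integer
  (set-parameter [v ^PreparedStatement ps ^long i]
    (.setInt ps i v)))
```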
Good to know!
If ST_SetSRID can't cope with that and still requires the CAST, that's between JDBC and ST_SetSRID at this point 🙂
I'd like to figure out a way to pass metadata through, untouched, so that could be used to drive parameter setting -- but the problem there is that you need some sort of Clojure "object" to attach metadata to (some of the type-based helpers in next.jdbc wrap values in a vector to support that), but HoneySQL sees the vector of values and tries to unpack them, so you lose the metadata and end up with plain values again.
I'm really trying hard to avoid going down the path that v1 used -- all those special record types to communicate how values should be treated.
@seancorfield have you thought of foregoing values entirely and using named parameters?
So that you can construct queries with, say, ?foo and then get a data structure with those named parameters inside, which you could then walk to replace with actual values?
I’m on mobile so I can’t articulate exactly what I mean... but basically it bothers me to mix the “static” part of the query with the “dynamic” part (the values)
I guess now that I write that I could probably do this outside honeysql somehow?
V1 has machinery to do that (if I'm understanding you correctly): :?foo is a placeholder that will get populated via format when you pass in a parameter hash map containing {:foo some-value} -- I'm still exploring whether/how to implement that in the V2 code I've built so far... but at the end of the day, it still produces a vector of ["SQL string" val1 val2 val3...] so you're going to be at the mercy of Clojure's evaluation model at some point anyway.
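The V1 machinery described above looks like this in use. A sketch against honeysql 1.x (the sqlmap and param name are made up for illustration):

```clojure
(require '[honeysql.core :as sql])

;; :?email is a V1 named-parameter placeholder, filled in at format time
(sql/format {:select [:*] :from [:user] :where [:= :email :?email]}
            :params {:email "alice@example.com"})
;; => ["SELECT * FROM user WHERE email = ?" "alice@example.com"]
```

Note the result is still the ["SQL string" & values] vector mentioned above -- the named parameters only delay when the values are supplied, not how they travel to JDBC.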