sql

All things SQL and JDBC...
fabrao 2020-05-07T00:38:23.417Z

Hello all, do you think the problem about charset in next.jdbc came from jdbc driver?

{:chamado "241740",
  :descricao
  "LOJA 39 - copia de seguranca do backup do servidor ********** falhou.",
  :categoria "Bay.Banco de Dados.Backup com Falha",
  :abertura "26/04/2020 08:53",
  :fechamento "02/05/2020 22:04",
  :local "39 **** FLORIANÃ"POLIS IGUATEMI",  <----}]

fabrao 2020-05-07T00:39:02.417300Z

is there any way to fix this?

2020-05-07T01:10:35.418500Z

it is very unlikely to be something with next.jdbc

2020-05-07T01:11:05.419100Z

but character encoding is tricky

2020-05-07T01:12:24.420700Z

it could even be due to a mismatch of your terminal and the jvm's default character encoding
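
For reference, a quick way to check the JVM's default charset from the REPL (a sketch; run it in the same process as your application):

;; standard JVM properties/APIs, nothing next.jdbc-specific
(println (System/getProperty "file.encoding"))
(println (str (java.nio.charset.Charset/defaultCharset)))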

fabrao 2020-05-07T01:26:52.421300Z

do you think I have to encode after result?

seancorfield 2020-05-07T01:30:42.422600Z

@fabrao You may need to specify :characterEncoding "utf8" in your db spec or ?characterEncoding=utf8" in your JDBC URL, depending on your database and how you are setting up your data source.
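
A sketch of both forms (the option name, the :dbtype, and whether the driver honors it all depend on your database; the db name and credentials here are placeholders):

;; hedged sketch: extra db-spec keys are passed through to the driver as connection properties
(def db-spec {:dbtype "mysql" :dbname "mydb" :user "me" :password "secret"
              :characterEncoding "utf8"})
;; or the same thing as part of a JDBC URL string
(def db-url "jdbc:mysql://localhost:3306/mydb?characterEncoding=utf8")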

fabrao 2020-05-07T01:46:02.424200Z

well, it didn't work

fabrao 2020-05-07T01:48:01.424700Z

maybe it's caused by the SQL Server collation

seancorfield 2020-05-07T01:49:06.425500Z

I don't know if what I suggested would work with MS SQL Server. You'll need to read the relevant docs for that DB and figure it out.

seancorfield 2020-05-07T01:51:30.427700Z

It could be failing to handle the character encoding at any number of points along the way... It could be the data insertion point, it could be the data retrieval point, it could be how your JVM is configured on your system, it could be your terminal...

fabrao 2020-05-07T01:52:45.428200Z

Yes, I'm trying to figure out where the problem is.

fabrao 2020-05-07T01:53:29.428700Z

the same result shows up in the GraphQL output

fabrao 2020-05-07T01:54:00.429400Z

"local": "39 FLORIAN�POLIS IGUATEMI"

seancorfield 2020-05-07T01:54:10.429700Z

If you have some native SQL Server "workbench" you should use that to verify the data that is actually in the DB.

fabrao 2020-05-07T01:59:44.430200Z

in SQL Server Management Studio it looks OK

maxp 2020-05-07T10:22:23.430900Z

Is there an example how to handle postgresql arrays in result set?

maxp 2020-05-07T10:22:34.431200Z

especially varchar[] and integer[]

souenzzo 2020-05-07T15:53:24.433600Z

How do I pass a "collection" to next.jdbc with Postgres? Something like (jdbc/execute! conn ["SELECT * FROM app_todo WHERE id IN (?)" [1]])

seancorfield 2020-05-07T15:54:02.434100Z

@souenzzo No, @maxp was asking about arrays in result sets not parameters.

souenzzo 2020-05-07T15:56:43.435Z

so there is no way to do this? how should I "get many vals from many inputs" in next.jdbc?

seancorfield 2020-05-07T15:58:14.435700Z

Sure, the docs tell you how to deal with additional data types in general in result sets. There's just no specific example for PostgreSQL array types.

seancorfield 2020-05-07T15:59:04.436100Z

The Getting Started guide provides an outline: https://cljdoc.org/d/seancorfield/next.jdbc/1.0.424/doc/getting-started#working-with-additional-data-types

seancorfield 2020-05-07T16:00:12.437600Z

(and it links to deeper discussions and some examples -- but it would be nice to have specific examples for array types in the PostgreSQL Tips & Tricks section if someone is willing to write them up: I do not use PostgreSQL!)

seancorfield 2020-05-07T16:00:45.438100Z

@souenzzo Oh, were you asking your own question, separate from @maxp’s question?

👍 1
seancorfield 2020-05-07T16:01:21.438800Z

(if so, the answer is in the Getting Started section I pointed to since that covers both input parameter types and output result set column types)

seancorfield 2020-05-07T16:01:40.439100Z

I thought you were trying to answer his question...

souenzzo 2020-05-07T16:28:33.440400Z

@seancorfield I also don't know SQL well enough to know what I should be searching for. Is it about "input parameter types"?

seancorfield 2020-05-07T16:35:00.441900Z

@souenzzo The two use cases are:
* ReadableColumn protocol (for converting database-specific types to Clojure values -- reading from a JDBC result set back into Clojure data)
* SettableParameter protocol (for converting Clojure values to database-specific types -- passing Clojure data as input parameters into JDBC operations)
Does that help?
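
For the second case, here is a minimal, untested sketch of extending SettableParameter so that Clojure vectors are sent as SQL arrays. It assumes PostgreSQL and integer-typed array columns, and (as Sean notes below) extending the protocol this broadly could interfere with the = ANY(?) approach:

;; hypothetical sketch -- assumes PostgreSQL and integer array columns
(require '[next.jdbc.prepare :as p])
(import '(java.sql PreparedStatement))

(extend-protocol p/SettableParameter
  clojure.lang.IPersistentVector
  (set-parameter [v ^PreparedStatement ps ^long i]
    (.setArray ps i
               (.createArrayOf (.getConnection ps) "integer" (to-array v)))))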

souenzzo 2020-05-07T16:35:40.442800Z

SettableParameter is the thing. I will search. Thanks 🙂

seancorfield 2020-05-07T16:36:33.443600Z

@maxp is asking about the first case -- turning PostgreSQL array column types into Clojure vectors; I think you're asking about the second case -- passing Clojure collections into JDBC (presumably as PostgreSQL arrays)?

seancorfield 2020-05-07T16:39:26.445700Z

If you extend SettableParameter to Clojure vectors, the = ANY(?) trick will stop working I suspect. But you may not need to extend SettableParameter for your use case in PostgreSQL: simply using ANY(?) instead of just ? may be sufficient for the PostgreSQL JDBC driver to accept a vector type and convert it to an array type for the column...

seancorfield 2020-05-07T16:40:14.446600Z

e.g., ["INSERT INTO my_table (id, array_data) VALUES (?, ANY(?))" 1, [2 3 4]]

seancorfield 2020-05-07T16:40:55.447500Z

I've no idea whether that would work (like I say, I don't use PostgreSQL at all) but given how the IN/`= ANY(?)` trick works, I suspect it might.

seancorfield 2020-05-07T17:00:59.448700Z

@souenzzo WHERE id = ANY(?) should work -- which is the example in the docs.

seancorfield 2020-05-07T17:01:16.449100Z

Note: = rather than IN.

seancorfield 2020-05-07T17:01:49.449700Z

I thought you were asking about passing a collection into a single column for an insert or similar.

seancorfield 2020-05-07T17:03:00.449800Z

I'm surprised the solution needs to be that complicated...

seancorfield 2020-05-07T17:03:52.450Z

...oh, because you are deliberately deferring the expansion of the array data from JDBC? What's the reasoning behind that @serioga?

seancorfield 2020-05-07T17:05:03.450200Z

It seems like any operation you perform on it will expand the whole array anyway, and it seems like you would run the risk of the call to .getArray happening after the connection closes with your code, which would fail in most cases I think?

seancorfield 2020-05-07T17:06:04.450400Z

I think the minimum version would be something like:

;; assumes (:require [next.jdbc.result-set :as rs]) and (:import (java.sql Array))
(extend-protocol rs/ReadableColumn
  Array
  (read-column-by-label [^Array v _]   (vec (.getArray v)))
  (read-column-by-index [^Array v _ _] (vec (.getArray v))))

serioga 2020-05-07T17:06:43.450600Z

the reason is to avoid an expensive (.getArray a) call if I don't access the data in that result column.
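
One way to get that kind of deferral (a hypothetical sketch, not necessarily what serioga wrote) is to wrap the conversion in a delay so .getArray is only paid for when the value is forced; the connection-lifetime caveat above still applies:

;; hypothetical sketch: defer the array expansion until the value is actually forced
(extend-protocol rs/ReadableColumn
  java.sql.Array
  (read-column-by-label [^java.sql.Array v _]   (delay (vec (.getArray v))))
  (read-column-by-index [^java.sql.Array v _ _] (delay (vec (.getArray v)))))
;; callers then deref the column value, e.g. @(:array_data row)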

souenzzo 2020-05-07T17:07:49.451500Z

["SELECT *
  FROM app_todo
  WHERE author = ANY(?)"
 (int-array [1 3])]
This works! But without int-array I get
Execution error (PSQLException) at org.postgresql.jdbc.PgPreparedStatement/setObject (PgPreparedStatement.java:978).
Can't infer the SQL type to use for an instance of clojure.lang.PersistentVector. Use setObject() with an explicit Types value to specify the type to use.

serioga 2020-05-07T17:08:17.451600Z

> you would run the risk of the call to .getArray happening after the connection closes
let me check 🙂

seancorfield 2020-05-07T17:08:39.452200Z

Yes, that's expected.

seancorfield 2020-05-07T17:09:01.452500Z

(and that's exactly what the documentation says to do)

seancorfield 2020-05-07T17:10:22.453600Z

> What does this mean for your use of next.jdbc? In plan, execute!, and execute-one!, you can use col = ANY(?) in the SQL string and a single primitive array parameter, such as (int-array [1 2 3 4]). That means that in next.jdbc.sql's functions that take a where clause (`find-by-keys`, update!, and delete!) you can specify ["col = ANY(?)" (int-array data)] for what would be a col IN (?,?,?,,,?) where clause for other databases and require multiple values.
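
Applied to the app_todo table from earlier, that would look something like this (a sketch; ds and the column name are assumptions):

;; hypothetical usage of a where-clause vector with ANY(?) in find-by-keys
(require '[next.jdbc.sql :as sql])
(sql/find-by-keys ds :app_todo ["author = ANY(?)" (int-array [1 3])])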

serioga 2020-05-07T17:10:40.453700Z

usually I work with the result after the connection closes, so I assume the data is kept in the java.sql.Array instance internally...

seancorfield 2020-05-07T17:13:02.453900Z

If you use plan and don't reference the array column, it wouldn't actually be read from the result set anyway. And if you use execute!/`execute-one!` the assumption is that you should get a fully-realized Clojure datastructure that is completely detached from JDBC (so don't select array columns if you don't need them 🙂 ).

seancorfield 2020-05-07T17:13:32.454100Z

It feels like you're fighting against the simple path here... but, hey, if it works for your use case.

serioga 2020-05-07T17:20:44.454300Z

Yes, I'm just trying to avoid converting all the array values if the code doesn't use them. I hate wasting CPU time on useless conversion 🙂

serioga 2020-05-07T17:22:27.454500Z

> If you use plan and don't reference the array column, it wouldn't actually be read from the result set anyway.
@seancorfield is that also the case if I fetch the row but ignore the column?

seancorfield 2020-05-07T17:32:26.454700Z

ReadableColumn only applies if plan's reduction references that specific column by name (actually by label)

seancorfield 2020-05-07T17:33:13.454900Z

In execute!/`execute-one!`, every column mentioned in the result set, i.e., select'd in the SQL, is read (by index).

seancorfield 2020-05-07T17:35:19.455100Z

Both of those build a full hash map from each row, with all columns. plan doesn't even build a hash map if you only reference columns directly by label and don't try any map-like operations on the row itself. That's why plan is so efficient.
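
A sketch of that, using the app_todo table from earlier (the ds datasource and the :author column are assumptions):

;; hypothetical sketch: only :author is read from each row; unreferenced
;; columns (including any array columns) are never realized
(require '[next.jdbc :as jdbc])
(reduce (fn [acc row] (conj acc (:author row)))
        []
        (jdbc/plan ds ["SELECT * FROM app_todo"]))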