clojure

New to Clojure? Try the #beginners channel. Official docs: https://clojure.org/ Searchable message archives: https://clojurians-log.clojureverse.org/
jmckitrick 2020-11-09T00:37:55.245Z

I’m trying to get CI/CD on GitLab, and I keep getting this error at the end of the log:

jmckitrick 2020-11-09T00:37:59.245300Z

Syntax error compiling at (/tmp/form-init5582582439791414361.clj:1:73).
could not find a non empty configuration file to load. looked in the classpath (as a "resource") and on a file system via "conf" system property

Louis Kottmann 2020-11-09T14:52:24.267700Z

the best way to debug pipelines in GitLab CI is to put a sleep 1d right before the failing line, and then connect to the runner to debug in a closed loop

Louis Kottmann 2020-11-09T14:52:38.267900Z

of course that only works if you have your own runner and aren't using the public ones

jmckitrick 2020-11-11T01:09:27.322200Z

Ah, ok.

jmckitrick 2020-11-09T00:40:01.247100Z

I’ve added the test-config.edn file used by Luminus apps, and the pipeline script for CI/CD is running the tests with with-profile test, so I think this means the tests will be run using the test profile, which uses the .edn file above to configure the database.
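For context, a typical Luminus project wires test-config.edn in through the test profile and the "conf" system property named in the error above. A rough sketch (not jmckitrick's actual project.clj; paths and keys are the usual template defaults):

;; project.clj (sketch)
:profiles
{:test {:jvm-opts       ["-Dconf=test-config.edn"]  ; sets the "conf" system property
        :resource-paths ["env/test/resources"]}}    ; directory that holds test-config.edn

;; env/test/resources/test-config.edn (hypothetical contents)
{:database-url "jdbc:postgresql://localhost/myapp_test?user=myapp&password=myapp"}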

jmckitrick 2020-11-09T00:40:07.247300Z

What could I be missing?

2020-11-09T01:28:42.248700Z

Have you tried running it locally the same way you think it is running on the CI server?

jmckitrick 2020-11-09T01:32:01.249Z

Yes, with the key word being ‘think’ lol.

jmckitrick 2020-11-09T01:32:19.249400Z

I just found the error in the log is from cprop, so I’m going to dig into that library
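That error is thrown by cprop's load-config when it finds neither a config.edn resource on the classpath nor a file pointed at by the -Dconf=... system property. Roughly what a Luminus-generated config namespace does (a sketch; the real one also merges mount's command-line args):

(ns myapp.config   ; hypothetical app name
  (:require [cprop.core :refer [load-config]]
            [cprop.source :as source]))

;; Throws "could not find a non empty configuration file to load ..."
;; when no config.edn resource exists and -Dconf= points at nothing.
(def env
  (load-config
    :merge [(source/from-system-props)
            (source/from-env)]))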

jmckitrick 2020-11-09T02:47:42.250200Z

So I dumped the config when running in the test env, and it’s full of settings…

jmckitrick 2020-11-09T02:47:57.250500Z

But this last part makes me wonder:

jmckitrick 2020-11-09T02:48:20.250700Z

Syntax error compiling at (/tmp/form-init408014767393427046.clj:1:72).

jmckitrick 2020-11-09T02:48:42.251100Z

I can’t seem to track that down, since it’s a CI/CD container

emccue 2020-11-09T02:52:32.251500Z

What’s the easiest way to encode/decode an Instant with transit?

emccue 2020-11-09T02:52:54.252Z

I am sending basically a database record to other nodes

2020-11-09T02:53:38.252800Z

it should "just work", but of course you are free to make your own writer for it if you prefer to translate it

emccue 2020-11-09T02:53:56.253200Z

Execution error at com.cognitect.transit.impl.AbstractEmitter/marshal (AbstractEmitter.java:194).
Not supported: class java.time.Instant

emccue 2020-11-09T02:54:11.253600Z

if there is a standard-ish extension i would use that

2020-11-09T02:54:38.254100Z

oh, I thought it was there out of the box

emccue 2020-11-09T02:55:00.254500Z

maybe i am using an old version

emccue 2020-11-09T02:55:43.254800Z

[io.pedestal/pedestal.service "0.5.8"]
     [com.cognitect/transit-clj "0.8.313"]
       [com.cognitect/transit-java "0.8.337"]
         [javax.xml.bind/jaxb-api "2.3.0"]
         [org.msgpack/msgpack "0.6.12"]
           [com.googlecode.json-simple/json-simple "1.1.1" :exclusions [[junit]]]
           [org.javassist/javassist "3.18.1-GA"]

emccue 2020-11-09T02:55:54.255100Z

I actually seem to be getting transit via pedestal

emccue 2020-11-09T02:55:57.255300Z

which is not ideal

emccue 2020-11-09T03:01:24.255600Z

yeah, it does not seem to be supported even on a newer version

emccue 2020-11-09T03:03:11.255900Z

i could always give up and just use serialization

emccue 2020-11-09T03:03:26.256200Z

i think that sends me straight to hell, don't collect $200

emccue 2020-11-09T03:03:29.256400Z

but...meh

emccue 2020-11-09T03:05:06.256800Z

https://github.com/ptaoussanis/nippy

emccue 2020-11-09T03:05:12.257100Z

what's the worst that could happen

2020-11-09T03:05:46.257500Z

it's not hard to add config for new datatypes to transit

2020-11-09T03:06:00.257900Z

I'm pulling up an old repo where I do that to put together an example

emccue 2020-11-09T03:10:19.258200Z

Execution error (ExceptionInfo) at taoensso.nippy/throw-unfreezable (nippy.clj:1001).
Unfreezable type: class next.jdbc.result_set$navize_row$fn__4512

emccue 2020-11-09T03:10:26.258500Z

yeah i actually get an interesting error

emccue 2020-11-09T03:10:37.258800Z

#:page_message{:id 7, :page_id_from 1, :page_id_to 1, :contents {:hello world}, :reactions [], :created_at #object[java.time.Instant 0x27c93821 2020-11-09T03:10:02.791910Z], :updated_at #object[java.time.Instant 0xe0fe300 2020-11-09T03:10:02.791910Z]}
clojure.lang.PersistentArrayMap

emccue 2020-11-09T03:11:21.259400Z

considering nothing in the data returned has anything but basic datatypes
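The likely culprit is not the row data itself but its metadata: next.jdbc attaches datafy/nav support as functions on each row's metadata (the navize_row fn in the error), and nippy tries to freeze metadata along with the value. A hedged workaround is to drop the metadata before freezing:

(require '[taoensso.nippy :as nippy])

;; row stands for the map printed above; without the next.jdbc metadata
;; (which contains functions) the plain data freezes fine.
(nippy/freeze (with-meta row nil))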

emccue 2020-11-09T03:16:35.259900Z

;; Assumed context for these snippets (not shown in the original paste):
;;   (:require [cognitect.transit :as transit])
;;   (:import (java.io ByteArrayInputStream ByteArrayOutputStream)
;;            (java.nio.charset StandardCharsets)
;;            (java.time Instant))

;; ----------------------------------------------------------------------------
(defn- ->transit [data]
  (let [out (ByteArrayOutputStream. 4096)]
    (transit/write
      (transit/writer out :json
                      {:handlers {Instant
                                  ;; tag Instants and write them as epoch millis
                                  (transit/write-handler
                                    "java.time.Instant"
                                    #(.toEpochMilli %))}})
      data)
    (.toString out)))

;; ----------------------------------------------------------------------------
(defn- <-transit [data]
  (let [in (ByteArrayInputStream. (.getBytes data StandardCharsets/UTF_8))]
    (transit/read
      (transit/reader in :json
                      {:handlers {"java.time.Instant"
                                  ;; read the epoch millis back into an Instant
                                  (transit/read-handler
                                    #(Instant/ofEpochMilli %))}}))))
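A quick round-trip check of the handlers above (assuming the requires noted in the snippet). Note that going through toEpochMilli/ofEpochMilli truncates the Instant to millisecond precision:

(<-transit (->transit {:created_at (Instant/now)}))
;; => {:created_at #object[java.time.Instant ...]}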

emccue 2020-11-09T03:16:46.260200Z

but yeah, got it working

2020-11-09T03:19:27.260400Z

ahh, that would be it

valerauko 2020-11-09T07:16:53.261Z

The docs say >! "will park if no buffer space is available". What does "park" mean here?

valerauko 2020-11-09T07:19:07.262100Z

Reason I'm asking is I'm getting Assert failed (< (.size puts) impl/MAX-QUEUE-SIZE) errors when producing a ton of input quickly

valerauko 2020-11-09T07:22:55.262500Z

Is that the backlog of messages waiting to be put on the channel?

valerauko 2020-11-09T09:09:33.262900Z

Never mind, found the explanation in go's docstring
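For reference: "park" means the go block suspends, without tying up a thread, until buffer space frees or a consumer takes a value; the assert fires when more than 1024 puts are pending on a channel. A minimal sketch (buffer size and counts are illustrative):

(require '[clojure.core.async :as a])

(def c (a/chan 10))   ; fixed buffer of 10

;; Inside a go block, >! parks when the buffer is full and resumes once
;; a consumer takes a value -- backpressure for free.
(a/go
  (dotimes [i 100000]
    (a/>! c i)))

;; By contrast, firing off a/put! (or launching unbounded go blocks) faster
;; than anything consumes them just queues pending puts; past 1024 of them
;; the MAX-QUEUE-SIZE assert throws.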

AJ Jaro 2020-11-09T13:55:07.266200Z

Has anyone used https://github.com/mcohen01/amazonica for invoking Lambda functions? I can call invoke, but the payload in the response is in a HeapByteBuffer and I’m not sure how to translate that into something usable

Asko Nõmm 2020-11-09T14:24:33.266600Z

Just a guess, but have you tried slurp?

AJ Jaro 2020-11-09T14:29:31.266800Z

@asko That’s a good question. I got the following error using slurp > Cannot open <#object[java.nio.HeapByteBuffer 0x2f40a23 "java.nio.HeapByteBuffer[pos=0 lim=281 cap=281]"]> as an InputStream.

Asko Nõmm 2020-11-09T14:30:54.267Z

Maybe you can use Java’s own java.nio.ByteBuffer

AJ Jaro 2020-11-09T14:40:07.267500Z

@asko That’ll help me parse the buffer into a collection of longs. I don’t know what to do with this though. I’m expecting some JSON
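For reference, one way to get the JSON text out without a base64 round trip is to copy the buffer's remaining bytes and build a String directly (a sketch; byte-buffer->string is a hypothetical helper):

(import '(java.nio ByteBuffer)
        '(java.nio.charset StandardCharsets))

(defn byte-buffer->string
  "Reads the remaining bytes of a ByteBuffer as a UTF-8 string."
  [^ByteBuffer payload]
  (let [bs (byte-array (.remaining payload))]
    (.get payload bs)
    (String. bs StandardCharsets/UTF_8)))

;; (json/parse-string (byte-buffer->string (:payload invoke-result)) true)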

rgm 2020-11-09T15:44:24.269600Z

Has anyone much experience with print-to-pdf using https://github.com/tatut/clj-chrome-devtools/blob/c294a69084c5e3e655d366bc34a85d232f34219a/src/clj_chrome_devtools/commands/page.clj#L1186-L1188 … it’s tantalizingly close to working for me, but I’m getting PDFs that are showing as blank pages.

rgm 2020-11-09T15:46:16.269900Z

I think the data’s there (a 1-pg PDF is coming in 200-400kb, a 2-pg is about 2x that). And I’m sure the headless chrome install is working: I can make pdfs with clojure.java.shell/sh.

rgm 2020-11-09T15:49:33.270100Z

For some background on how I’m making files https://gist.github.com/rgm/b3fb7c231ca41aad871eaf6115d7699f

rgm 2020-11-09T15:53:44.270300Z

I get different data sizes for different sites, and get 2 pages when I specify a 1-2 page range, so my working hypothesis is that the data is there, but something’s going wrong with a crop someplace.

AJ Jaro 2020-11-09T17:51:02.270600Z

@asko Here’s the code I ended up using to parse out the payload.

;; Assumed requires/imports (not shown in the original paste):
;;   (:require [cheshire.core :as json]   ; assumed -- could be another JSON lib
;;             [clojure.walk :as walk])
;;   (:import (java.util Base64))

(defn byte-array->string [byte-array]
  (.encodeToString (Base64/getEncoder) byte-array))

(defn decode-byte-buffer [payload]
  (->> payload
       (.decode (Base64/getDecoder))
       (String.)
       (json/parse-string)
       (walk/keywordize-keys)))

(defn parse-payload [invoke-result]
  (->> (:payload invoke-result)
       (.array)
       (byte-array->string)
       (decode-byte-buffer)))

Hankstenberg 2020-11-09T20:51:49.273100Z

What was the name of the library that offers a novel, concise, performant way to do map traversal? It's on the tip of my tongue, but I can't remember/find it.

borkdude 2020-11-09T20:52:18.273500Z

@marcus.poparcus Do you mean meander maybe?

👍 1
borkdude 2020-11-09T20:52:26.273700Z

or specter

Hankstenberg 2020-11-09T20:54:47.274800Z

@borkdude Ah yes, thank you! It was specter, but I remember that I wanted to check out meander too!

Hankstenberg 2020-11-09T20:55:03.275Z

Do you have a preference?

borkdude 2020-11-09T20:55:41.275600Z

I don't use either actively at the moment, besides experimenting with this: https://github.com/borkdude/grasp#pattern-matching

2020-11-09T22:57:32.276100Z

I think specter has the best performance

jimmy 2020-11-09T23:06:29.282800Z

Specter and meander are aimed at fairly different use cases and have fairly different performance characteristics. There are definitely cases where I'm sure specter beats meander in performance. But others where they can be more or less the same. We do consider performance a goal. And if you find something that is slower than you would like, always let us know in #meander or on github :) My recommendation is to try out all the things the clojure community has to offer. There are so many great libraries people have made for lots of different purposes. Try them out, learn from them, and see what is right for you. Or make your own and show us all your way of doing things :)
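For a flavor of the difference, here is "pull every name out of a nested structure" in both libraries (a sketch; aliases and data are illustrative):

(require '[com.rpath.specter :as sp]
         '[meander.epsilon :as m])

(def data {:users [{:name "ada"} {:name "grace"}]})

;; Specter: compose navigators down to the values you want.
(sp/select [:users sp/ALL :name] data)
;; => ["ada" "grace"]

;; Meander: write a pattern shaped like the data and search for bindings.
(m/search data
  {:users (m/scan {:name ?name})}
  ?name)
;; => ("ada" "grace")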

schmee 2020-11-09T23:06:57.283400Z

was just about to say, I find Specter and Meander complementary to each other and wouldn’t want to be without either of them!

jimmy 2020-11-09T23:09:13.283600Z

Just to add something here. One of the things that often makes meander slower is that we care about safety, and so do lots of predicate checks to ensure the pattern you gave us is really what the data is. We do try to eliminate as many of those as we can, and plan on having a way to opt out of them for performance's sake.