pedestal

v3ga 2019-04-23T02:33:10.044500Z

What’s a good way to visualize interceptors in general? I’m looking for a common analogy that even a non-tech person may understand.

2019-04-23T13:03:55.045300Z

@decim, perhaps this document will clear things up? http://pedestal.io/reference/interceptors

pfeodrippe 2019-04-23T13:59:48.048100Z

Hi, is there any example of using an InputStream as the response body? I want to support streaming from it, and I'm using a PipedOutputStream connected to a PipedInputStream, but everything is sent only after all the processing finishes. I'm using a future to handle the processing — any ideas?

souenzzo 2019-04-23T14:14:39.048500Z

{:body (io/input-stream (.getBytes "foo")) :status 200} works.

pfeodrippe 2019-04-23T14:17:58.050400Z

Yes, but I want to write to the client while processing, not only when it ends, so the client doesn't have to wait for all the processing

v3ga 2019-04-23T14:48:33.051100Z

@ddeaguiar yeah, the guide is fine. I was just trying to think of other ways to describe it, for my notes, so that even someone that's not a programmer can understand

2019-04-23T15:41:15.051400Z

@pfeodrippe can you elaborate on your use case?

pfeodrippe 2019-04-23T16:12:27.051800Z

@ddeaguiar I'll write a simplified code

;; Assumes (:require [io.pedestal.interceptor :refer [interceptor]])
;; and (:import (java.io PipedInputStream PipedOutputStream))
(def streamed-interceptor
  (interceptor
   {:name ::interceptor
    :enter
    (fn [context]
      (let [pout (PipedOutputStream.)
            pin  (PipedInputStream. pout)]
        ;; Write one line per second on another thread while the
        ;; connected PipedInputStream serves as the response body.
        (future
          (with-open [pout pout]
            (doseq [_ (range 10)]
              (Thread/sleep 1000)
              (.write pout
                      (.getBytes "Eita\n")
                      0 (count "Eita\n"))
              (.flush pout))))
        (assoc context
               :response {:status 200
                          :body pin})))}))

pfeodrippe 2019-04-23T16:14:56.053100Z

Based on this code, I'd like to see a new line with "Eita" each second

pfeodrippe 2019-04-23T16:17:44.054900Z

I've tested with a lot of data and it was streaming every 32768 bytes... I'm now looking for a way to reduce the buffer size for the streaming response. Any idea @ddeaguiar?

pfeodrippe 2019-04-24T16:51:27.060200Z

Great o/

2019-04-23T16:43:26.055500Z

I've seen one metaphor: passing through a succession of gates or checkpoints, maybe for a flight.

2019-04-23T16:43:36.055800Z

some of the tasks are one-sided, others double-sided.

2019-04-23T16:45:00.057400Z

e.g. you check in (`:enter` only), drop your bag (`:enter` of a double-sided one), go through security (`:enter` only), fly (the real handler), clear customs (`:leave` only), get your bag (`:leave` of the bag-handling interceptor), and you're done.

2019-04-23T16:45:43.058200Z

actually, an even better metaphor there: the check-in process gives you a token (your boarding pass) that you show to several successive checkpoints (bag drop, security, at the gate)

2019-04-23T17:00:00.058400Z

@pfeodrippe to clarify, why do you need an input stream?

donaldball 2019-04-23T17:49:52.058600Z

The docs say you can use a core.async channel for your response body to get the effect you want: http://pedestal.io/reference/streaming#_using_a_core_async_channel_as_the_response

donaldball 2019-04-23T17:50:14.058800Z

I have never used this mechanism though.
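A minimal sketch of what that linked approach might look like, adapted to the ten-line example above. This is an assumption based on the Pedestal streaming reference, not tested code: it assumes Pedestal writes and flushes each value put on the channel as a chunk, and that closing the channel ends the response. Note that `async/thread` runs on a real thread rather than a go block, so blocking calls like `Thread/sleep` are fine inside it.

```clojure
(ns example.streaming
  (:require [clojure.core.async :as async]
            [io.pedestal.interceptor :refer [interceptor]]))

(def channel-streamed-interceptor
  (interceptor
   {:name ::channel-streamed
    :enter
    (fn [context]
      (let [ch (async/chan)]
        ;; Produce one chunk per second on a dedicated thread.
        (async/thread
          (doseq [_ (range 10)]
            (Thread/sleep 1000)
            (async/>!! ch "Eita\n"))
          ;; Closing the channel signals the end of the response body.
          (async/close! ch))
        (assoc context
               :response {:status 200
                          :body ch})))}))
```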

pfeodrippe 2019-04-23T19:53:41.059Z

Sorry, just saw it now. I have a function where I'm making requests to other endpoints, and I want to notify the client right after each endpoint (of N) finishes processing

pfeodrippe 2019-04-23T19:54:16.059200Z

So the client does not have to wait until the end of the request to get all the responses

pfeodrippe 2019-04-23T19:54:36.059400Z

If it's not clear, I can try to explain again 😃

pfeodrippe 2019-04-23T20:02:35.059600Z

Yes, I've seen this, but we prefer more controlled threads instead of go blocks