core-async

2020-10-06T01:32:52.072400Z

Using an iterator and doing a channel op for each element is less complex than using a lazy-seq, IMHO; a lazy-seq is a data abstraction and isn't conducive to doing IO or compute tasks

jumar 2020-10-06T06:30:47.072600Z

Sounds like you don't need core.async at all and instead just want a primitive for limiting the maximum number of concurrent calls. As mentioned above, claypoole might be worth a look. I use something like this: https://github.com/jumarko/clojure-experiments/blob/master/src/clojure_experiments/concurrency.clj#L62-L74

(defn re-chunk
  "Re-chunks the seq `xs` into chunks of size `n`, so that chunk-aware
  consumers like `map` realize at most `n` elements at a time instead
  of the default chunk size of 32."
  [n xs]
  (lazy-seq
   (when-let [s (seq (take n xs))]
     (let [cb (chunk-buffer n)]
       ;; Copy the next n elements into a fresh chunk buffer...
       (doseq [x s] (chunk-append cb x))
       ;; ...and cons that chunk onto the (lazily) re-chunked rest.
       (chunk-cons (chunk cb) (re-chunk n (drop n xs)))))))
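
For context (my addition, not part of the original snippet): seqs over vectors and ranges are chunked and realize 32 elements at a time, which is what `re-chunk` works around. A quick REPL sketch, using an atom as a side-effect counter to show how many elements `map` realizes per step:

```clojure
;; Default chunking: pulling one element realizes a whole 32-element chunk.
(def realized (atom 0))
(def xs (map (fn [x] (swap! realized inc) x) (vec (range 100))))
(first xs)
@realized   ;; => 32

;; With re-chunk, only n elements are realized per step.
(def realized2 (atom 0))
(def ys (map (fn [x] (swap! realized2 inc) x) (re-chunk 4 (vec (range 100)))))
(first ys)
@realized2  ;; => 4
```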

(defn map-throttled
  "Like `map` but never realizes more than `max-n` elements ahead of the consumer of the return value.
  Useful for cases like a rate-limited asynchronous HTTP API (e.g. the StartQuery AWS CloudWatch Insights API).
  Uses `re-chunk`."
  [max-n f coll]
  (map f (re-chunk max-n coll)))
The `re-chunk` piece is there specifically to deal with "chunked" sequences, avoiding realization of a whole chunk of 32 calls at once. In my case the AWS API is async already, but you can easily pass a function that returns a `future` as `f`.
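
A hypothetical usage sketch (the `start-query` name and its behavior are my own stand-in for a rate-limited async call, not part of the snippet above):

```clojure
;; Fake rate-limited API call that returns a future, standing in for
;; something like the AWS StartQuery call mentioned above.
(defn start-query [id]
  (future
    (Thread/sleep 100)              ; simulate network latency
    {:query-id id :status :ok}))

;; Never get more than 5 calls ahead of the consumer:
(def results (map-throttled 5 start-query (range 20)))

;; Realizing the first result launches only the first chunk of 5
;; futures, not all 20.
(deref (first results))
```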