core-async

penryu 2019-12-04T09:36:21.244Z

Has anyone done any profiling on which part of core.async contributes most to the load time?

penryu 2019-12-04T09:37:39.244300Z

It's not prohibitive, but it was enough that I refactored some code to use future instead of thread to avoid it.
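
A minimal sketch of the kind of refactor described here, assuming a hypothetical namespace and do-work function: clojure.core/future needs no extra require, whereas core.async's thread drags in the whole library at load time and hands back a channel instead of a dereffable value.

```clojure
;; Before: using core.async/thread pulls in clojure.core.async at require time.
;; (ns myapp.worker
;;   (:require [clojure.core.async :refer [thread <!!]]))
;; (<!! (thread (do-work)))

;; After: clojure.core/future needs no extra require; deref the result
;; instead of taking from a channel.
(ns myapp.worker)

(defn do-work []            ;; hypothetical placeholder for the real work
  (Thread/sleep 100)
  :done)

@(future (do-work))         ;; => :done
```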

2019-12-04T11:34:15.244500Z

What was your thread doing and how many did you create?

penryu 2019-12-04T11:36:20.244700Z

The delay happens at require time. Just adding the (:require ...) clause to the ns declaration adds about a second to the load time.
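
One way to see the cost at the REPL is to time the first require in a fresh JVM using plain clojure.core/time; the elapsed figure will vary by machine and JVM, and the cost is only paid on the first load.

```clojure
;; Run in a fresh REPL/JVM; subsequent requires are effectively free.
(time (require '[clojure.core.async :as a]))
;; penryu reports roughly a second of elapsed time here; exact numbers vary.
```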

penryu 2019-12-04T11:40:33.244900Z

I'm sure there's plenty of stuff in async that makes that delay reasonable (if it's needed). I was just wondering if anyone had looked into what causes the delay.

dpsutton 2019-12-04T12:46:10.245500Z

I think there’s a thread pool created on demand

alexmiller 2019-12-04T13:12:46.246300Z

It’s the macro stuff, has nothing to do with the threads

alexmiller 2019-12-04T13:14:22.248700Z

Ghadi actually has a speculative refactor that loads the go loop stuff on demand, avoiding the delay if you're not using go, but I'm not sure if that's common enough to be worth the trouble

2019-12-04T15:44:58.248900Z

So is that considered a bug? Rich even said in his Inside Transducers + more.async talk that the backpressure semantics change with expanding transducers like cat, but the buffer size is still bounded. With the current behavior it can grow indefinitely.
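
A minimal REPL sketch of the behavior being discussed, assuming current core.async semantics: a fixed buffer of size 1 combined with the expanding cat transducer ends up holding every item from a single expanding put, well past its declared size.

```clojure
(require '[clojure.core.async :refer [chan >!! <!! close!]])

;; A fixed buffer of size 1, with an expanding transducer.
(def c (chan 1 cat))

;; A single put expands into 1000 items, all buffered immediately;
;; the declared buffer size does not cap the expansion.
(>!! c (range 1000))   ;; => true, completes without blocking
(close! c)

;; Drain the channel to see how much was actually buffered.
(count (take-while some? (repeatedly #(<!! c))))
;; => 1000
```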