pathom

:pathom: https://github.com/wilkerlucio/pathom/ & https://pathom3.wsscode.com & https://roamresearch.com/#/app/wsscode
wilkerlucio 2021-04-27T01:44:34.245Z

hello people, I just fixed the embed visualizations in the planner docs of Pathom 3, in case you checked recently and found it broken, it's fixed and up now πŸ™‚ https://pathom3.wsscode.com/docs/planner. Also, these are more recent visualizations straight from a new thing that I’m calling Pathom Viz Embed, this is something you can use to embed Pathom Viz in web renderers, currently I’m using it on the site and now in the Pathom 3 docs. Docs for the viz embed are not available yet, but you can find full examples in the sources for the Pathom 3 docs: https://github.com/wilkerlucio/pathom3-docs/blob/master/src/main/com/wsscode/pathom3/docs/components.cljs

πŸ’― 1
wilkerlucio 2021-04-27T04:24:21.246200Z

For anybody playing with Pathom 3 and batches, I just pushed some fixes at commit fe29d9d3ec89a0321bf20b41a4f2d161f715b80d; if you had trouble, please try this version
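
To try that exact commit with tools.deps, a git coordinate along these lines should work (assuming the standard Pathom 3 repository; adjust to however you consume the library):

;; deps.edn — point the dependency straight at the commit mentioned above
{:deps {com.wsscode/pathom3 {:git/url "https://github.com/wilkerlucio/pathom3"
                             :sha     "fe29d9d3ec89a0321bf20b41a4f2d161f715b80d"}}}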

henrik 2021-04-27T09:16:28.246800Z

Great @wilkerlucio, will check it out

wilkerlucio 2021-04-27T18:54:25.247700Z

thanks @henrik! Just out of curiosity (and in case you tested πŸ™‚): when you do your batch attempt, does it get faster or slower than non-batching?

henrik 2021-04-27T21:22:03.247900Z

It definitely seems slower, I'm afraid. This probably comes down to the fact that when the resolver isn't batched, it is called 70 times for a particular query (each call resolves in about 0.4 ms, ~28 ms out of a total of 0.1 s for the entire Pathom run). When batched, however, the resolver is called with 525 entries, which balloons the overall Pathom query to about 0.3 s.

henrik 2021-04-27T21:23:34.248100Z

This is just a quick comparison, I can throw Criterium at it to get more reliable numbers if it'll help.
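
A minimal sketch of what such a Criterium measurement could look like, assuming `env` is the Pathom 3 environment and `query` is the EQL query being timed (both names are placeholders):

;; requires criterium and the Pathom 3 EQL interface
(require '[criterium.core :as criterium]
         '[com.wsscode.pathom3.interface.eql :as p.eql])

;; quick-bench runs the expression enough times to give statistically useful numbers
(criterium/quick-bench
  (p.eql/process env query))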

henrik 2021-04-27T21:40:26.248300Z

So,

{'(count inputs) 525
 '(count (distinct inputs)) 70}
Where inputs is the collection of arguments that the batched resolver is called with.

wilkerlucio 2021-04-27T21:41:52.248500Z

gotcha, yeah, for fast resolvers like this it’s expected that batching isn’t going to be faster

wilkerlucio 2021-04-27T21:42:17.248700Z

batching will shine more in cases of heavy IO that can be batched (looking up some external entity, avoiding N+1 queries)
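
For illustration, a minimal sketch of a batch resolver in that spirit, assuming a hypothetical fetch-users-by-ids! that makes a single round-trip to an external service (names and attributes are made up for the example):

(require '[com.wsscode.pathom3.connect.operation :as pco])

;; with ::pco/batch? true the resolver receives a sequence of input maps and must
;; return a sequence of output maps in the same order — one external call instead of N
(pco/defresolver user-by-id [items]
  {::pco/input  [:user/id]
   ::pco/output [:user/name]
   ::pco/batch? true}
  ;; fetch-users-by-ids! is hypothetical; it should return {:user/name ...} maps
  ;; aligned with the order of items
  (fetch-users-by-ids! (mapv :user/id items)))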

wilkerlucio 2021-04-27T21:43:05.248900Z

but I don’t get why you have so many more calls in batch mode; it could be related to the deoptimized graph, but still, if things are not failing, it should be the same number of calls

wilkerlucio 2021-04-27T21:43:57.249100Z

out of the 525 entries, some of those share the same inputs? That’s how it reduces to 70?

wilkerlucio 2021-04-27T23:41:03.249300Z

@henrik I just made a change that will deduplicate same inputs on batch, can you please try again on ca9ea82ef92f8113148b39b884639b2a9e25a29a?

πŸŽ‰ 1
henrik 2021-04-28T05:51:39.249600Z

Cool! Will do.

henrik 2021-04-28T05:56:34.249800Z

> out of the 525 entries, some of those share the same inputs? That's how it reduces to 70?

Exactly. I'm pulling a highly interconnected graph, so there's a lot of redundancy. Let's say I'm asking for entities A, B, and C. If they're all connected to entity D, I end up getting D three times. When not using batching, Pathom seems really good at deduplicating this, so it only asked for D once. When using batching, it seems like each individual request for D (once each for A, B, and C, a total of 3 times) was forwarded to the resolver, resulting in a lot of redundancy in the batched collection (in the example, 3 calls to get D instead of 1 call to get D, which is then reused). I'll check out the change and see if this is no longer the case.
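
The general shape of that deduplication (just the idea, not Pathom's actual implementation): run the batch over the distinct inputs only, then fan the results back out to every original entry.

;; sketch: call batch-fn once over the distinct inputs and reuse results for duplicates
(defn run-batch-deduped [batch-fn inputs]
  (let [distinct-inputs (distinct inputs)
        results         (batch-fn distinct-inputs) ; one result per distinct input
        input->result   (zipmap distinct-inputs results)]
    (mapv input->result inputs)))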

henrik 2021-04-28T06:17:58.250200Z

@wilkerlucio I'm seeing no duplication when using batching now πŸ‘

πŸ™ 1
wilkerlucio 2021-04-27T04:27:32.246500Z

@henrik

πŸ‘ 1