Good morning!
@simongray wouldn't a zipper require the data all be in memory?
Good morning! Did you see I made a macro? 😃
macrotastic
👋
månmån
@ordnungswidrig I don't think zippers as a concept would necessitate keeping the entire data structure in memory. They are basically pointers inside a tree data structure with some options for local navigation. If you make a custom tree data structure that doesn't realise the contents of its branches until read time, then it won't keep everything in memory.
And here I'm talking about clojure.zip - I'm sure you can make a zipper that's more efficient than that implementation too.
But obviously it all depends on the data structure. Not sure how lazy something like Clojure's hashmap would be in practice.
If you're navigating a huge json file from a disk you will need some custom data structure magic.
But the search algorithm itself would be easy to write: just check sibling nodes and follow paths down where you need to. This can be parallelised too for searches, but I don't think it would work for transformations - at least I wouldn't be sure how to join the edits from a bunch of zippers.
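That depth-first "check siblings, descend where needed" search is easy to express with clojure.zip — a minimal sketch (the `find-node` helper is my own name, not part of the library):

```clojure
(require '[clojure.zip :as z])

;; Depth-first search with clojure.zip: `z/next` walks the tree in
;; preorder, `z/end?` detects when the traversal is exhausted.
(defn find-node
  "Return the first loc whose node satisfies pred, or nil."
  [loc pred]
  (loop [loc loc]
    (cond
      (z/end? loc)        nil
      (pred (z/node loc)) loc
      :else               (recur (z/next loc)))))

(def tree (z/vector-zip [1 [2 [3 4]] 5]))

(z/node (find-node tree #(= % 4)))
;; => 4
```

Since the returned value is a loc (not just the node), the caller can keep navigating or editing from the match — which is the pausable/resumable quality being discussed.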
"Lazy stream zippers" sounds like a 60s progressive rock band ❤️
moin moin
Morning
@simongray Don’t know too much about zippers, but some/most underlying json parsers eagerly parse the json into tokens. So while you could avoid creating all the maps/seqs, you still need to fit the whole json string/tokens in memory. What would be really cool would be a streaming/lazy tokenizer (which I believe is what @ordnungswidrig found yesterday)
👋 ohai
ohai!
So imagine I had a 300GB json and I basically wanted (get-in parsed-json [0 :name])
, the trick is to get the name of the first object without parsing the whole json string.
sure
that’s the data structure magic I was referring to 😉
With that comes the question of validating the json as well. You cannot validate it without reading the whole thing.
sure
(which may or may not be a problem)
but I guess you can interpret every level of the data structure you need as {key_1 <pointer to val_1>, … key_n <pointer to val_n>}
. Then you need a zipper function that can realise a pointer as a piece of data. I guess the pointer could just be the linecount/charcount boundary of each val.
The point is just, with the right data structure, implementing a zipper for it is pretty straightforward and zippers can be paused, resumed, rewinded, and really made to go in any direction.
So basically you read through the entire contents of a json object (as text), registering the keys and the boundaries of the vals (your pointers). Then you can zip into any one of those vals in separate threads if you like and simply repeat the algorithm for the boundary contained by the pointer, e.g. “line 4/char 7 to line 6/char 12”.
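A minimal sketch of that idea, using an in-memory string instead of a file and hand-computed character offsets as the "pointers" (everything here is illustrative — a real version would seek into the file and compute the spans while scanning):

```clojure
(require '[clojure.edn :as edn])

(def raw "{\"name\": \"Alice\", \"age\": 30}")

;; One level of the object, parsed only as far as key -> [start end)
;; character spans into `raw`. Offsets computed by hand for this example.
(def level {:name [9 16]
            :age  [25 27]})

(defn realise
  "Parse the value inside a character span. These leaf values happen
  to be valid EDN, so edn/read-string works for the sketch."
  [[start end]]
  (edn/read-string (subs raw start end)))

(realise (:name level))
;; => "Alice"
```

The zipper's branch function would then call something like `realise` only when a loc is actually visited, so untouched vals never get parsed.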
I realise that I may just be cargoculting zippers since I really like using them 😛
to me the fact that you write pausable tree navigation and transformation algorithms using such a simple tool is quite magic
Yes, I think I see that, but I was more thinking about the problem of an infinite json-stream (a json-based Turing machine?) or a json-stream which was too big to hold in memory (or whose stream was so slow that it didn’t make sense to read the whole thing to get the first element (https://tools.ietf.org/html/rfc2549))
i see
just thinking out loud here.... could shelling out to jq
help here? it might do something clever (I don't know, but I guess easy to test)
jq does allow the --stream
option
<https://stedolan.github.io/jq/manual/#Streaming>
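For example (a quick sketch — `--stream` turns the document into `[path, leaf]` events instead of parsing it all at once, per the manual linked above):

```shell
# Pull the first object's "name" without materialising the rest of the
# array; first(...) lets jq stop consuming input early.
printf '[{"name":"a"},{"name":"b"}]' \
  | jq --stream -n 'first(inputs | select(.[0] == [0,"name"]) | .[1])'
# prints "a"
```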
This is why I think regex-style state machines are how this might be implemented. E.g. a query like `give me the "order" object when any of the "orderitem/product/name" values contains "gizmo"` could be "compiled" into a state machine which collects order data until you can rule out that the pattern matches. This all happens on an event stream of json tokens:
```
:map
  :key "orders"
  :list
    :map
      :key "id"
      :string "123"
      :key "orderitems"
      :list
        :map
          :key "name"
          :string "Rumbling gizmo"
        :endmap
      :endlist
    :endmap
  :endlist
:endmap
```
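A hedged sketch of the simplest possible matcher over such an event stream — it only inspects adjacent key/value events, where a real compiled query would carry the full path as state (all names here are made up for the example):

```clojure
;; The token events above as data: [tag] or [tag value] pairs.
(def events
  [[:map] [:key "orders"] [:list]
   [:map] [:key "id"] [:string "123"]
   [:key "orderitems"] [:list]
   [:map] [:key "name"] [:string "Rumbling gizmo"]
   [:endmap] [:endlist] [:endmap] [:endlist] [:endmap]])

(defn values-under-key
  "Collect every string value that immediately follows a given key,
  without ever building the nested maps."
  [k events]
  (->> (partition 2 1 events)
       (keep (fn [[[tag v] [tag2 v2]]]
               (when (and (= tag :key) (= v k) (= tag2 :string))
                 v2)))))

(values-under-key "name" events)
;; => ("Rumbling gizmo")
```

Because it is just a reduction over events, this works the same whether the events come from a vector or from a lazy/streaming tokenizer.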
never used it myself 'tho
@dharrigan that sounds like an interesting option
but only json, not edn 😛
🙂
PR to jet
welcome for EDN :P
Another idea: don't store your data in JSON but XML and use tools that already worked in the 00s ;)
Morning
@slipset BTW I starred this repo the other day: https://github.com/pangloss/fermor Haven’t looked that deep into it, but looks to me like it’s using some of the same buzzwords.
seems to use a Java dependency, though
Thanks!
Morning!
@simongray I like this code comment from the examples:
;; This version is a very direct port of the above query and in a fermor system
;; would never pass code review. It has all of the guts of the query hanging
;; out. Instead we can trivially create a domain model and work at a much higher
;; level of abstraction.
hmmm one could also implement a Jackson parser for EDN. 🙂 This should unlock JsonPath on EDN I guess. :thinking_face: Not sure if that’s a nifty idea though.
Maybe one can already use JsonPath on transit? :thinking_face:
morning
@borkdude you mean translating jsonpath to a jsonpath expression that would match on transit?
Sounds super hard because of the statefulness of transit 🙂
yes, hard
Playing with writing a macro is quite amazing. I am actually manipulating Clojure code like the data it is. I’ve been telling people that Clojure is homoiconic and sort of understood what it meant, but now I realize what it actually means. 😃
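A tiny illustration of that point: a macro receives its arguments as plain lists and symbols, so building code is literally building data (`unless` is the classic textbook demo, not something from this conversation):

```clojure
;; The macro body constructs an ordinary list -- which Clojure then
;; treats as the code to compile in place of the macro call.
(defmacro unless [test then]
  (list 'if test nil then))

(unless false :ran)
;; => :ran

(macroexpand '(unless false :ran))
;; => (if false nil :ran)
```

`macroexpand` makes the homoiconicity visible: the expansion is just the list the macro built.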