malli

https://github.com/metosin/malli :malli:
2020-10-27T16:43:45.342Z

just a shout-out that i'm biting the bullet and migrating our project with 4k+ LoC full of specs to malli, and i'm totally loving it -- the fact that it's just plain symbols + data makes things so much easier to reason about. thanks a lot for this monumental effort of making malli work!

🎉 3
2020-10-27T16:48:50.344600Z

just a sanity check, but it seems to me that using "plain data" for all the schemas (i.e. not wrapping them in m/schema), and then at a later point using a compiled m/validator and m/explainer, is a decent approach, right? i've just whipped up this macro to perform assertions -- it wires the explainer call into the expansion at compile time, but does the actual validation at runtime

;; assumes the aliases [malli.core :as m] and [malli.error :as me]
(defmacro assert
  [s v]
  (when *assert*
    `(let [v# ~v]                                 ; evaluate the value only once
       (when-let [ed# ((m/explainer ~s) v#)]
         (throw (ex-info "Validation failed"
                         (merge {::v v#}
                                (me/humanize ed#))))))))
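
Hypothetical usage of the macro above -- the schema here is made up, and it assumes the aliases [malli.core :as m] and [malli.error :as me], plus (:refer-clojure :exclude [assert]) in the macro's namespace since it shadows clojure.core/assert:

(assert [:map [:id int?]] {:id 1})
;; => nil, the value is valid

(assert [:map [:id int?]] {:id "oops"})
;; => throws ex-info "Validation failed" with the humanized explanation in ex-data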

borkdude 2020-10-27T16:51:27.346400Z

@lmergen I'm curious about your experience report. There is a thread on Reddit about exactly this. Feel free to comment there if you want.

rschmukler 2020-10-27T16:52:53.348300Z

@lmergen that's exactly what I do. Most (all?) of the malli public API will coerce plain data into a schema for you, which lets you keep your schema definitions nice and focused, and then you use m/validator and m/explainer where you want the compiled versions. There are also the shorthands m/explain (and m/validate), which create a temporary explainer for you. That isn't as performant, but for use cases like your macro it saves you having to create it yourself.
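
A minimal sketch of the difference, with a made-up Address schema and assuming [malli.core :as m]:

(def Address [:map [:street string?] [:zip string?]])

;; shorthand: builds a temporary validator on every call -- convenient, not fast
(m/validate Address {:street "Main St" :zip "12345"})
;; => true

;; compiled once, reused on hot paths
(def valid-address? (m/validator Address))
(valid-address? {:street "Main St" :zip "12345"})
;; => true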

rschmukler 2020-10-27T16:53:38.349300Z

I suppose a slight difference is that your macro wires the explainer up in the expansion at compile time, so it's up to you if you want that difference!

2020-10-27T16:53:44.349500Z

@borkdude i started that thread 🙂

borkdude 2020-10-27T16:54:06.349700Z

aaah! :)

2020-10-27T16:57:09.352200Z

but yes so far so good -- of course there's a "second system" benefit here, because i have a much better understanding of the model now and i'm taking the opportunity to refactor a few things. one of the big changes is that rather than one humongous specs.cljc, i'm creating many more namespaces, and each malli schema lives within one of those namespaces. the reason for this is that spec works with (namespaced) keywords, so you can get away with putting them all in a single namespace, whereas malli seems to map more naturally if you split things up into different namespaces.

👍 1
rschmukler 2020-10-27T17:04:54.354100Z

@lmergen that was exactly my experience as well! One benefit of using def or defn to return those data schemas is that the compiler helps check that those things still exist. In spec you can delete or mistype a keyword and get bitten, but here the compiler (or clj-kondo) helps you catch it.
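
A minimal sketch of that layout, with made-up namespace and schema names, assuming [malli.core :as m]:

(ns myapp.schema.user)

;; plain data schema, just a var in its own namespace
(def User
  [:map
   [:id uuid?]
   [:email string?]])

(ns myapp.handler
  (:require [malli.core :as m]
            [myapp.schema.user :as user]))

;; a typo like user/Usr is an unresolved var -- the compiler (or clj-kondo) flags it
(def valid-user? (m/validator user/User))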