@borkdude there’s no way to also read and get information about `;` comments in edamame, is there?
@mkvlr can you explain why you would want this?
@borkdude literate programming, in which the comments get parsed as markdown as explanatory text between the code
rewrite-clj does return comment nodes but I like edamame’s api better
@mkvlr I guess we could do something here:
https://github.com/borkdude/edamame/blob/9823a7af0edab2cf8596e345b6ea50979f1e9149/src/edamame/impl/parser.cljc#L538
but the question is:
what would you return when parsing (defn foo [] ;; nice comment \n 1)
Right now this returns (defn foo [] 1)
if you stick to a strict format, e.g. ;;
only at the start of a line, you can probably parse this manually
this would have to be an opt-in thing anyway, right? Maybe the caller could decide what to turn this into? but not sure
but what would you want to return? this can mess up the sexpr pretty badly probably
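For reference, rewrite-clj (mentioned above as returning comment nodes) answers this by keeping comments as a separate node type rather than splicing them into the sexpr — a rough sketch of what that looks like:

```clojure
(require '[rewrite-clj.parser :as p]
         '[rewrite-clj.node :as n])

(def form (p/parse-string "(defn foo [] ;; nice comment\n 1)"))

;; the comment survives as a :comment node in the node tree:
(filter #(= :comment (n/tag %)) (n/children form))

;; ...but is dropped again when you ask for the plain sexpr,
;; matching what edamame returns today:
(n/sexpr form)  ;; (defn foo [] 1)
```

This sidesteps the "messed up sexpr" problem: comments only exist at the node level, and callers who want plain data never see them.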
did you use rewrite-clj for deps.edn rewriting?
yes
ah ok, I had assumed this must be possible within the borkdude ecosystem 😼
the borkdude ecosystem is not a closed world ;)
but it’s pretty complete
clj-kondo also uses rewrite-clj
@lee knows what to use nowadays (I lost track of all his rewrite-clj repos)
thinking of wanting to get (comment ";nice comment")
or (line-comment ";nice comment")
back maybe
@mkvlr I can see it being useful to return comments when you are on the top-level but within balanced { .. }
for example, I can see this going wrong, like {:a 1 ;; foo \n :b 2}
=> incorrect map
I started in lread/rewrite-cljc-playground and am migrating that work into clj-commons/rewrite-clj v1 branch which will be merged to clj-commons/rewrite-clj main.
@borkdude ah right, guess top level only would be fine for my use case but unsure if it’s worth it then…
@mkvlr if top-level, I think this is pretty easy to parse manually, with (str/starts-with? (str/trim ...) ";")
@borkdude true, I guess rewrite-clj is also fine
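A minimal sketch of that manual top-level approach (hypothetical helper name; assumes comments only appear on their own lines, as discussed above):

```clojure
(require '[clojure.string :as str])

(defn classify-lines
  "Split source into :comment and :code lines. A line counts as a
  comment when, after trimming, it starts with a semicolon.
  Only works for top-level, line-oriented comments."
  [source]
  (for [line (str/split-lines source)]
    (if (str/starts-with? (str/trim line) ";")
      {:type :comment :text line}
      {:type :code :text line})))

(classify-lines ";; explanation\n(defn foo [] 1)")
;; => ({:type :comment, :text ";; explanation"}
;;     {:type :code, :text "(defn foo [] 1)"})
```

The code lines can then be concatenated and handed to edamame as usual, while the comment lines go through a markdown renderer.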
@borkdude I am using sci in the datalevin command line tool. It works for the most part; however, some datalog queries that work on the JVM do not work in sci:
No implementation of method: :-find-vars of protocol: #'datalevin.parser/IFindVars found for class: datalevin.parser.Constant
Is there anything special needed for loading the namespaces?
Hey! I'm not sure what this means without having some kind of repro
You can point me to your repo and I can have a look if you can make some kind of failing test
and instruct me how to run that
this is what i’m currently doing. i expect it won’t work as I am not using sci namespaces, but I also tried using SciNamespace and SciVar, and it still doesn’t work
i saw that your datascript feature is using SciNamespace and copy-var, so I tried to do that, but still no luck
> I can have a look if you can make some kind of failing test
sure
Are you trying to expose a protocol to the sci scripts?
no, i am just trying to run functions in sci
What is the example script that manifests this error?
(or REPL input for that matter)
the kind of datalog query that involves (pull …)
normal queries work fine
I need a complete example / test so I can have a look. If you can provide me with detailed instructions on how to clone and run that, I'll take a look tomorrow
I have already tried your native binary locally, that worked for me
Very cool stuff btw
sure thanks, i will create a test case
is pull a macro perhaps
yeah, it works for the most part, only some queries don’t work
yes, a macro is involved i think
ah, macros need special attention
that’s what i suspected
can you show me the macro expansion of the failing pull query?
```clojure
#?(:clj
   (defmacro deftrecord
     "Augment all datalevin.parser/ records with default implementation of ITraversable"
     [tagname fields & rest]
     (let [f    (gensym "f")
           pred (gensym "pred")
           acc  (gensym "acc")]
       `(defrecord ~tagname ~fields
          ITraversable
          (~'-postwalk [this# ~f]
            (let [new# (new ~tagname ~@(map #(list 'datalevin.parser/postwalk % f) fields))]
              (if-let [meta# (meta this#)]
                (with-meta new# meta#)
                new#)))
          (~'-collect [_# ~pred ~acc]
            ;; [x y z] -> (collect pred z (collect pred y (collect pred x acc)))
            ~(reduce #(list 'datalevin.parser/collect pred %2 %1) acc fields))
          (~'-collect-vars [_# ~acc]
            ;; [x y z] -> (collect-vars-acc (collect-vars-acc (collect-vars-acc acc x) y) z)
            ~(reduce #(list 'datalevin.parser/collect-vars-acc %1 %2) acc fields))
          Traversable
          (~'-traversable? [_#] true)
          ~@rest))))
```
that’s the macro responsible for fleshing out all the defrecords in parser.clj
I see yes. So:
```clojure
(deftrecord Aggregate [fn args]
  IFindVars (-find-vars [_] (-find-vars (last args))))
```
expands into a defrecord call which needs the IFindVars protocol to be around. This is where it gets a little hacky, but you can make this work. In sci, due to GraalVM limitations, protocols are implemented as multimethods.
E.g. this is how Datafiable
is added to the sci config of babashka:
https://github.com/babashka/babashka/blob/bbf144fbce66c6986253119eb81392a440cb17c6/src/babashka/impl/protocols.clj#L33
ok i see
so i need to do the same for IFindVars?
if you want that macro to work, yes
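Translated to this case, the babashka Datafiable trick would look something like the sketch below. The sci calls (`sci/create-ns`, `sci/copy-var`, `sci/init`) are real API, but the datalevin specifics are taken only from the error message above, and whether this alone suffices depends on how the deftrecord expansion resolves the protocol name:

```clojure
(ns example.find-vars
  (:require [datalevin.parser :as dp]
            [sci.core :as sci]))

(def parser-ns (sci/create-ns 'datalevin.parser nil))

;; sci implements protocols as multimethods (a GraalVM workaround).
;; Mirror -find-vars as a multimethod that falls back to the real
;; protocol implementation for host-compiled records:
(defmulti -find-vars type)
(defmethod -find-vars :default [x] (dp/-find-vars x))

;; Expose the multimethod in the sci context so that protocol calls
;; made from script-side code dispatch through it:
(def ctx
  (sci/init {:namespaces {'datalevin.parser
                          {'-find-vars (sci/copy-var -find-vars parser-ns)}}}))
```

The linked babashka.impl.protocols file shows the full pattern for Datafiable, including how the protocol name itself is exposed.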
```
$ bb -e "(require '[clojure.core.protocols :as p] '[clojure.datafy :as d]) (defrecord Foo [] p/Datafiable (datafy [_] :foo)) (d/datafy (->Foo))"
:foo
```
It magically works, but it works around the bytecode restriction by using multimethods. This is a little bit undocumented right now and the implementation might change in the future
So why do users need to use (deftrecord ...)
in their scripts/REPL?
not users, but the parser does that
at REPL-time?
right, the query has to be parsed
and then it defines these record types during parsing?
that’s an interesting question, maybe they should not
because those record type should be compiled
but somehow they didn’t
yeah, it seems a little odd. a parser should just take some data in and produce some other data probably
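i.e. instead of defining record types at parse time, a parser could return plain tagged data. A purely illustrative sketch (hypothetical function and shapes, not datalevin’s actual parser):

```clojure
;; Record-free parse result: tagged maps instead of defrecord
;; instances, so nothing needs to be compiled at query-parse time.
(defn parse-find-element [x]
  (cond
    (symbol? x)
    {:type :variable :symbol x}

    (and (seq? x) (= 'pull (first x)))
    {:type :pull :args (rest x)}

    :else
    {:type :constant :value x}))

(parse-find-element '?e)
;; => {:type :variable, :symbol ?e}
(parse-find-element '(pull ?e [:name]))
;; => {:type :pull, :args (?e [:name])}
```

Plain data like this also evaluates fine inside sci, since no protocols or bytecode generation are involved.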
are you aware of https://github.com/lambdaforge/datalog-parser?
it’s the same as the datascript one; they just extracted that out, i believe
ok
ok, let me figure it out, it looks like a compilation problem
ok, that would simplify stuff a lot
i will try the macro fix, thanks for pointing me to that
I'm off to bed now, will read again tomorrow. Exciting stuff, looking forward to using datalevin from babashka scripts :)
thanks, have a good sleep!