does anyone know whether backquote and ~@ ever produce lazy sequences? or do they always produce eager sequences?
user=> (first `(~@(range)))
0
interesting, good example.
so at least ~@ respects the laziness of what it is splicing in.
yep
if you used [~@(range)]
it would not terminate
you can add a quote in front of syntax-quote to see how it works btw
user=> '`(~@(range))
(clojure.core/seq (clojure.core/concat (range)))
user=> '`[~@(range)]
(clojure.core/apply clojure.core/vector (clojure.core/seq (clojure.core/concat (range))))
@bronsa It seems tools.reader wraps the thing in another sequence call:
user=> (tr/read-string "'`(~@(range))")
(quote (clojure.core/sequence (clojure.core/seq (clojure.core/concat (range)))))
why is this?
because clojure is broken and tools.reader isn't
user=> `(~@())
nil
tr will return ()
for that
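by analogy with the expansions above, you can evaluate each reader's version of the empty splice by hand to see where the nil vs () difference comes from:
user=> (clojure.core/seq (clojure.core/concat ()))
nil
user=> (clojure.core/sequence (clojure.core/seq (clojure.core/concat ())))
()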
nice. fwiw babashka has adopted this from t.r.
there must be a ticket somewhere in jira
Has anyone already discussed whether binding should be allowed to return a lazy sequence? To me this sounds dangerous. I'm going to refactor my own code with a macro around binding, called something like my-binding, which un-lazifies its return value. Otherwise the sequence returned, when realised, will have code evaluated in a dynamic extent AFTER the extent of the dynamic binding has ended.
that's not perfect because the return value of binding might be a Record which has a field which is lazy, and that lazy sequence will escape.
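A minimal sketch of the danger being described (the dynamic var and the values here are made up for illustration):
(def ^:dynamic *factor* 2)

(def escaped
  (binding [*factor* 10]
    (map #(* *factor* %) [1 2 3])))   ; nothing is realised inside the binding

(first escaped) ;=> 2, because realisation happens after the binding's extent has ended

(def forced
  (binding [*factor* 10]
    (doall (map #(* *factor* %) [1 2 3]))))

(first forced) ;=> 10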
This is a semi-common source of surprise. When someone hits it, they either remove the use of the dynamic variable (and thus binding), or make sure to realize, within the run-time scope of the binding, every lazy sequence created there.
I doubt there is any plan from Clojure's core team to attempt to catch this situation for you.
I think it can't be done correctly.
you can't really prevent binding from returning a lazy-seq
an application can do it for itself, in that the application knows which scenarios are possible.
It would be possible to prevent binding from returning a lazy sequence, but it would not be possible to prevent it from returning an object which somewhere within its structure contains a lazy sequence.
no, not even the first case would be possible in the general sense
i.e. the macro could simply wrap doall around its body.
but that would only catch it if the return value is explicitly a lazy sequence.
@bronsa, can you explain what you mean? which case am I overlooking?
right ok, you could force the toplevel lazy seq, not prevent it
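Something like this hypothetical macro would force a top-level lazy seq before the dynamic extent ends (a sketch only; the name and the seq? check are illustrative, and nested lazy values still escape):
(defmacro eager-binding
  [bindings & body]
  `(binding ~bindings
     (let [result# (do ~@body)]
       (if (seq? result#)
         (doall result#)   ; force a top-level lazy seq while the bindings are in effect
         result#))))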
@jimka.issy we're just talking past each other, I thought by "not allow" you meant "disallow"
ahhh.. ok, but anyway, you can't prevent a sequence of sequences of sequences of sequences of Records which have a field which is a lazy sequence.
going down that route you could walk the return value forcing all lazy seqs
but not a good idea :)
Having a compile-time check that caused the Clojure compiler to reject such code is the situation that I think bronsa thought Jim meant. That is computationally undecidable.
right, not a good idea. except perhaps on an application by application basis.
@andy.fingerhut there is a similar problem with lazy sequences and exceptions, in particular reduce/reduced. I ran into this in Scala some months ago: if the escape-from-reduce instruction is evaluated lazily, long after the lazy sequence has been returned from its generating function, the exception will be thrown outside the try/catch. Same principle: try behaves like a dynamic variable binding.
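The Clojure analogue is easy to reproduce (a sketch; the exception type and the division are just for illustration):
(defn safe-ratios []
  (try
    (map #(/ 1 %) [1 0 2])          ; lazy, nothing is divided yet
    (catch ArithmeticException _
      :caught)))

(first (safe-ratios))
;; throws ArithmeticException here, outside the try/catch,
;; because the division only happens when the seq is realised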
Sure, the more general rule is "mixing constructs that rely on dynamic scope and lazy sequences is dangerous", where both binding and try are examples of constructs that rely on dynamic scope
Clojure does not have a ton of constructs that introduce dynamic scope, but there are a few.
what happens when I walk through a lazy sequence? does that guarantee to unlazify it? Or do I really need to copy it? I'm trying to think about how to write recursive-doall
no need to copy
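A minimal sketch of such a recursive-doall, leaning on clojure.walk (which forces each nested seq as it traverses; it won't reach lazy values hidden inside arbitrary Java objects):
(require '[clojure.walk :as walk])

(defn recursive-doall
  [x]
  ;; postwalk visits every nested collection, realising lazy seqs as it goes
  (walk/postwalk identity x)
  x)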
it is a bit unsettling that in a functional language looking at data changes it. But I understand the tradeoff of speed for correctness.
it's not really about speed vs correctness
oh?
it's the semantics of a lazy collection in an eagerly evaluated language (+ constructs that unwind stack)
explain more.. it seems to me that the lazy collection object changes destructively when I look at it. I.e. a function which looks at it will get a different result the second time it looks.
no that's not the case, a lazy collection doesn't change its value, it just caches computation
isn't it the case that rest called on a lazy sequence will return an object of a different type before and after second is called?
the JVM objects change in place, that is true, but saying you get "a different result" raises the question of what you mean by "different". The result should be = according to clojure.core/=, even though JVM objects are being mutated.
@jimka.issy in some cases, yes, but that's because you shouldn't care about the type of lazy collections, clojure makes no promise at all about the concrete type of a value
that's an implementation detail
yes, internally there is mutability
but the interface is referentially transparent
yes but if you're writing a program which manipulates types, then you do care about types.
As I mentioned before, Clojure sequences are not linked lists of cons cells (except when you explicitly create them that way, but most functions don't). Using identical? and expecting it to return true when a linked list of cons cells would is likely to bite you when you least expect it.
if you're writing a program which manipulates concrete types of clojure collections, you're usually doing something you shouldn't be doing
clojure doesn't tell you which type it will return from lazy-seq, that's by design
even map literals may have different concrete types depending on how many values they hold
but for sequences, the only thing you should care about is that they're sequences, not what internal type of sequences they are
I'm not convinced about the part of "shouldn't be doing". I'm still of the mind that it is something to be understood rather than to be avoided.
@bronsa, advice about good programming style is much appreciated. However, the problem I'm working is not really about choosing the best way to represent sequences, but rather static program analysis. Given a program, from a client/user, examine it and draw conclusions about its correctness. If I have concluded that a certain object is both an A and also a B, I need to ask the question whether the set of A's and the set of B's have a non-void intersection. If not, this means there is either an error in the program, or there is unreachable code. If I can prove that the intersection of A and B is void, ie. A and B are guaranteed to be disjoint, then I can draw an interesting conclusion about the code, or I can optimize the code in an interesting way. Granted, the most interesting theoretical problems are the corner cases of corner cases, which may never really occur in practice. Nevertheless, they do occur often in randomly generated tests. And passing the randomly generated tests improves the code, and gives more confidence of its correctness.
it could, and has, changed between clojure versions
e.g. atm (range) returns an Iterate
it used to return a LazySeq
but you shouldn't care about it.
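for example, on a recent Clojure version (the concrete classes shown here are exactly the kind of detail that can change):
user=> (type (range))
clojure.lang.Iterate
user=> (type (range 10))
clojure.lang.LongRange
user=> (type (map inc (range)))
clojure.lang.LazySeq
user=> (seq? (range))
true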
It seems like it might be reasonable to want to type things as "some kind of sequence" or "some kind of map". Getting more fine-grained than that will cause frustration, I believe.
Program to the abstraction, not the concretion(s)
sure it can be useful to understand the internal details
but that's all they are, implementation details
if you depend on implementation details in any language, that'll bite you back at some point
of course there are specific instances where you do want to do that
but those are very few and special instances that should be considered just that, special
but these internal details can make for really different semantics of the program. And also you can depend on the details without knowing it. How can you figure out whether a program is depending on details of implementation? I think that's probably an unsolvable problem.
but these internal details can make for really different semantics of the program
I'd say that's only the case if you depend on behavior that's not promised by the interface
How can you figure out whether a program is depending on details of implementation
that's the job of the documentation
a silly example: laziness in clojure is best effort, not guaranteed to be 1 item at a time
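for instance, with chunked seqs, realising one element actually realises a whole chunk (typically 32 elements):
(def xs (map #(do (println "realising" %) %) (range 100)))

(first xs)
;; prints "realising 0" through "realising 31" -- one chunk -- and returns 0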
I think it is fair to say that Clojure's official documentation is terse enough in some cases that some people "fill in the blanks" differently than the author intended.
fair
But I also believe that would sometimes happen even if the documentation were 10x longer.
It is difficult to determine whether someone's mental model of how something actually works, is a good enough fit for how it works.
@bronsa, not sure if you disagree or don't understand my claim. Given a program, I think there's no way to tell whether the program depends on some undocumented feature of the language or upon some implementation detail. My gut feeling is this is related to the halting problem. Even if the user reads all the documentation, there's no way to know whether the program misapplied the rules and still just accidentally works.
Given a program, I think there's no way to tell whether the program depends on some undocumented feature of the language or upon some implementation detail
programmatically, yeah sure
but that's true for (almost) every language
agreed. functional languages, or even a functional style in an imperative language, make some problems less likely. For example the use of immutable data structures eliminates some problems. But in clojure lazy sequences are mutable. They're not mutable in the sense of =, but they are mutable through other lenses.
I work on a functional, typed language that can formally verify itself, and even there you can still express "correct" programs that depend on implementation details
To me, it is something that I just need to understand, and avoid depending on accidental features.
saying "in clojure lazy sequences are mutable" is wrong
they use mutability in their impl
they're not mutable
i.e. there is no API that allows you to mutate a lazy sequence in place
it's an important distinction
doesn't second mutate a lazy sequence?
even in haskell some purely functional collections can be implemented with mutability
no, I see what you mean but no
second realizes the lazy seq
internally that does mutation to cache the return value
but that doesn't make a lazy-seq mutable
there's an important distinction between the interface of a collection and its implementation
as long as the interface is immutable, the collection is effectively immutable
otherwise nothing is really immutable
everything mutates some segment of memory in the end
A Haskell implementation (even if you remove the unsafe parts) is doing mutation all over the place in its implementation, by necessity.
Well, at least by necessity of all practical computers implemented today.
So I don't know Haskell, but: in Haskell does the type of an object change when you examine it? or does its type stay the same over its lifetime?
AFAIK, the type stays the same during its lifetime. In the JVM, the type of an object also remains the same over its lifetime. Are you thinking that lazy sequences in Clojure somehow cause the concrete type of an object to change during its lifetime? If so, what scenario is that?
There are many situations in Clojure where, unless you know implementation details, you cannot easily predict across different versions of Clojure what concrete JVM type of object will be returned from a function like range, for example. But for a single version of Clojure, knowing the implementation details, you can predict the concrete types. Clojure doesn't promise concrete JVM types, though. It promises "some object that will implement this interface" (at least in most cases).
clojure doesn't "change the type of an object when you examine it" either
I guess I don't know enough about the situation to come up with an example.
I was under the impression that the following calls to type would return different values the first and second time. But @bronsa is absolutely correct that type returns the same thing both times.
(let [obj (map (constantly 12) (concat '(1) '(2 3 4)))]
  [(type (rest obj))
   (second obj)
   (type (rest obj))])
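On a recent Clojure version this should evaluate to something like the following (the concrete class being, as discussed, an implementation detail):
;=> [clojure.lang.LazySeq 12 clojure.lang.LazySeq]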
it's not theoretically impossible for it not to
but if it did, it would be an impl detail :)
you shouldn't even know what concrete type that returns
there are so many concrete seq types in clojure, the important thing is that they are all ISeq
(i.e. (seq? x) is true for all of them)
My head hurts trying to understand "it's not theoretically impossible for it not to" 🙂
I can write a deftype that is a seq where (type (rest x)) returns different values before/after (second obj)
I can't think of an existing clojure sequence that does that
but it's possible to make one
and it would still be a correct impl
That my head can handle more easily 🙂. But even if you made a situation where (type (rest x)) returned a different type before and after a call to second, it would not be because JVM objects were changing type under the hood, but because (rest x) was returning a reference to a different JVM object on the two different calls. (that is an implementation detail, yes)
yep
If any other Clojure beginners have read this far, please know that we are definitely straying out of materials most beginners would ever need/want to know.
Jim, if you want to think of a type system relevant for Clojure programmers, that tries to avoid implementation details about particular JVM concrete object types when Clojure doesn't promise them, then something like "The function rest returns an object x such that (seq? x) is true" is far more useful than "The function rest returns an object with class X, or Y, or Z, or W, ..."
yes, but computationally seq? is the same as (fn [x] (instance? clojure.lang.ISeq x))
It does not happen very often, but that list of JVM classes for which (seq? x) returns true often gets larger when someone implements a new Clojure collection type. It is common to create a custom seq type that is expected to be more efficient when creating a sequence of elements of that collection.
agreed on what seq? is equivalent to, and that is an implementation detail that would shock me to my core if it ever changed, so I think you can rely on that.
seq? will always return true for sequences
The main point is that clojure.lang.ISeq is a Java interface, not a Java class, and an open-ended set of Java classes can implement that interface in the future.
(seq? x) = true is what makes x a sequence :)
note that ISeq is not a concrete class, it's an interface/abstraction
which comes round to what I was saying earlier, clojure makes promises on abstractions, not concrete types
and testing for ISeq instead of using seq? is ok, since ISeq is part of the public API, not an impl detail
Hi, how do I go from a seq of values to a seq of partial functions?
(range 0 16 4)
(defn my-fn [n other] ...)
'((partial my-fn 0) (partial my-fn 4)...)
map
(map (fn [x] (partial my-fn x)) (range 0 16 4))
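for example, with a hypothetical my-fn that just adds its two arguments:
(defn my-fn [n other] (+ n other))

(def fns (map (fn [x] (partial my-fn x)) (range 0 16 4)))

((first fns) 1)  ;=> 1
((nth fns 1) 1)  ;=> 5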
Hey everybody, I'm having a little trouble with a custom generator for a spec… see thread. (I figured I would post here since it feels like a beginner question.)
So here is a little toy example that I’m working on for learning purposes.
(require '[clojure.spec.alpha :as s])

;; assumes ::nice-number is defined elsewhere, e.g. (s/def ::nice-number pos-int?)
(s/def ::x ::nice-number)
(s/def ::y ::nice-number)
(s/def ::point (s/keys :req [::x ::y]))

(s/def ::monotonically-increasing-x
  (s/and (s/coll-of ::point
                    :distinct true
                    :min-count 2)
         ;; x values should be monotonically increasing
         #(apply < (map ::x %))))
The problem is generating examples, e.g. with exercise, because of the monotonically-increasing requirement
I can get a set of increasing numbers like this:
;; (assuming gen is an alias for clojure.spec.gen.alpha or clojure.test.check.generators)
(def sorted-x-gen
  (gen/fmap #(vec (sort %))
            (gen/such-that not-empty
                           (gen/set (s/gen ::x)))))
But I'm having trouble actually going from that to an actual coll of monotonically-increasing-x. Any suggestions?
I believe if you apply a different function than vec to the return value of (sort %), e.g. (map point-from-x-value (sort %)), where point-from-x-value is a function you write that takes an x value and creates a value satisfying the ::point spec from it, perhaps by combining it with a randomly generated y value, that might be what you want.
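A sketch of that suggestion (point-from-x-value and the generator name are made up, and drawing ::y inside fmap trades shrinking quality for simplicity):
(defn point-from-x-value
  [x]
  ;; build something satisfying ::point from a given x value
  {::x x ::y (gen/generate (s/gen ::y))})

(def monotonic-points-gen
  (gen/fmap #(mapv point-from-x-value (sort %))
            ;; the spec above wants :min-count 2, so require at least two xs
            (gen/such-that #(<= 2 (count %))
                           (gen/set (s/gen ::x)))))

;; then attach it with s/with-gen, e.g.
;; (s/def ::monotonically-increasing-x
;;   (s/with-gen <the s/and form above> (fn [] monotonic-points-gen)))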
Yeah, I could generate the x and the y at the same time, something like this… and then just fill in some-fn:
(gen/fmap some-fn
(gen/such-that not-empty
(gen/set (s/gen (s/cat :x ::x :y ::y)))))
I could also try gen/map to generate a map directly and then sort that… will tinker around with both options, thanks
I am pretty sure that this kind of use is exactly what fmap was intended for -- calculate some data with whatever restrictions you find useful, and then transform it into something else that satisfies your spec.
Hey team, noob ring question:
(compojure.core/routes #'routes/app ...)
https://github.com/PrecursorApp/precursor/blob/master/src/pc/server.clj#L87-L90
I see that in order to make hot reloading work, we pass vars in as the handlers.
In my old project, I didn't quite do it that way. Instead, I made sure that I was passing the handler fns by name in each route definition:
https://github.com/stopachka/jt/blob/master/server/jt/core.clj#L644
(POST "/emails" [] emails-handler) ; emails-handler i guess gets captured as a var
Is my intuition correct then, that had I written:
(compojure.core/routes #'routes/app ...)
Then I wouldn't need to worry about hot reloading per route?
So I could write:
(POST "/emails" [] (fn [req] ...) ; emails-handler as anon fn
If there's a good writeup that goes into this var business + ring, I would love to read it!
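For reference, the var-vs-fn distinction at the plain Ring level looks roughly like this (a sketch; the handler name and port are made up):
(require '[ring.adapter.jetty :refer [run-jetty]])

(defn handler [req] {:status 200 :body "v1"})

;; Passing the fn value captures the function object as it exists right now;
;; re-evaluating (defn handler ...) later is not seen by the running server:
;; (run-jetty handler {:port 3000 :join? false})

;; Passing the var means every request goes through the var, so a redefined
;; handler takes effect without a restart:
(run-jetty #'handler {:port 3000 :join? false})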