A very simple but long-running loop-based function manages to make all 8 CPU cores busy.
How come? Is it GC or something else? I don't do anything there except churn numbers and change a single transient vector.
@p-himik check for reflection warnings. Sometimes, reflection can do bizarre things...
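A minimal sketch of what checking for reflection looks like (function names here are made up for illustration):

```clojure
;; enable reflection warnings at the top of a namespace (or in the REPL)
(set! *warn-on-reflection* true)

;; without a type hint, this compiles with a reflection warning and
;; resolves .length via reflection on every call:
(defn strlen [s] (.length s))

;; a type hint removes both the warning and the runtime reflection:
(defn strlen-hinted [^String s] (.length s))
```

In a hot loop, the reflective version allocates and burns CPU on method lookup, which can show up as exactly this kind of mysterious multi-core load.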
Thanks. Yeah, maybe it was reflection. I enabled the warnings later on, but only after some drastic modifications that by themselves resulted in nicer memory usage.
Initially, I was using a vector of vectors in a very tight loop that gets and sets nested double values. The first run would finish just fine; the second would crash with OOM. So another candidate is boxing, perhaps.
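A hypothetical reconstruction of the two approaches (the helper names are made up): nested persistent vectors box every double, while a flat `double-array` keeps them primitive.

```clojure
;; nested vectors of doubles: every read/write goes through a boxed
;; java.lang.Double, so a tight loop allocates garbage on each step
(defn bump-nested [grid]
  (update-in grid [0 0] + 1.0))

;; a flat double-array keeps values primitive; the same loop over it
;; allocates nothing and gives the GC nothing to do
(defn bump-array [^doubles a]
  (aset a 0 (+ (aget a 0) 1.0))
  a)
```

If the GC was what lit up all 8 cores, switching the hot path from the first shape to the second is one plausible fix.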
And I'm not exactly sure, but I think the first time I ran it, it used just 2 cores. The second time it started using all 8 cores and took quite a bit more time to complete. Same input, no randomness anywhere.
What the hell. It failed with OutOfMemoryError, but the first time it worked just fine.
@ahmed1hsn Picking up your question here as it's somewhat of a thread stealer. I scanned the slide deck on zio. It seems to be about taking the type system in scala and making it (I don't understand how) more about algebraic types. Which, I infer from the slides, are about making functions which behave according to algebraic properties (associativity, etc.). That's a massive shift. I'm not sure it would still be clojure at the end.
that's a strong indicator that it was gc using the cores
perhaps it is leaking resources - creating data that remains accessible via some scope outside the loop, or creating something that cannot be collected?
as a matter of opinion / design, I personally think the biggest problem with scala is trying to simultaneously work with the JVM (very weak type system, generics as a compile-time fiction only) and use the best features of modern type safety (very strong type system, implicit and inferred types). This is exacerbated by a pattern of breaking code compatibility between compiler versions
the jvm type system, as much as it has one, is very compatible with lisp + interfaces, which is what clojure uses, and much of clojure's elegance and simplicity comes from embracing this
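A small illustration of that fit (the names here are invented for the example): a Clojure protocol compiles down to a plain JVM interface, and a record to a class implementing it, so there's no type machinery the VM can't represent directly.

```clojure
;; a protocol compiles to an ordinary JVM interface
(defprotocol Greet
  (greet [this]))

;; a record compiles to an ordinary final class implementing it
(defrecord Person [name]
  Greet
  (greet [_] (str "hi, " name)))

(greet (->Person "Rich"))
```

Any JVM language can see and implement the generated interface, which is a big part of why interop stays simple.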
simultaneously trying to be compatible for interop with the jvm, and enforcing typing concepts the vm isn't capable of representing directly, is IMHO inevitably going to make things worse not better
I really doubt that - the function is pure and I immediately discard its result.
and it isn't eg. creating new lambdas via eval?
or otherwise creating new classes?
type checking in most cases is (or can be) a compile-time feature (outside of optimisations & co. coming from type annotations); you don't really need a cooperating vm
but that's where scala gets extremely messy
but sure, it's simple to do less and stay close to the vm if/when possible
sure, it could be done right, I don't see real world evidence of it happening
typed racket & gleam come to mind
hell, even rust
equivalent problem: make a type safe language with inference, that allows inline machine specific assembly
and yeah, rust might count for my example :D
It's hard, sure. And difficult to do it right
this reminds me, I do want to learn rust
though I might just learn zig instead
zig is an odd thing. I'd prefer rust, but personally I have no use case to learn either
zig is what people wish (and often delusionally pretend) C would be
I'd only use rust to work with embedded devices & co
it's simple in the way a lisp is (maximal utility from minimal features), close to "the metal"
I know, I just find it odd for a new lang in 2020 🙂
right, I'd use rust or zig for DSP, where using an allocator could mean your RT constraints fail
@mpenet that's everyone else's fault for not making that happen sooner IMHO
I'd prefer to work in rust in these cases, but that's just me
but I really don't want to learn rust again
and likely won't
(used it quite a bit in the 0.3.x days)
I like simple things: I find fennel-lang, for instance, quite nifty. It has no bells & whistles, but it's very pragmatic and fits the bill in some of the use cases I encountered when clojure was not an option.
I attended the first fennel conf (it was four of us at a bar for a few hours), and hosted the second one (about 10 of us in a rented conference room)
then, sure it's lua with (lisp) lipstick, so aging community & not as many options as the jvm, but it's very usable nonetheless
oh cool
I implemented the first version of quote / quasiquote / unquote also, but I don't think my grubby hand prints are on that code any more
it self hosts now!
yes
since 0.5.x
no more gnarly lua to deal with (almost)
sorry, I'm really not used to other people knowing anything about fennel and its features, I guess I'll have to get used to it :D
🙂
I'm slowly migrating my awesome wm config to fennel
you might be able to automate that; if I recall, technomancy used a script to port part of Fennel's Lua code with it
and it looks like fennel will get checked in as a part of the neovim repo soon (to be used in the tree-sitter impl)
right, antifennel
yes
antifennel takes lua and makes equivalent fennel code, I should be able to migrate all my lua code, and just leave a fennel bootstrap stub in its place
seems doable
yup
I did the same a long time ago
but I am no longer using awesome
I did it manually tho
did you find a better WM?
emacs 🙂 I basically just use a browser and emacs and I rely on boring gnome, I stay full screen with one or the other all the time. My workflow with awesome was quite simple/similar.
terminals in emacs+libvterm
aha, yeah, awesome + tabs in neovim is my version, but on multiple screens when possible
also fennel: the ability to compile with bundled lua interpreter resulting in very small files is quite cool
yeah, I had some wacky ideas of how to do that, but luckily Phil found a simpler and more reliable way
there's been talk about bundling byte code instead of source for the fennel code bundled as well, making it slightly more brittle or harder to debug(?) but smaller
it's basically a free feature, since it's what lua was designed for
I've long threatened to bundle fennel into a general purpose kafka tool, maybe I'll find time for it during my current sabbatical
it's super easy to ship a single binary with everything including the ability to make it repl'able
quite cool, really
and there's even a lib for nrepl :D
yeah
(that is, to make it an nrepl server, usable from those few client impls that don't take clojure on the server for granted...)
as I said, no need to mess with rust, I'd be busy enough with other toys
🙂
I'd go all in on fennel and skip my recent arm64 exploration and planned zig exploration, except I strongly suspect that I want to do things in DSP that lua can't quite hack directly
when you are that close to the metal there are not many options sure
learning 64 bit arm assembly is extremely humbling, I thought I was so much more clever than I am
though I think I've fully internalized 2's complement and little endian as formats, which I guess is fun trivia though it's sad I even need to care
yeah, that sounds more involved than what I could afford with the little free time I have these days.
I had a somewhat embarrassing accident with hallucinogens (never again, I swear), which left me hospitalized for a week with no internet, no hardcover books, no privacy. Working out some basic architectural stuff like 2's complement encoding on paper with a pen (no pencils allowed) was actually a nice way to pass the time
better would have been not to land myself there in the first place of course
apologies if that's TMI
kids+covid here
yeah, that's a lot of work
I mean, we didn't get covid, just small kids@home
🙂
what we need is a "clojure-junior" that you can teach kids so they can be your offshore for tedious parts of your projects
if their work passes the unit tests, just drop it in
a bit too early for them, stuff like logo/scratch in time
I showed scratch to my girlfriend and her children, but it's definitely a "lead a horse to water" type situation
logo is just a lisp subset that is simple enough to not need parens any more
If you want to do statically typed functional programming on the JVM, then Scala is a great option.
I cross my fingers that Eta becomes usable (with full featured interop) - the way scala works with the vm is kind of messy
scala pretty much supports any style of programming 🙂
which is part of the problem (kidding, kind of)
Scala is trying to move away from Haskell's abstractions, e.g. Cats, ScalaZ. With ZIO and zio-prelude they are trying to do FP in a way more native to Scala idioms.
you can use your favorite idioms in your code, but libraries will force you to use the lib authors favorite idioms
^
eg. a friend was doing a data graph based project, they knew about the pitfalls of implicits and avoided them, but they were cornered into implicits and all the problems they bring because they needed a graph lib that used them everywhere
as a random example (said friend refuses to use scala any more)
Statically typed language guys argue that static typing helps with refactoring large code bases. How true is that, and how does it compare with Clojure, for example?
@ahmed1hsn I'm obviously biased but I find clj-kondo and rg/grep help me refactor quite well
That is an advantage with static languages. The Clojure solution tends to be using good system boundaries with well defined interfaces, but it's definitely an art not a science. I do miss how easy refactoring was in OCaml for example.
Automated refactorings can also yield garbage code, because the amount of thinking is reduced
@borkdude but a good type system can guide a dev through a manual refactor - in OCaml, if I change a module interface, just running :make in nvim takes me to the needed edits until I don't get errors; then I know I'm done
there's no real way to have that in Clojure (though static tools and unit tests help a lot)
that's true. there was a nice tweet about this from Stuart H.: https://twitter.com/stuarthalloway/status/1234261008560115712
:D
I often enjoy the glib version: "tooling is a language smell"
that is, the kind of tools users of $LANG consider indispensable is a peek into the things the language itself is poorly suited to handle
funny no-one mentioned spec so far ;)
I've seen bad refactors that got further than they should have because specs were defined and naively trusted
oops, nothing actually checked or enforced said specs, so they were about as useful as comments declaring return / arg types
in a static lang, the type is a property of the code, in clojure a spec is not a property of the code or the data alone, it's a checkable assertion about the data seen by specific code in a specific run time context
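A minimal sketch of that point, using clojure.spec (the `::age` / `birthday` names are invented for the example): registering a spec enforces nothing by itself; checks only happen when some code asks, at runtime.

```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.spec.test.alpha :as stest])

;; s/def registers the spec globally, but registering checks nothing
(s/def ::age pos-int?)

;; checks happen only on demand, at runtime:
(s/valid? ::age 42)  ; true
(s/valid? ::age -1)  ; false

;; same for fdefs: without instrumentation they are documentation only
(defn birthday [age] (inc age))
(s/fdef birthday :args (s/cat :age ::age))

(birthday -1)                ; no complaint -- nothing is checking
(stest/instrument `birthday) ; now bad :args throw at the call boundary
```

This is exactly the "checkable assertion in a specific runtime context" behaviour described above, and also the failure mode from the earlier message: a defined-but-uninstrumented spec checks nothing.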
@noisesmith refactoring workflow in Scala is similar to OCaml
How is the situation of core.typed or TypedClojure?
Btw #lsp also has some nice refactorings nowadays
@ahmed1hsn I think core.typed is still pretty much a research topic?
@ahmed1hsn I don't know those tools / variants but on the topic of architecture, even with a strongly typed language the types can't extend past a single VM without making a big brittle mess. Your individual app might be strongly typed but your microservice can't be. Eventually, at some level, once you have two computers running, you have the same typing guarantees as Clojure (that is, your data can be checked at runtime for validity but nothing can be known about it statically).
The attempts I've seen to act like data between vms / processes is typed only make the problems worse
(the OCaml approach is cute: you can share typed data between processes, but it's an instant failure / abort if the two processes aren't running literally the same compiled binary)
is that kinda like serialized objects in Java but worse?
better :D no side effecting constructors, and it bails out if a trivial assertion isn't successful
it allows what java gets by having multiple threads in one process, but without shared memory, which is kind of cool actually
and it's network transparent
but it's still "cute" rather than "awesome" - it's dealing with a fundamental problem that nothing handles perfectly (though I suspect erlang has found a better local optimum than most)
point being, once you leave the realm of "everything in one process", you get all of Clojure's problems anyway
erlang gets away with it via good pattern matching, records, no nil, named tuples and semi-decent static analysis. All in all it's a good combo
message passing is first class too, so it's optimized for that, but that's not free
plus a good infrastructure for IPC with retries / monitoring
I haven't yet used it in anger though, I do intend to fix that
generally working with erlang codebases is less stressful than clojure codebases imho
when it comes to refactoring I mean.
@borkdude CircleCI used core.typed but they moved to Prismatic Schema: https://www.mail-archive.com/clojure@googlegroups.com/msg73423.html https://circleci.com/blog/why-we're-supporting-typed-clojure/ https://circleci.com/blog/why-were-no-longer-using-core-typed/
Beyond that I haven't seen core.typed in production use.
@ahmed1hsn Yes. I still think Schema is convenient, but it's also still runtime-only, like spec. Btw, I think clj-kondo could pick up on Schema's defn, but since there are now at least 3 or 4 libs (schema, spec, malli) doing their own thing, I don't know if this has any priority.
(some work I'm doing related to spec: https://gist.github.com/borkdude/c0987707ed0a1de0ab3a4c27c3affb03#gistcomment-3444982)
It would be great if the clojure community settled on one thing eventually
right now it's a bit of a scattered ecosystem
Honestly, Schema was pretty great and simple (in the sense of easy :P)
#malli wip with clj-kondo
Some perceived flaw of schema was that their schemas were closed by default. But it's pretty easy to make them open: {:foo s/Str s/Any s/Any}
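A small sketch of the closed-vs-open distinction, assuming plumatic/schema is on the classpath (the schema names are invented for the example):

```clojure
(require '[schema.core :as s])

;; closed by default: an unexpected key is a validation error
(def Closed {:foo s/Str})

;; opened with a wildcard entry, as described above
(def Open {:foo s/Str s/Any s/Any})

(s/check Closed {:foo "x" :bar 1}) ; returns an error naming :bar
(s/check Open   {:foo "x" :bar 1}) ; returns nil (valid)
```

`s/check` returns nil on success and an error datum otherwise, which makes the difference easy to see at the REPL.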
Also, you can use closed schemas as security measures, like "don't allow a non-admin user to change the status of a user in this REST path". Rails did decide that everything was open by default, then changed the approach (breaking lots of code in the process) because of lots of bugs and security issues on production code - even github suffered from that flaw, I believe
the approach in spec 2 is that "closing" is a property of the check, not of the spec
something you can enable during validation, conform, etc if needed
yeah, in practice I added s/Any s/Any to every hash-map in schema, and 98% of my schemas were for hash-maps
tangent: I wonder if an entropy detecting tool would be useful for catching mismatched components. My idea being if you have a "state" or "component" map tree with the same data repeated many times at many levels of branching, that could indicate an app that's growing faster than it's being designed
the open-ness part could be another namespaced key, that would get validated even if not specified in the s/keys
so s/Any doesn't work the same way (lack of global registry is the real issue)
different approaches
Does this hold true when using something like Avro? I’ve never used it, but it appears to be “types on the wire”.
you can just programmatically merge schemas though which makes it less of a problem
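Since Schema map schemas are plain Clojure maps, "programmatically merging" them really is just `merge` (again assuming plumatic/schema; the schema names are invented):

```clojure
(require '[schema.core :as s])

;; compose schemas like any other maps
(def Base     {:id s/Int})
(def Extended (merge Base {:name s/Str s/Any s/Any}))

(s/check Extended {:id 1 :name "a"}) ; nil => valid
```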
yes, different approaches. one assumes an attribute is always the same thing, the other not
malli has much of that flexibility too and possibly more, maybe it will be schema.next in terms of community impact
I am not a fan of having both approaches for something that should aim to become a standard personally. I tend to prefer Spec style, but that's just me
but you can't use the same types in your application code
what did you mean with "both approaches"?
and IMHO trying to combine the avro type with your program types makes things worse not better (based on the attempts I've seen)
we're circling back to Scala's discussion 🙂 imho enabling many different styles to deal with something that should be a shared by the community at large is not great
yes, I intended to ask which styles you meant
global registry vs first class specs
I think malli also has a global registry, local is optional
yeah that's what I mean
I don't follow. This is the same as spec? You don't have to use a local registry, it's an edge case that is supported optionally
the value of a (namespaced) key in a map will always validate the same way
in spec
that's not true with Schema/malli
it can be, but not necessarily; depends on the style used
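A sketch of the contrast being discussed (spec names invented for the example):

```clojure
(require '[clojure.spec.alpha :as s])

;; in spec, a qualified key has exactly one meaning, registry-wide:
(s/def :user/name string?)
(s/def ::person  (s/keys :req [:user/name]))
(s/def ::account (s/keys :req [:user/name]))
;; both specs validate :user/name with the same predicate, always

;; in Schema (or malli's map syntax) the key's schema is local to each
;; map, so {:user/name s/Str} here and {:user/name s/Int} elsewhere
;; can coexist -- the attribute has no single global identity
```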
ah I see
So @ikitommi, malli doesn't have something like s/def?
> Example to use a registered qualified keyword in your map. If you don't provide a schema to this key, it will look in the registry.
I wonder how you register a qualified keyword associated with a schema globally
I think it does since it has registries
I guess you can define a single registry that you'd use everywhere (if you follow that rule)
in that case, just use the global registry right
it works only if you control all the code you work with.
lib author A can define its own, or not
malli is immutable by default but supports global mutable registries. Just not by default. But, it's pre-alpha, feedback welcome.
no fdefs yet, but will be, soon.
afaik you cannot enforce it. You have to know where to get the info from the registry of the lib, merge it with yours potentially and then do the same for every lib that has specs registered (if any)
that's where imho having too many options (or styles) hurts
but maybe I missed the latest developments
currently, the lib needs to have its own registry (map) and you can compose it with your app registry. Could poll if people want a global mutable thing.
one could ask the question if this is a common thing: validating maps from other libs
Ah yea. It appears similar to the keyword conversion problem I find at the borders of my clj systems.
you can always force an immutable (and serializable) registry inside your app, for the parts where that matters
take that in the context of a Haskell-like type system: is it useful to have the guarantee that a Thing is always assumed to be the same?
imo yes, if you draw the parallel between an attribute (ns'ed key) and a Type
then outside of ns'ed keys it's the wild west
@mpenet it's probably the other libs responsibility to create those maps, which can from then on be assumed to be true Things?
Not necessarily
Unless you like to create constructors for all your data 🙂 and guarantee that any producer will use them
But maybe it's quite personal; some people like it one way or the other, I imagine. Right now I tend to prefer attributes to have a strong identity (like in datomic or graphql), but I guess that might not be the case for everybody; clojure enables many ways to do the same thing.
If I want to expose a clojure map/data oriented api as JSON, but I use qualified keywords, would the "default" approach just be to stringify the keys, skipping the colon, i.e. :my.qualified/key to "my.qualified/key"? Just want to make sure I'm not missing some json convention or issues or something
I'd agree that outside my app, namespaces aren't very useful, and inside my app I want them. If you don't have control of how data gets into your app, and it doesn't happen in a small number of easy-to-intervene places, namespacing keys is only one of many problems you are going to have
I'd use well placed ingestion / dispersion middleware, not a special "encoding" of the namespace into the key
another thing to consider: {:foo/bar 1 :foo/baz 2 :quux/OK 3} -> {foo: {bar: 1, baz: 2}, quux: {OK: 3}}
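The transform being considered can be sketched in a few lines (the helper name is hypothetical):

```clojure
;; group map entries by keyword namespace, producing a nested structure
;; suitable for JSON serialization
(defn nest-by-ns [m]
  (reduce-kv (fn [acc k v]
               (assoc-in acc [(namespace k) (name k)] v))
             {} m))

(nest-by-ns {:foo/bar 1 :foo/baz 2 :quux/OK 3})
;; => {"foo" {"bar" 1, "baz" 2}, "quux" {"OK" 3}}
```

Note it's lossy in the other direction unless every key is qualified, which is one reason to be wary of treating namespaces as structure.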
For some reason, that doesn't sit well with me. I think maybe that's the only point in Val's article that I agree with: don't treat qualified keywords as something structural. I think it's because, typically, at the edges you're not going to be mapping to/from a nested (unqualified) structure.
I prefer to explicitly map qualified names <-> unqualified. We usually do not control external resources. I made this lib: https://github.com/souenzzo/eql-as
This is a long-discussed topic, and as far as I know there are no conventions. • https://andersmurphy.com/2019/05/04/clojure-case-conversion-and-boundaries.html • https://vvvvalvalval.github.io/posts/clojure-key-namespacing-convention-considered-harmful.html • https://juxt.pro/blog/idiomatic-integration
@jjttjj Both Cheshire and clojure.data.json allow you to specify how keys are turned into JSON keys (strings), so you can choose to drop the qualifier or keep it.
(their defaults are opposite, BTW)
Trying to find the forum post where this is discussed at length...
Here is one discussing the “considered harmful” post above: https://clojureverse.org/t/clojures-keyword-namespacing-convention-considered-harmful/6169/6 I don’t think that’s the one I had in mind tho :thinking_face:
user=> (require '[cheshire.core :as ches] '[clojure.data.json :as dj])
nil
user=> (ches/generate-string {:a/b 1 :c/d 2})
"{\"a/b\":1,\"c/d\":2}"
user=> (ches/generate-string {:a/b 1 :c/d 2} {:key-fn name})
"{\"b\":1,\"d\":2}"
user=> (dj/write-str {:a/b 1 :c/d 2})
"{\"b\":1,\"d\":2}"
user=> (dj/write-str {:a/b 1 :c/d 2} :key-fn (comp str symbol))
"{\"a\\/b\":1,\"c\\/d\":2}"
user=>
@jjttjj /cc @cjsauer c.d.j escapes / by default, but that can be turned off.
Ah here it is. I think it was the precursor to Valentin’s blog post actually: https://clojureverse.org/t/should-we-really-use-clojures-syntax-for-namespaced-keys/1516
good to know! wouldn't have guessed that with the escaped slashes. Haven't settled on a json lib yet
Those are both essentially the same article, both by the same author, two years apart.
(and I still think he's mostly wrong 🙂 )
The discussion itself is what I wanted to link to, because there’s quite a bit of disagreement in this area.
I remember seeing it and not agreeing, and now a month later I need some JSON compatibility and now see it in a whole new light haha