clojure

New to Clojure? Try the #beginners channel. Official docs: https://clojure.org/ Searchable message archives: https://clojurians-log.clojureverse.org/
2021-05-19T01:15:27.040400Z

any thoughts on http://flur.ee, anyone? "Open source semantic graph database that guarantees data integrity, facilitates secure data sharing, and powers connected data insights." "Fluree is more than a database and more than a blockchain. It is a data management platform that merges the analytic power of a modern graph..." it's written in Clojure

1🍬4🤯
borkdude 2021-05-19T07:36:06.041300Z

Never heard of it, but it looks interesting

Aron 2021-05-19T07:56:05.041600Z

don't want to be that guy again but it doesn't appeal to me • the look is a very generic template/style • the page is mostly filled with buzzwords • there are statements all over the place that raise serious questions about security, scalability, and maintenance costs that are not answered immediately • a JSON-based query language? why do they have their own if they support others? Not saying there is anything bad there, but I am curious about the reasons • seems like a lot of stuff is reimplemented for no obvious reason https://docs.flur.ee/docs/1.0.0/schema/functions

1✔️
Trey Botard 2021-05-19T21:18:36.080600Z

thanks for bringing Fluree up @thegobinath. I'm a dev advocate there, so I can try to answer any questions if needed.

Trey Botard 2021-05-19T21:20:14.081Z

@ashnur we have a JSON-based query language to facilitate easy interop with other languages and with query/transaction calls over HTTP, and yes, our marketing site is somewhat buzzwordy, but we've got some pretty good stuff under the hood.

Aron 2021-05-20T02:21:08.103200Z

Well that's just it, 'easy interop' sounds like something to sell with, not to build upon 😞 Really easy interop is when you don't even need to learn a new DSL, no?

2021-05-20T08:35:03.110300Z

@trevor Interesting! I’m curious about the origin of the db. It’s a technical product, but I can’t find any technical founders, is this correct? I’m looking at this page https://flur.ee/about-us/

1👀
borkdude 2021-05-20T08:35:47.110700Z

The fact that it supports RDF/SPARQL is a big plus for us, as we use this format internally and it's a standard

1👍
Trey Botard 2021-05-20T13:37:56.126400Z

@jeroenvandijk Brian Platz is the technical founder and CEO

1👌
Trey Botard 2021-05-20T13:41:43.126800Z

@ashnur that's why we also support GraphQL, SQL, a subset of SPARQL, and direct calls from Clojure. But if you don't know Clojure and you're familiar with JavaScript or Python, writing JSON is something you are more than likely familiar with, and it gets you some benefits that the other query languages don't support, namely time-based queries. If your app needs those, then using FlureeQL in JSON or Clojure is necessary.

2021-05-19T01:55:07.040900Z

Looks pretty nice from the front page

Aron 2021-05-19T07:59:27.045400Z

I spent over a year deep-diving blockchain tech, and in the end my conclusion was: 1. it's a useful technology in certain situations, e.g. when multiple transport companies want to use a single deposit, putting a blockchain on the system gives you an audit trail, and a new company only needs to set up the tech and can integrate immediately without any further costs; 2. it still requires integration with the law and everything else, like everything else. What blockchain is not good for (for physical and philosophical reasons that I'm very happy to go into if anyone is interested) is implementing general solutions (e.g. a programming language or a database).

Aron 2021-05-20T08:32:02.110Z

• yes, Ethereum is based on this idea, but it's more like a public research project than something that's commercially viable; check the list of the biggest apps: https://www.stateofthedapps.com/rankings/platform/ethereum • In my honest opinion, yes, they hold lots of value, but not necessarily in what they promise. I think most of these projects are scams to bring in investor money and then run with it. The developers building most of these projects are there because the work is interesting and the pay is good. I am speaking from experience: we implemented quite a few POCs for Ethereum and I was involved in a couple of more serious projects as well. Open-ended projects tend to go on longer.

Roman Petrov 2021-05-19T10:58:54.047700Z

Hello! I'm looking for a good Clojure/Java developer in Russia. Do they exist? Please contact me directly for details.

caleb.macdonaldblack 2021-05-19T11:01:04.048600Z

Can I destructure keywords ignoring the namespace? For example, I have a generic function that accepts a name key and handles it the same regardless of its namespace

2021-05-19T11:01:16.048800Z

you might want to ask in https://t.me/clojure_ru

Roman Petrov 2021-05-19T11:01:38.049400Z

thanks!

caleb.macdonaldblack 2021-05-19T11:01:51.049900Z

I could just strip the namespace but maybe there is a feature in destructuring for this

2021-05-19T11:02:51.050Z

no, destructuring needs the namespace for fully qualified keywords

caleb.macdonaldblack 2021-05-19T11:04:26.050200Z

Ah no worries. Thank you

andrewboltachev 2021-05-19T11:44:17.051200Z

Hello. If I need to send EDN data over the wire (from my web app's backend to the frontend), is a function like this the way to go?

andrewboltachev 2021-05-19T11:44:21.051400Z

(require '[cognitect.transit :as transit])
(import [java.io ByteArrayInputStream ByteArrayOutputStream])

(defn to-edn-str [data]
  (let [out (ByteArrayOutputStream. 4096)
        writer (transit/writer out :json)]
    (transit/write writer data)
    (.toString out)))

(to-edn-str [:abc 1 2])

thheller 2021-05-19T11:44:57.052Z

that's transit, not EDN? for EDN just use pr-str
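
For reference, the plain-EDN round trip thheller mentions is just pr-str on the writing side and clojure.edn/read-string on the reading side. A minimal sketch:

(require '[clojure.edn :as edn])

;; write: any printable Clojure value becomes an EDN string
(def s (pr-str [:abc 1 2]))   ;; => "[:abc 1 2]"

;; read: clojure.edn/read-string parses EDN data without evaluating code
(edn/read-string s)           ;; => [:abc 1 2]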

andrewboltachev 2021-05-19T11:45:34.052600Z

yes. Still not getting 100% of the difference (and the reasoning)

thheller 2021-05-19T11:46:23.053400Z

transit is better for sending stuff over the wire, so that is fine. but calling it to-edn-str is rather confusing, since what you get is a transit JSON string

andrewboltachev 2021-05-19T11:46:58.054Z

ah true

andrewboltachev 2021-05-19T11:49:59.055800Z

better in which sense, btw? is it because JSON can be "gzipped" (or something), making it more optimal to send than EDN, which for the browser is mere text/plain?

thheller 2021-05-19T11:51:08.056700Z

no, both are just text strings. transit is just a little faster to parse and a little smaller overall

thheller 2021-05-19T11:51:29.057Z

gzip works for all, no difference there

andrewboltachev 2021-05-19T11:51:55.057600Z

ah, yes, so that's the browser's/server's parsing algorithm

thheller 2021-05-19T11:52:28.058300Z

no, as far as the browser is concerned it's just a string. it has no notion of transit or EDN

andrewboltachev 2021-05-19T11:53:16.059200Z

I mean, when it tackles that transit JSON and later walks the tree (or sth) to turn it into proper CLJS objects

andrewboltachev 2021-05-19T11:53:26.059700Z

as opposed to parsing the EDN string

thheller 2021-05-19T11:53:43.060100Z

"it" doesn't do that. YOUR code does that. either via the transit reader or the EDN reader.

andrewboltachev 2021-05-19T11:54:07.060600Z

ok. agree. thanks
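
To make that last point concrete, here is a minimal ClojureScript-side sketch of the two readers; the namespace name is made up, and the string is assumed to arrive as the body of an HTTP response:

(ns app.decode
  (:require [cognitect.transit :as t]
            [cljs.reader :as reader]))

;; transit: your code creates a reader and parses the transit JSON string
(defn read-transit [s]
  (t/read (t/reader :json) s))

;; EDN: your code parses the string produced by pr-str on the server
(defn read-edn [s]
  (reader/read-string s))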

2021-05-19T12:16:40.062300Z

Might be a vague question. I am going to implement a system with several modules. Each module communicates with the others through core.async channels. I haven't touched this part before. Is there any example code/project for reference? I am mostly interested in the coordination and message passing (pub/sub) between these modules.

2021-05-19T12:52:17.062400Z

Maybe @ognivo?

alexmiller 2021-05-19T13:02:37.063500Z

The ns is stripped by destructuring because local bindings are always unnamespaced
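
A small sketch of what that looks like in practice: the map keys keep their namespace, but the locals bound by destructuring do not. The strip-ns helper is hypothetical, just one way to get the "ignore the namespace entirely" behaviour asked about above:

;; :a.b/keys destructures the qualified key, but binds a plain local
(let [{:a.b/keys [name]} {:a.b/name "x"}]
  name)
;; => "x"

;; hypothetical helper: drop namespaces from the keys before destructuring
(defn strip-ns [m]
  (into {} (map (fn [[k v]] [(keyword (name k)) v])) m))

(let [{:keys [name]} (strip-ns {:a.b/name "x"})]
  name)
;; => "x"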

Ivan Fedorov 2021-05-19T13:04:34.063700Z

@admin055 ❤️

1👍
andrewboltachev 2021-05-19T13:07:12.064400Z

@i is a module something abstract? i.e., will they still be spawned by a single process?

andrewboltachev 2021-05-19T13:07:18.064600Z

(like lein run)

Ben Sless 2021-05-19T13:57:40.065700Z

Anyone here have experience with Jackson serializers? I'm trying to get it to use an IterableSerializer instead of a CollectionSerializer for a LazySeq with jsonista

2021-05-19T14:09:49.065800Z

yup. still spawned by a single process.

2021-05-19T14:46:28.066100Z

if they were in separate JVMs then core.async wouldn't help at all. also, please don't use lein as a prod process launcher; lein is a build tool and the run task is a convenience for developer iteration
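
For the pub/sub part of the question, a minimal in-process sketch with core.async; the topic keyword and message shape are made up for illustration:

(require '[clojure.core.async :as a])

;; one shared bus channel; a/pub routes messages by their :topic key
(def bus (a/chan 16))
(def publication (a/pub bus :topic))

;; each module subscribes to the topics it cares about
(def orders-ch (a/chan 16))
(a/sub publication :order-created orders-ch)

;; a consuming module is just a go-loop over its subscription
(a/go-loop []
  (when-some [msg (a/<! orders-ch)]
    (println "orders module got" msg)
    (recur)))

;; any module publishes by putting a message with a :topic onto the bus
(a/>!! bus {:topic :order-created :id 42})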

2021-05-19T15:13:33.067900Z

for a while I've been avoiding Jackson because of the brittle, version-sensitive deps and using clojure.data.json instead. ymmv, but it never turned out that JSON encoding was my perf bottleneck

ghadi 2021-05-19T15:14:55.068800Z

+1 to all of that, and when I do use jackson, it's not the ObjectMapper ORM-ey stuff
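
For anyone following along, the clojure.data.json surface being recommended here is small. A minimal sketch (the data is made up):

(require '[clojure.data.json :as json])

;; encode a Clojure value to a JSON string
(json/write-str {:name "fluree" :stars 42})
;; => "{\"name\":\"fluree\",\"stars\":42}"

;; decode, turning object keys back into keywords
(json/read-str "{\"name\":\"fluree\",\"stars\":42}" :key-fn keyword)
;; => {:name "fluree", :stars 42}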

2021-05-19T15:20:24.069900Z

@seancorfield I remember you were doing some work removing Jackson from a codebase. How's that going?

2021-05-19T15:20:55.070700Z

My concern is, Jackson might be indirectly referenced by other libs, so it still gets used.

kenny 2021-05-19T15:21:05.071Z

I updated from data.json 1.1.0 to 2.3.0 and am getting some very odd results back. I'm not sure exactly what this is, but in Cursive one of the decoded strings gets printed in the REPL as a series of NULs (see attached screenshot). I'm also not sure how to repro this, since it appears to have something to do with how the input stream originates. I am calling a GCP API with the Java 11 HTTP client and getting back an input stream. I'm then calling json/read on the result of that.

(def resp
  (java-http-clj.core/send
    my-req
    {:client http-client
     :as     :input-stream}))

(with-open [rdr (io/reader (:body resp))]
  (json/read rdr))
The last form is the one returning the oddly decoded JSON. If I spit the inputstream to a file and run the same code with a reader created from a file, the decoded result is correct (no NUL).
(with-open [rdr (io/reader (io/file "test.json"))]
    (json/read rdr))
Seems like this is an issue with the 2.x data.json version. I will revert back to 1.1.0 for now. Happy to provide more info if the maintainers are interested.

2021-05-19T15:26:31.071300Z

sure, but the problem with jackson is the version change brittleness, so each time you remove a usage of jackson you are mitigating that problem

2021-05-19T15:27:00.072Z

it's not a question of "use it anywhere" vs. "don't ever use it", it's a strategy of reducing the number of places it's used to reduce the brittleness that its usage introduces

alexmiller 2021-05-19T15:27:08.072300Z

@kenny would be great to learn more about what's up so we can fix if needed - we have a channel #data_json if you could isolate something

1
Ben Sless 2021-05-19T15:39:00.072700Z

I have some use cases where a large chunk of my CPU is wasted in Jackson

Ben Sless 2021-05-19T15:39:23.072900Z

Jsonista is faster so I'm trying to work with that

2021-05-19T15:54:14.073100Z

be careful with that analysis - for example, if jackson is consuming a lazy seq, the profiler will describe the work done realizing that seq as jackson's CPU usage

seancorfield 2021-05-19T15:57:39.073300Z

@i We got to the point where we pin the Jackson version for just one subproject now (to 2.8.11, because 2.9.0 introduced a breaking change around null handling, so at least we've tracked down why it causes failures). All the other projects just ignore the issue now and let deps bring in whatever version of Jackson they want (mostly 2.10.x as I recall).
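
For anyone wanting to do the same kind of pinning, the usual trick is to declare the Jackson artifacts as top-level deps so they win over whatever the transitive libs ask for. A sketch assuming a tools.deps/deps.edn project; the 2.8.11 version just mirrors the one mentioned above:

;; deps.edn (fragment): top-level deps take precedence over transitive versions
{:deps
 {com.fasterxml.jackson.core/jackson-core     {:mvn/version "2.8.11"}
  com.fasterxml.jackson.core/jackson-databind {:mvn/version "2.8.11"}}}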

Ben Sless 2021-05-19T16:00:02.073500Z

Yeah, I know, and this whole thing started because I saw that lazy seqs are consumed twice, since the CollectionSerializer calls .size() first

Ben Sless 2021-05-19T16:00:49.073700Z

What I was hoping to do was avoid intermediate allocations as much as possible, it's a very large stream

Ben Sless 2021-05-19T16:02:01.073900Z

This analysis still holds

2021-05-19T16:08:43.074100Z

lazy seqs are cached though - that would cause heap pressure but not CPU (except indirectly via more GC work)
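
A quick illustration of that caching behaviour; the realization side effect only fires on the first traversal:

(def xs (map (fn [x] (println "realizing" x) x) (range 3)))

(dorun xs)   ;; prints "realizing 0" .. "realizing 2"
(dorun xs)   ;; prints nothing: the realized elements are cached
(count xs)   ;; also cheap, the seq is already fully realized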

magra 2021-05-19T16:10:18.074700Z

Hi, I would like to point a new clojurian to this slack but I forgot where I got the invitation from.

dpsutton 2021-05-19T16:11:03.075100Z

i think http://clojurians.net will help in this case

magra 2021-05-19T16:13:56.075500Z

@dpsutton Thanx! That worked!

1👍
Ben Sless 2021-05-19T16:14:08.075600Z

It is an extremely garbage intensive piece of code

borkdude 2021-05-19T17:35:50.076200Z

I think I figured it out: https://gist.github.com/borkdude/0a99a4f413b509315d54e1c68f861fad

borkdude 2021-05-19T17:38:07.076400Z

it's a bit unfortunate that the API requires repeating logic like (.isArray ...) twice, once for the tag and once for the value, which I can imagine isn't so good for performance, but ah well

borkdude 2021-05-19T19:24:40.077400Z

Has anyone here ever used an arity of transit/write-handler other than the 2-arity? If so, could you explain to me why?
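
For context, the 2-arity case is a tag function plus a representation function, registered per type when building the writer. A minimal sketch; the Point record is made up for illustration:

(require '[cognitect.transit :as transit])
(import '[java.io ByteArrayOutputStream])

(defrecord Point [x y])

;; 2-arity write-handler: tag-fn names the transit tag, rep-fn builds the representation
(def point-writer
  (transit/write-handler (constantly "point")
                         (fn [p] [(:x p) (:y p)])))

(def out (ByteArrayOutputStream. 4096))
(def writer (transit/writer out :json {:handlers {Point point-writer}}))
(transit/write writer (->Point 1 2))
(.toString out)
;; => something like "[\"~#point\",[1,2]]"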

dpsutton 2021-05-19T19:39:20.078800Z

without committing to any official policy, is there a ballpark number of votes on http://ask.clojure.org that gets tickets added to a roadmap or release candidate?

alexmiller 2021-05-19T19:39:55.078900Z

no, I look at them from top down though for pulling into consideration

alexmiller 2021-05-19T19:40:22.079100Z

most have ≤ 1, so more than that is noticeable :)

alexmiller 2021-05-19T19:41:15.079300Z

https://ask.clojure.org/index.php/questions/clojure?sort=votes is a starting point

dpsutton 2021-05-19T19:41:27.079500Z

haha. yeah. was just wondering if my fourth vote might hit some threshold 🙂

alexmiller 2021-05-19T19:41:52.079700Z

even then, this is just one of many things serving as fodder for attention

dpsutton 2021-05-19T19:43:26.079900Z

makes sense. thanks for the info

ghadi 2021-05-19T21:19:57.080800Z

oh I thought it was 6 votes. I guess I can stop bribing folks!

2021-05-19T21:51:18.081400Z

Best autocomplete for Clojure?

indy 2021-05-19T22:07:45.088400Z

How do I make the following function handle 'sequency' collections, i.e. sets, lists, and vectors, properly? It feels like I have to deal with multiple specificities; for example, (conj nil x) returns a list, so seq-init is not the right one because I'm doing a conj that adds the element at the start of the coll.

(defn deep-remove-fn
  {:test
   (fn []
     (is (= ((deep-remove-fn empty?) {}) nil))
     (is (= ((deep-remove-fn empty?) []) nil))
     (is (= ((deep-remove-fn empty?) '()) nil))
     (is (= ((deep-remove-fn empty?) #{}) nil))
     (is (= ((deep-remove-fn nil? boolean? keyword?)
             [:a {:c true} 9 10 nil {:k {:j 8 :m false}}])
            [{} 9 10 {:k {:j 8}}]))
     (is (= ((deep-remove-fn false? zero?)
             {:a 90 :k false :c {:d 0 :e 89}})
            {:a 90, :c {:e 89}}))
     (is (= ((deep-remove-fn empty?)
             {:a 90 :k {:m {}} :c {:d 0 :e #{}}})
            {:a 90 :c {:d 0}}))
     (is (= ((deep-remove-fn empty?)
             [#{7 8 9} [11 12 13] '(15 14)])
            [#{7 8 9} [11 12 13] '(15 14)]))
     (is (= ((deep-remove-fn empty?)
             {:a {:b {} :c [[]]} :k #{#{}}})
            nil))
     (is (= ((deep-remove-fn nil?)
             {:a {:b {} :c [[]]}})
            {:a {:b {} :c [[]]}}))
     (is (= ((deep-remove-fn nil? empty?)
             {:a {:b {} :c [[]] :k #{#{}}}})
            nil)))}
  [&amp; remove-fns]
  (let [remove-fns
                   (for [remove-fn remove-fns]
                     #(try
                        (remove-fn %)
                        (catch Exception _
                          nil)))
        removable? (apply some-fn remove-fns)
        map-init   (if (removable? {}) nil {})
        seq-init   (if (removable? []) nil [])]
    (fn remove [x]
      (when-not (removable? x)
        (cond
          (map? x) (reduce-kv
                    (fn [m k v]
                      (if-let [new-v (remove v)]
                        (assoc m k new-v)
                        m))
                    map-init
                    x)
          (seq? x) (reduce
                    (fn [acc curr]
                      (if-let [new-curr (remove curr)]
                        (conj acc new-curr)
                        acc))
                    seq-init
                    x)
          :else x)))))

2021-05-20T15:20:50.128400Z

I think clojure.walk/postwalk would make this code much simpler
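
A rough sketch of that postwalk approach (not a drop-in replacement for the function above: it takes a single predicate rather than variadic remove-fns, and passes map entries through so postwalk's map handling stays intact):

(require '[clojure.walk :as walk])

(defn deep-remove [pred coll]
  (let [pred* #(try (pred %) (catch Exception _ false)) ; tolerate preds like empty? on numbers
        prune (fn [x]
                (cond
                  (map-entry? x) x                                 ; leave entries to the map branch
                  (map? x)       (into {} (remove (comp pred* val)) x)
                  (seq? x)       (apply list (remove pred* x))     ; rebuild lists in order
                  (coll? x)      (into (empty x) (remove pred* x)) ; vectors, sets
                  :else          x))]
    ;; postwalk visits leaves first, so children are cleaned before their parents
    (walk/postwalk prune coll)))

(deep-remove nil? {:a 1 :b nil :c {:d nil :e [2 nil 3]}})
;; => {:a 1, :c {:e [2 3]}}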

indy 2021-05-20T15:39:31.129200Z

Yeah, should try it with postwalk

2021-05-20T15:49:44.129400Z

also you might consider a multimethod / some multimethods on type, rather than inline conditionals everywhere

2021-05-20T15:50:40.129600Z

that way, to understand what is done with a type I can look at its method(s) instead of finding the relevant line in each condition

indy 2021-05-20T15:54:02.129900Z

Yup makes sense. That way I could easily extend it too

em 2021-05-19T22:17:59.090Z

Would love to hear some thoughts on the blocking aspects of blockchain with regard to general programming languages or databases - a lot of projects seem to try to do this, perhaps like the recently discussed Fluree DB on reddit https://github.com/fluree/db

indy 2021-05-19T22:20:52.094700Z

Maybe this isn’t even close to the way I should be going about solving this problem, in which case, please suggest what you think might be a better approach

Dominic Pearson 2021-05-19T22:24:06.097900Z

hi! I just spent several hours trying to track down a really strange bug in some of my code: an end-to-end split of a file, FEC, encrypt, persist to db, write header, then round-trip back the other way. I read from a file initially, but to reduce my initial code I just read the whole thing into memory to do the compare (blake2b hash on source and end result). I finally managed to track the culprit down after adding logging to my all-nighter mess of a personal codebase 😄

enki.buffers&gt; (byte-array 3145728000)
Execution error (NegativeArraySizeException) at enki.buffers/eval43087 (form-init18270525509685357804.clj:12).
-1149239296
enki.buffers&gt; (. clojure.lang.Numbers byte_array 3145728000)
Execution error (NegativeArraySizeException) at enki.buffers/eval43089 (form-init18270525509685357804.clj:15).
-1149239296
from: https://github.com/clojure/clojure/blob/b1b88dd25373a86e41310a525a21b497799dbbf2/src/jvm/clojure/lang/Numbers.java#L1394
@WarnBoxedMath(false)
static public byte[] byte_array(Object sizeOrSeq){
	if(sizeOrSeq instanceof Number)
		return new byte[((Number) sizeOrSeq).intValue()];
obviously the issue is 3145728000 > integer max size, so it's overflowing.
(defn byte-array
  "Creates an array of bytes"
  {:inline (fn [&amp; args] `(. clojure.lang.Numbers byte_array ~@args))
   :inline-arities #{1 2}
   :added "1.1"}
  ([size-or-seq] (. clojure.lang.Numbers byte_array size-or-seq))
  ([size init-val-or-seq] (. clojure.lang.Numbers byte_array size init-val-or-seq)))
there's nothing obvious in the docstring, nor warnings on clojuredocs, about a max size for byte arrays. Is this a JVM limitation? (I know it's extremely bad practice, but it was the quick-and-dirty way to test my functionality and I have plenty of RAM; I'll of course rewrite it to use some other method.) Maybe at least the docstring should be modified, or maybe Numbers.java can be extended, I dunno. What do you think? At least it is a hidden footgun.

2021-05-19T22:30:27.098400Z

it is a jvm limitation

2021-05-19T22:30:43.098800Z

e.g. arrays are indexed by integers
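
Which is exactly where the negative number in the error comes from: the long size is truncated to a signed 32-bit int before the allocation. A small sketch; the guard is just illustrative, and the real per-JVM ceiling sits slightly below Integer/MAX_VALUE:

;; the size argument goes through Number.intValue, which truncates
(.intValue 3145728000)      ;; => -1149239296 (the number in the stack trace)
(unchecked-int 3145728000)  ;; => -1149239296

;; a cheap guard before allocating
(defn safe-byte-array [n]
  (assert (<= n Integer/MAX_VALUE)
          (str "requested array size " n " exceeds what a JVM array can hold"))
  (byte-array n))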

Dominic Pearson 2021-05-19T22:31:28.099200Z

yep, just read up on it

Dominic Pearson 2021-05-19T22:35:29.102Z

obvious when one knows the limitations of the underlying platform, but it was a nightmare to discover (I calculate the array size from a custom binary data structure by summing block sizes, so I assumed I had a mistake somewhere; of course, discovering that it only blew up above max int narrowed the scope somewhat...). It's gone midnight here, but after some sleep I might see if I can add a note somewhere as a suggestion. (64-bit SBCL spoiled me.)