beginners

Getting started with Clojure/ClojureScript? Welcome! Also try: https://ask.clojure.org. Check out resources at https://gist.github.com/yogthos/be323be0361c589570a6da4ccc85f58f.
practicalli-john 2020-10-25T02:42:49.470900Z

If you can change op to a hash-map then it makes the code far simpler

(def operands {"+" + "-" - "*" * "/" /})
((operands "+") 4 3)  ;; => 7

2020-10-25T10:17:39.477800Z

I have a question about spec. I'd like to check whether an infinite lazy seq is valid, but this code loops forever. What should I do?

(s/valid? (s/coll-of int?) (range))

schmee 2020-10-25T10:18:15.478500Z

how could you check an infinite sequence? you’d have to check every element, which by definition would take an infinite amount of time

2020-10-25T10:28:16.479800Z

I just want to check 100 of them. So, I've used it like this so far, but it's too messy.

(s/def ::coll #(s/valid? (s/coll-of int?) (take 100 %)))

(s/fdef func
  :args (s/cat :coll ::coll)
  :ret ::coll)

alexmiller 2020-10-25T13:25:29.481200Z

s/every and s/every-kv do this already

alexmiller 2020-10-25T13:26:12.482300Z

They will check a bounded sample (up to 100 by default)

alexmiller 2020-10-25T13:27:39.483900Z

So just (s/def ::coll (s/every int?)) is exactly the same as above (and actually better for gen etc)
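
For illustration, a minimal REPL sketch of that (assuming clojure.spec.alpha is required as s):

(require '[clojure.spec.alpha :as s])

(s/def ::coll (s/every int?))

;; s/every only conforms a bounded sample of the collection
;; (controlled by s/*coll-check-limit*), so this terminates:
(s/valid? ::coll (range))
;; => true
(s/valid? ::coll (cons :a (range)))
;; => false, the bad element falls inside the sample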

Marek Jovic 2020-10-25T17:06:38.485800Z

Hello, how do I flatten a vector of vectors of maps into a vector of those maps?

2020-10-25T17:15:13.486500Z

This sample REPL session uses integers instead of maps, but the code works regardless of whether the inner elements are maps or any other type:

user=> (mapcat identity [[1 2] [3 4] [5 6]])
(1 2 3 4 5 6)
user=> (vec (mapcat identity [[1 2] [3 4] [5 6]]))
[1 2 3 4 5 6]

👍 1
jsn 2020-10-25T17:46:53.487500Z

apply concat does that
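
For example, with maps as the inner elements (a minimal sketch):

(vec (apply concat [[{:a 1}] [{:b 2} {:c 3}]]))
;; => [{:a 1} {:b 2} {:c 3}]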

dumrat 2020-10-25T18:18:09.488300Z

I'm having issues converting a char to an int in my cljs REPL. This should not result in 0, right?

cljs.formula.events=> (int (char 97))
0

dpsutton 2020-10-25T18:24:34.488800Z

(int "a")
WARNING: cljs.core/bit-or, all arguments must be numbers, got [string number] instead at line 1 <cljs repl>
0

> (doc int)
Coerce to int by stripping decimal places.

2020-10-25T18:53:58.489200Z

(yet another clj and cljs difference)

dpsutton 2020-10-25T19:08:04.490Z

This is more of a host language or VM difference to me

dpsutton 2020-10-25T19:08:33.490700Z

There is no char type in JS. And on the JVM chars exist and are ints
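
A minimal sketch of the difference on each platform (REPL results shown in comments):

;; Clojure (JVM): \a is a real character type
(int \a)              ;; => 97
(char 97)             ;; => \a

;; ClojureScript: (char 97) returns a one-character string,
;; so use the JS string API to get the code unit back:
(char 97)             ;; => "a"
(.charCodeAt "a" 0)   ;; => 97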

2020-10-25T19:21:08.490900Z

they are not ints

dpsutton 2020-10-25T19:24:01.491400Z

> An int value represents all Unicode code points, including supplementary code points.

dpsutton 2020-10-25T19:24:06.491700Z

Is what I’m going from

2020-10-25T19:53:56.492700Z

The JVM type system has distinct types for char and short, even though the values of those two types have a one-to-one correspondence to each other.
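
A minimal REPL sketch of that distinction:

(class \a)  ;; => java.lang.Character
(= \a 97)   ;; => false, a char is not a number on the JVM
(int \a)    ;; => 97, but each char value maps to exactly one int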

2020-10-25T20:01:18.492800Z

on that note, how does it store chars with code points above 64K? I know that Java uses UTF-16 encoding, but I guess you can't store an emoji (for example) in a char?

2020-10-25T20:03:04.493Z

(more just wonderment than a real problem I'm struggling with :p)

alexmiller 2020-10-25T20:04:11.493200Z

it uses multiple chars

alexmiller 2020-10-25T20:05:14.493400Z

there are apis that understand this (and some older ones that do not, so some care is required)

2020-10-25T20:07:34.493600Z

I haven't read this full wikipedia page on utf-16 to see how good of a job it does explaining this, but there is a range of 16-bit values called "surrogates" in UTF-16 encoding of Unicode, such that a pair of surrogates can represent all Unicode code points that do not fit in a single 16-bit value: https://en.wikipedia.org/wiki/UTF-16

2020-10-25T20:12:14.494100Z

Yeah, right. Reading up on it (and on what Alex says): if you charAt a character that needs four bytes to represent, you get half of it. Thanks 🙂!

alexmiller 2020-10-25T20:14:57.494300Z

the codepoint apis understand that stuff
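
For instance (a minimal REPL sketch with an emoji outside the Basic Multilingual Plane): count reports two UTF-16 code units, charAt hands back half of the surrogate pair, and codePointAt reconstructs the full code point.

user=> (count "😀")
2
user=> (Character/isHighSurrogate (.charAt "😀" 0))
true
user=> (.codePointAt "😀" 0)
128512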

2020-10-25T20:15:31.494500Z

Wondering how that came to be, but apparently surrogate pairs were not a thing when Java was first released. The more you know!

alexmiller 2020-10-25T20:16:22.494700Z

yeah, they weren't introduced until much later

2020-10-25T20:33:26.494900Z

My understanding is that when Unicode first started, they thought 16 bits would be enough for all code points. Java and a few other software systems started with UTF-16 with no need for surrogate pairs, then later added them when 16 bits was no longer enough.

2020-10-25T20:36:09.495100Z

The "History" section of the Wikipedia article on UTF-16 summarizes a few main points of who did what, though not necessarily exactly when (the cited references might).

2020-10-25T20:50:52.495500Z

The history of computing is often fascinating 🙂 E.g.: we tried x, that turned out to be an oversimplification, then y and z — also an oversimplification, so now we started over with a, with little bits of legacy x, y and z in there.

2020-10-25T22:36:14.496Z

That's interesting. I thought they were the same thing because s/coll-of calls s/every. I didn't know there was a ::conform-all in coll-of. Thanks