datahike

https://datahike.io/, Join the conversation at https://discord.com/invite/kEBzMvb, history for this channel is available at https://clojurians.zulipchat.com/#narrow/stream/180378-slack-archive/topic/datahike
Ben Sless 2020-06-02T06:44:28.343700Z

I'm trying to implement votes as you suggested. Seems pretty straightforward, but what would you say is the best way to update a :post/score without recalculating the sum every time?
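
A minimal sketch of the incremental idea (not from the thread; the function name and pull/transact shapes here are assumptions): read the current score once and write the adjusted value, instead of re-summing all votes.

(require '[datahike.api :as d])

(defn adjust-score!
  "Add `delta` (e.g. +1 for an upvote, -1 for a downvote) to the :post/score of `post-eid`."
  [conn post-eid delta]
  ;; pull the current score (defaulting to 0) and transact the new value
  (let [current (or (:post/score (d/pull @conn [:post/score] post-eid)) 0)]
    (d/transact conn [{:db/id post-eid
                       :post/score (+ current delta)}])))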

whilo 2020-06-02T10:45:53.344400Z

@mroerni How do you check whether the file gets updated?

Björn Ebbinghaus 2020-06-03T14:49:46.349500Z

It works as expected when I do it in my REPL. In my project I have the connection in a https://github.com/tolitius/mount defstate and I make the transacts in a https://github.com/wilkerlucio/pathom parallel parser (note that the first transaction, the schema, is not done in the parallel parser). I'm not sure where the problem is. Since it is a hobby project I only have a little time to investigate, but be sure I'll come back to you two when I can say more.
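
For reference, a rough sketch of that setup (hypothetical namespace, config, and path, not the actual project code): the connection lives in a mount defstate and the pathom parser's resolvers use it for their transacts.

(ns todoish.db
  (:require [datahike.api :as d]
            [mount.core :refer [defstate]]))

;; hypothetical file-store config
(def config {:store {:backend :file :path "/tmp/todoish-db"}})

(defstate conn
  :start (d/connect config)   ; assumes the database was created beforehand
  :stop  (d/release conn))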

whilo 2020-06-03T22:17:07.350600Z

Cool! We are developing pre-configured fulcro components here btw. https://github.com/replikativ/datahike-frontend

Björn Ebbinghaus 2020-06-04T11:32:50.351100Z

@whilo That's really nice. I use the fulcro-template as well. The project where I have the issue is https://github.com/MrEbbinghaus/Todoish; there is a branch datahike-issue where I experiment with the issue.

whilo 2020-06-04T12:56:16.351600Z

Ok, cool! Unfortunately I do not have the time to look into it now. If you keep having issues, feel free to open them on GitHub. It would be good if this turns out not to be a problem, though.

Björn Ebbinghaus 2020-06-04T13:55:19.352100Z

I would not expect you to. 🙂 While I have your attention: are you planning to put every public function in datahike.api? Asking because I use db? and conn? from datahike.core for specs, and therefore I sometimes require only core, only api, or both.
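
For context, roughly the kind of spec usage meant here (a sketch; the spec and function names are made up):

(require '[clojure.spec.alpha :as s]
         '[datahike.core :as dc])

;; db? and conn? come from datahike.core, not datahike.api
(s/def ::db dc/db?)
(s/def ::conn dc/conn?)

(s/fdef new-todo!
  :args (s/cat :conn ::conn :task string?))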

Björn Ebbinghaus 2020-06-02T10:46:33.345Z

ls -l & cat

whilo 2020-06-02T10:46:54.345400Z

So the last write time also does not change?

whilo 2020-06-02T10:47:19.345600Z

Is your transaction maybe empty?

Björn Ebbinghaus 2020-06-02T12:43:06.345800Z

I am entering (d/transact conn [#:todoish.models.todo{:id (UUID/randomUUID) :task "My Task!" :done? false}]) in the REPL and getting a tx-report in return… When running (d/datoms @conn :eavt) the expected datoms are present, but the file size and date didn't change.
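
One way to double-check persistence independently of the running connection (a sketch, assuming `config` is the same map the store was created with, and that the file backend really re-reads from disk on connect):

;; open a second connection against the same store and count its datoms
(let [conn2 (d/connect config)]
  (count (d/datoms @conn2 :eavt)))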

Björn Ebbinghaus 2020-06-02T12:47:18.346Z

(:db-after tx-report) doesn't print the datoms, only the key :schema. I guessed this is to avoid accidentally printing the whole DB. Is this correct?

kkuehne 2020-06-02T12:54:37.346200Z

That's correct, since we ran into problems when dealing with larger data sets.

Ben Sless 2020-06-02T17:41:02.347Z

Does datahike have support for tuple binding such as

:where [(tuple ?a ?b) ?tup]

Ben Sless 2020-06-02T17:42:52.347300Z

I see I can cheat using vector

👍 1
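
For reference, the "cheat" in question (a sketch; the attribute names are made up): bind a pair with plain vector in a function clause instead of Datomic's tuple.

(d/q '[:find ?tup
       :where
       [?e :post/score ?score]
       [?e :post/created-at ?time]
       [(vector ?score ?time) ?tup]]
     @conn)
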
kkuehne 2020-06-02T18:14:52.347500Z

As of now, we don't support that. But we can add more distinctive data types if you would like to see that.

Ben Sless 2020-06-02T18:46:08.347700Z

I guess Datomic implements tuples in a more lightweight manner. It isn't critical for me atm since it isn't for production purposes. My use case is implementing a custom aggregator. It would be cool if aggregators like max were pluggable not only for the top n but also for the comparator, instead of this custom version:

(defn decaying-score
  "Rank tuples of [`score` `creation-time` `current-time`]"
  [[score time now]]
  (/ score (* (- now time) decay-factor)))
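;; note: decay-factor is assumed to be defined elsewhere (a tuning constant)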

(defn compare-tups
  [t1 t2]
  (let [s1 (decaying-score t1)
        s2 (decaying-score t2)]
    (compare s1 s2)))

(defn decaying-max
  ([n coll]
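   ;; keep only the top n tuples by decaying score (a variant of the built-in max aggregator)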
   (vec
    (reduce (fn [acc x]
              (cond
                (< (count acc) n)
                (sort compare-tups (conj acc x))
                (pos? (compare-tups x (first acc)))
                (sort compare-tups (conj (next acc) x))
                :else acc))
            [] coll))))

Ben Sless 2020-06-02T18:46:47.347900Z

decaying-max is just a variation on how 'max is implemented in datahike
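
For what it's worth, a sketch of how this could be plugged in with the custom-aggregate form that datascript-style query engines accept (whether datahike supports it is an assumption here, and the attribute names are made up):

(d/q '[:find (aggregate ?agg ?tup)
       :in $ ?agg ?now
       :where
       [?p :post/score ?score]
       [?p :post/created-at ?time]
       [(vector ?score ?time ?now) ?tup]]
     @conn
     (partial decaying-max 10)
     (System/currentTimeMillis))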

whilo 2020-06-02T20:48:41.348100Z

The file size and date should change.

whilo 2020-06-02T20:49:02.348300Z

@konrad.kuehne Can you check that in some setup you have? I am busy atm.

kkuehne 2020-06-02T22:14:55.348500Z

Yes, I'll do that tomorrow.