morning
morning
non clojure specific question: What do you guys use for advanced reporting (medium data, not big data)? do you build reports from scratch with each new system? do you dump your data to a separate denormalised db? use 3rd party tools and just plug in your queries?
Morning
@afhammad: my $client dumps all their data into Big Query
like, all their data.
@xlevus: does it have its own reporting UI?
nah. But they build queries to report on it daily.
which are run and exported into google docs
it's not the prettiest. But it allows them to do other things, like run their marketing poo off bigquery too
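The workflow described above (dump everything into one warehouse, run a daily aggregate query, export the rows) can be sketched locally. This is a hedged illustration only: it uses an in-memory sqlite3 database as a stand-in for BigQuery, and the `events` table and its columns are invented; with BigQuery you'd run the same style of SQL through the `google.cloud.bigquery` client instead.

```python
import sqlite3

# Stand-in for the warehouse: in the setup above this would be a BigQuery
# dataset that all application data gets dumped into. Table and column
# names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("2015-06-01", 1, 10.0), ("2015-06-01", 2, 4.5), ("2015-06-02", 1, 7.25)],
)

# The "daily report": one aggregate query whose result rows get exported
# (e.g. into a Google doc/sheet) rather than rendered by a custom UI.
report = conn.execute(
    """SELECT day,
              COUNT(DISTINCT user_id) AS users,
              SUM(amount)             AS revenue
       FROM events
       GROUP BY day
       ORDER BY day"""
).fetchall()

for row in report:
    print(row)
```

The appeal of the pattern is that "reporting" stays plain SQL against one denormalised store, so marketing/debugging queries reuse the same data.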
interesting, will look into it, thanks
I've also found it helps for debugging. Shit goes wrong, you can spend a day trudging through terabytes of data working out why
It's probably closer to the 'big' size of data. Some of the tables are in the tens of millions of rows a month area
Yeh not quite at that level, more so in the hundreds of thousands. I have a couple of clients (currently running on Rails) that are approaching the need for more advanced reporting, I'm looking for options to decouple it from their systems and kill a few birds with one stone.
@afhammad: i've used elasticsearch to great effect - the aggregations framework is great for building fast analytic queries - responses in <100ms over tens of millions of docs
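For context, the Elasticsearch aggregations mentioned above are just a JSON request body; building it in Python keeps the sketch self-contained. The index and field names (`created_at`, `total`) are invented for illustration, and in practice you'd POST this body to an index's `_search` endpoint (e.g. via the official `elasticsearch` client); the shape follows the pre-7.x aggregations DSL, where `date_histogram` took an `interval` parameter.

```python
import json

# Hypothetical analytic query: order counts and revenue per month.
# size: 0 means we only want the aggregate buckets back, not the
# matching documents themselves, which is what keeps responses fast.
query = {
    "size": 0,
    "aggs": {
        "per_month": {
            "date_histogram": {"field": "created_at", "interval": "month"},
            # Sub-aggregation computed inside each monthly bucket.
            "aggs": {
                "revenue": {"sum": {"field": "total"}},
            },
        }
    },
}

body = json.dumps(query)
print(body)
```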
@mccraigmccraig: thanks. I've used elasticsearch briefly and it's one of the options I'm considering.
Amazon redshift seems pretty good