Yeah, I had an extremely plebeian meaning in mind: transforming equations for math homework. LaTTe looks amazing though, thank you for linking me to it. The closest I found is a Little Java (I suspect actually Clojure) App That Could on Sourceforge, with its only review quite derisive. I think there's the core of something great in there though: https://sourceforge.net/projects/ket/
I used to use wolfram alpha (https://www.wolframalpha.com/) for math homework in college. you just type in the math and it usually brings up any relevant derivations on its own
your school may even have [free] logins/licenses
In documentation for Java libraries, a class name is frequently mentioned without its package or path, and I end up having to google the class name or dig through source code until I find the right import. Is there a better way?
I try to find the Javadoc page and ctrl-f in there. It's super annoying though.
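One low-tech trick: a jar is just a zip archive, so you can scan your dependency jars for the class file and read the package off the entry path. A sketch in Python for illustration (the jar and class names here are made up, and the demo jar is created on the spot so the snippet is self-contained; in practice you'd glob something like `~/.m2/repository/**/*.jar`):

```python
import zipfile

# Build a tiny fake jar so the sketch is runnable as-is;
# replace this with your real dependency jars.
with zipfile.ZipFile("demo.jar", "w") as jar:
    jar.writestr("com/example/json/Mapper.class", b"")

def jars_containing(class_name, jar_paths):
    """Return the jars that have a .class entry with this simple name."""
    hits = []
    for path in jar_paths:
        with zipfile.ZipFile(path) as jar:
            if any(entry.endswith("/" + class_name + ".class")
                   for entry in jar.namelist()):
                hits.append(path)
    return hits

print(jars_containing("Mapper", ["demo.jar"]))  # → ['demo.jar']
```

The matching entry path (`com/example/json/Mapper.class`) gives you the import directly: `com.example.json.Mapper`.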
to any people here writing stuff in Rust ... how do you keep your projects maintained and alive over time?
i tried writing a small tool with rust for postgresql .... and the postgresql client lib pulled in 70 dependencies on its own. bloody hell
looks like npm hell all over again
sounds like i will write my tooling with clojure instead and opt for graalvm to get a binary should there be a need for one ...
Take a look at https://github.com/leafclick/pgmig for a Postgresql CLI (migrations) using GraalVM :)
Also you can do scripting with babashka and https://github.com/babashka/babashka-sql-pods/ should you need fast startup without compiling yourself
startup time won't be an issue
the target will be to port a few terabytes of data from a postgres cluster to a yugabyte cluster and keep active replication going after the initial data has been copied
I think the thing in Rust is that they want to push as much of the language into libs so they can evolve outside of the core
so you will quickly need libs for basic stuff like async, a global mutable var, etc
as for db migration tools .... amazing, yet another one that thinks you want to manage individual migrations as changes from shape A to shape B
if you think of your database structure as just ... well, structure definition, it suddenly becomes code .... we don't manage our code the way we manage our db migrations, do we?
i mean you don't change the code by creating
migration_2342042042
add line (def x "fobar") into babashka.clj
first off it's impossible to read the thing even after a few migrations and it's impossible to reason about it if you have 50 migrations
i know there are a ton of tools out there that maintain db structures like that, but they are all flawed by design, nobody wants to have 800 migrations for a 6 year old project
We have been using it on a 5 year old project, it works well for us. YMMV.
well you probably run fewer db migrations
I will count them
because applying 800 migrations just to run your db tests ....
or making any sense from the last 40 ...
85
we don't do tests like that. we only test against real systems :P
if you would want to compare the state from 5 months ago to current in your babashka code, you would open up git and diff
maybe individual changes sometimes too, but mostly just the big diff from now to then
but for some reason the db definitions, which are just a few lines, are treated as a holy grail and handled totally differently
i have seen db management tools that work differently and treat db structure as code. it feels way more natural once you have used one
I guess you could write a more intelligent tool by comparing the columns etc. that are already there and creating those that aren't there yet, but for some reason I haven't come across one. Are you using one?
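The "compare what's there with what should be there" idea can be sketched in a few lines. This is Python purely for illustration (the table and column names are made up, and a real tool would read the live state from `information_schema.columns` rather than a hardcoded dict):

```python
def diff_columns(desired, actual):
    """Compare a desired column map against the live one and emit DDL.
    desired/actual: {column_name: sql_type} for one (hypothetical) table."""
    stmts = []
    # Create columns that are declared but missing from the live schema.
    for col, sql_type in desired.items():
        if col not in actual:
            stmts.append(f"ALTER TABLE users ADD COLUMN {col} {sql_type};")
    # Drop columns that exist but are no longer declared.
    for col in actual:
        if col not in desired:
            stmts.append(f"ALTER TABLE users DROP COLUMN {col};")
    return stmts

desired = {"id": "bigint", "email": "text", "created_at": "timestamptz"}
actual  = {"id": "bigint", "email": "text", "legacy_flag": "boolean"}
print(diff_columns(desired, actual))
# → ['ALTER TABLE users ADD COLUMN created_at timestamptz;',
#    'ALTER TABLE users DROP COLUMN legacy_flag;']
```

A real implementation also has to handle type changes, renames, and data-preserving transforms, which is where the declarative approach gets hard and explicit migrations earn their keep.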
was using one back at skype. not using one at the current company, i haven't found a very good open source one.
some are built on top of complex systems like django or rails
but if you don't use django or rails they become aliens in your world
i will work towards getting my boys to create a similar tool for rabbitmq though
because i face the same problem there
a lot of structure in the exchanges & queues & bindings
and migrations from shape A to shape B are quite risky after a while
and hard to reason about
we're using rabbitmq as well, but not a lot of churn there
i mean civilized people would like to test their stuff right
and compare different stages of their changes if needed
and clean up stuff once they don't need it
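For the rabbitmq case the same declarative diff applies: treat the set of bindings as data and compare desired against actual. A minimal sketch in Python (the exchange/queue names are invented; a real tool would pull the actual state from the RabbitMQ management API):

```python
def topology_diff(desired, actual):
    """desired/actual: sets of (exchange, queue, routing_key) bindings."""
    return {
        "create": sorted(desired - actual),
        "delete": sorted(actual - desired),  # the cleanup step people usually skip
    }

desired = {("events", "billing", "invoice.*"), ("events", "audit", "#")}
actual  = {("events", "billing", "invoice.*"), ("events", "old-worker", "#")}
print(topology_diff(desired, actual))
# → {'create': [('events', 'audit', '#')],
#    'delete': [('events', 'old-worker', '#')]}
```

The "delete" half is exactly the comparing-and-cleaning-up part that shape-A-to-shape-B migrations make hard to reason about.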
Maybe one reason such an open source tool doesn't exist is that it would be very db specific, while migratus works pretty much with any type of persistence thing
well both postgresql and mysql are big enough really to have their own tooling
the "works across databases" often leads to a horribly inefficient design or just a bunch of lies
because a few steps onwards from the sql standard things start to differ a lot
and it's quite a big overhead for development too
/end of rant
Btw, speaking of Rust and data replication, we are using ZomboDB, written in Rust, to sync data from our postgres to an ES cluster. Works well.
i expect most stuff written in rust to work well, the design of the language itself is very neat.
but on the perhaps unfair example from npm world, if your project starts to depend on ~100 packages then after 3-4 years a good dozen if not more of those will be unmaintained & rusting in the corner, no pun intended.
if i grab clojure & jdbc driver & graal - i have a pretty small pack of things that are maintained and kept up to date
at least we have the tokio stack in rust to be thankful for, that seems to keep a big part of their world in line
agreed. the approach that zombodb takes is that it lives inside postgresql, as an extension
regarding ES clusters .... it's interesting to see what eventually comes out from the elastic vs aws battle
that turned into quite a sh*tshow
I have an always up to date ddl sql file that I run tests against, and the migrations are just some Clojure functions that I consider temporary. Not ideal, and they need a lot of preconditions and postconditions, but I will add those when Postgres becomes the single source of truth (right now mongo is, so Postgres is used for analytics and stuff)
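That "run tests against the full DDL file" pattern is roughly this, sketched with sqlite in Python only to keep it self-contained (the schema here is invented; in the real setup the DDL string would be read from the always-up-to-date sql file and executed against a throwaway Postgres):

```python
import sqlite3

# In practice, read this from the single always-up-to-date schema file.
DDL = """
CREATE TABLE users (
    id    INTEGER PRIMARY KEY,
    email TEXT NOT NULL UNIQUE
);
"""

def fresh_test_db():
    """Spin up a throwaway in-memory db from the current schema.
    No migrations are replayed; the schema file IS the source of truth."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)
    return conn

conn = fresh_test_db()
conn.execute("INSERT INTO users (email) VALUES ('a@example.com')")
print(conn.execute("SELECT email FROM users").fetchall())  # → [('a@example.com',)]
```

Every test run starts from the current schema in one step, instead of applying the whole migration history.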
Austin and Texas people in general: hope all is going ok for you. I'm in Louisiana just down I-10. Please feel free to ping me if i can help with anything at all.
Hihi, mongo as the single source of truth.