Hello everyone, how do you deal with sharp latency increases under load in a Pedestal/Jetty based server? When querying the DB and sending a large JSON response, a single request takes around 700ms, but as I increase the load to 15-25 requests per second, response times grow to 16-27s. How can I deal with this? (Standard settings, prod mode, fully async with NIO.)
Is there anything built into Pedestal to let the client specify which keys to return? e.g. /posts/42?fields=subject,author_name
kind of thing
I'm writing an interceptor, but I'm also curious if anything is already baked in
Query params are parsed by default, if I'm not mistaken
/posts/42?fields[]=subject&fields[]=author_name
Can you give an example of this?
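Something like this is what I had in mind, as a rough sketch (untested). Pedestal's router puts the parsed query params on :query-params; I'm not sure how the default parsing treats the fields[] repeated-key style, so this assumes the comma-separated form, and it assumes the handler has already put a map on the response body (select-fields and the key handling are just illustrative):

```clojure
(ns example.fields
  (:require [clojure.string :as str]
            [io.pedestal.interceptor :as interceptor]))

;; On :leave, narrow the response body (assumed to be a map) down to
;; the keys the client asked for, e.g. /posts/42?fields=subject,author_name
(def select-fields
  (interceptor/interceptor
   {:name ::select-fields
    :leave (fn [context]
             (let [fields (get-in context [:request :query-params :fields])]
               (if (and fields (map? (get-in context [:response :body])))
                 (let [ks (map keyword (str/split fields #","))]
                   (update-in context [:response :body] select-keys ks))
                 context)))}))
```

Then add it to the route's interceptor chain before the body gets serialized to JSON.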
Did you use the time macro to check whether that time comes from j/query or some other component?
Are you using JSON? Cheshire? How do you benchmark it?
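e.g. something along these lines to separate the encoding cost from the query cost (quick sketch; Cheshire and criterium here are just what I'd reach for, and big-response is a stand-in for your actual payload):

```clojure
(ns example.bench
  (:require [cheshire.core :as json]
            [criterium.core :as crit]))

;; Stand-in payload; swap in a real result map from your DB query
(def big-response
  {:posts (vec (repeat 1000 {:subject     "hello"
                             :author_name "someone"
                             :body        (apply str (repeat 200 "x"))}))})

(comment
  ;; quick one-off measurement of the encoding step alone
  (time (json/generate-string big-response))

  ;; more stable numbers
  (crit/quick-bench (json/generate-string big-response)))
```

If encoding is cheap in isolation, that would point at the query or at contention under load instead.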
@christian.gonzalez you'll end up re-writing (a part of) GraphQL
@souenzzo true 😂
#pathom has a "nested select-keys" that may help you: https://blog.wsscode.com/pathom/v2/pathom/2.2.0/other-helpers.html
Muuntaja for encoding, wrk2 for testing
Thank you! Checking that out; not sure yet if I need to specify nested keys
Also, yeah, I used time to measure the handler: it uses a fixed thread pool for the queries and does a bit of mapping, and the handler takes 450-550ms to complete
What does the request do? Are you sure your operation isn't blocking the async threads?
Queries are submitted to a fixed-size thread pool that returns a chan on completion; other than that, nothing is blocking
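Roughly this shape, simplified (not the actual code; query-pool, submit-query and the pool size are just illustrative):

```clojure
(ns example.query-pool
  (:require [clojure.core.async :as async])
  (:import (java.util.concurrent ExecutorService Executors)))

;; Fixed-size pool so blocking DB work stays off the Jetty/NIO threads
(defonce ^ExecutorService query-pool
  (Executors/newFixedThreadPool 8))

(defn submit-query
  "Runs query-fn (a no-arg fn doing the blocking DB call) on the pool and
   returns a promise-chan that gets the result, or the exception on failure.
   Note: put! can't take nil, so a nil query result would need wrapping."
  [query-fn]
  (let [out (async/promise-chan)]
    (.submit query-pool
             ^Runnable (fn []
                         (try
                           (async/put! out (query-fn))
                           (catch Throwable t
                             (async/put! out t)))))
    out))
```

The handler/interceptor then takes from that chan (e.g. in a go block) instead of blocking, so in principle the async threads stay free.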