Is there a recommended way to call large numbers (more than 64) of lambdas with aws-api?
Initially I ran into the limit from Cognitect's HttpClient, so I tried creating more AWS lambda clients (as suggested in this issue: https://github.com/cognitect-labs/aws-api/issues/98).
However, we've had servers falling over in production because the file descriptor limit is exceeded. Testing at the REPL, it does look like when a client is created for AWS lambda, file descriptors are not released. Should I be doing something to close the client? (Doesn't look like there's a .close method on the aws client.)
Might be related to this issue: https://github.com/cognitect-labs/aws-api/issues/109
i ran into the same issue
tried to run 128 lambdas at once and they failed to execute
for the file handle leaks, doesn't (aws/stop s3) release them?
rather unconventional ...
seems to do it for me
small test i ran
@thomas559 i think this should at least help you put out the fire
I can't type now but we'll systematically fix this
Right there in the README too, I just missed it. "Invoke cognitect.aws.client.api/stop on the client if you want it to shut down any resources it and its http-client are using."
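For anyone hitting this later, a minimal sketch of the lifecycle described above, using cognitect.aws.client.api from aws-api (the function name and payload are placeholders, and this assumes AWS credentials are configured in the environment):

```clojure
(require '[cognitect.aws.client.api :as aws])

;; each client holds its own http-client, and therefore its own
;; file descriptors, until it is explicitly stopped
(def lambda (aws/client {:api :lambda}))

;; invoke as usual
(aws/invoke lambda {:op :Invoke
                    :request {:FunctionName "my-fn"   ; placeholder name
                              :Payload "{}"}})

;; release the client's resources when done -- this is what
;; frees the file descriptors that were leaking above
(aws/stop lambda)
```

Without the `aws/stop` call, creating many clients (e.g. one per lambda, as in the workaround from issue #98) accumulates open file descriptors until the process limit is hit.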