Iād like to use SQS for a low-traffic mail-sending queue (say less than 5k emails/day), but both the producers and the consumers would not be running on AWS (yet, they will in 6 months time). Is this a crazy idea or pretty normal?
Normal @orestis
Cool! I see that they say latency is at most low hundreds of ms, so for an email queue that sounds acceptable.
What are valid blob types I can pass to aws-api?
Ah, the specs are useful 🙂
(s/form :cognitect.aws.s3.PutObjectRequest/Body)
=>
(clojure.spec.alpha/or
  :byte-array   clojure.core/bytes?
  :input-stream (clojure.core/fn [%] (clojure.core/instance? java.io.InputStream %)))
we might expand that in the future
specifically to handle larger than memory objects
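A minimal sketch of both accepted Body types, assuming an S3 client and a hypothetical bucket name:

```clojure
(require '[cognitect.aws.client.api :as aws]
         '[clojure.java.io :as io])

(def s3 (aws/client {:api :s3}))

;; Body as a byte array
(aws/invoke s3 {:op :PutObject
                :request {:Bucket "my-bucket" ; hypothetical
                          :Key    "hello.txt"
                          :Body   (.getBytes "hello")}})

;; Body as an InputStream
(aws/invoke s3 {:op :PutObject
                :request {:Bucket "my-bucket"
                          :Key    "hello.txt"
                          :Body   (io/input-stream (.getBytes "hello"))}})
```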
If my :PutObject request fails, does aws-api automatically retry the request?
Ah I see - it's set on the client and, optionally, on the individual request. Nice.
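For reference, a hedged sketch of the two places retry behavior can be set; the per-request namespaced keys are my best recollection, so check the aws-api docs:

```clojure
(require '[cognitect.aws.client.api :as aws]
         '[cognitect.aws.retry :as retry])

;; On the client: a retriable? predicate and a backoff fn
(def s3 (aws/client {:api        :s3
                     :retriable? retry/default-retriable?
                     :backoff    retry/default-backoff}))

;; On an individual request: namespaced keys on the op map
(aws/invoke s3 {:op :PutObject
                :request {:Bucket "my-bucket" :Key "k" :Body (.getBytes "v")}
                ::retry/retriable? (constantly false) ; never retry this one
                ::retry/backoff    retry/default-backoff})
```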
Has it been considered to have the client implement Closeable?
Also could a client? function be added?
why the predicate?
Spec
i.e. right now I need to do this:
(s/def ::s3-client #(satisfies? aws-client/ClientSPI %))
Out of curiosity, does anyone know if Azure or GCP expose their APIs via data files like AWS does?
Looks like they do provide swagger JSON files for all their APIs.
If you could transform the AWS API JSONs into the OpenAPI format, it seems like the aws-api concept could work for multiple cloud providers.
Looks like someone has already done this: https://github.com/APIs-guru/openapi-directory/tree/master/APIs/amazonaws.com
there's a lot of little exceptions to the AWS APIs
Kinda figured. Know any examples offhand?
almost all of s3
some APIs require extra headers with an MD5 of the body
S3 is the "weird API"
Oh, that stuff can't be documented with openapi?
dunno
it's not in the amazon descriptors, so it wouldn't be in any translation of those descriptors
How does aws-api handle it?
special code that runs on particular requests
there's a private extension point through a multimethod
Ah. You don't think you'd be able to do the same thing with multiple cloud providers?
sure. I'm just trying to say that JSON descriptors aren't enough, at least for AWS
there's particular API domain knowledge necessary, unfortunately
you may want to look at official GCP client sdks and look for these exceptional cases
searching for interceptors, request transformers, stuff like that
It seems like you'd be able to handle that in a similar manner to how aws-api handles "special" APIs
yup
Interesting. My company is adding support for additional cloud providers and we really enjoy working with aws-api. Any thoughts on how difficult it'd be to change aws-api to take in OpenAPI JSON instead of the aws-specific JSON files?
personally I wouldn't try to modify, but I'd take the same approach.
input = descriptors ->
output = functions that can assemble and disassemble request/response maps
then you wire the functions with an HTTP client
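The pipeline above could be sketched as follows; this is a toy, assuming a parsed OpenAPI map and ignoring path templating, auth, and serialization:

```clojure
;; descriptors -> map of op keyword -> fn that assembles a request map
(defn ops [openapi-doc]
  (into {}
        (for [[path methods] (:paths openapi-doc)
              [method op]    methods]
          [(keyword (:operationId op))
           (fn [params]
             {:request-method method
              :uri            (name path) ; toy: no {param} substitution
              :body           params})])))

;; wiring these request maps to an HTTP client is then a separate concern
```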
Too many things coupled to aws in aws-api to try and modify it?
the descriptor transforming code isn't open source
Oh really? Is that non-trivial to do?
but mainly I wouldn't try to modify the client because it's conceptually a simple thing
you get descriptors from google, and you shred them into functions
functions that operate on maps
I'm doing something similar with a native protocol buffer client
Right. The main thing that will differ, I think, is the authentication.
something that takes .proto files and builds functions from maps -> bytes and bytes -> maps
It almost seems like this version of aws-api that is cloud provider independent is simply an OpenAPI HTTP client.
i don't follow
You could write a library that exposes any OpenAPI spec'ed API in the same format that aws-api takes.
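i.e. something with an aws-api-shaped surface; everything here (openapi/client, openapi/invoke, the spec file) is hypothetical:

```clojure
;; hypothetical library mirroring aws-api's invoke shape
(def petstore (openapi/client {:spec (slurp "petstore.json")}))

(openapi/invoke petstore {:op      :getPetById
                          :request {:petId 42}})
```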