
How does the batch size affect autoscaling?


When triggering AWS Lambda functions from an event source like Amazon SQS, the batch size parameter can significantly impact how Lambda autoscales your functions. Here are the key things to know:

  • Batch size refers to the maximum number of messages that Lambda will retrieve from the queue and pass to your function as a single invoke request.
  • A larger batch size means fewer invocations and therefore fewer scaling events. This can reduce cost and per-message overhead compared to invoking the function once per message.
  • Each invocation does carry some overhead, since Lambda may need to initialize a new execution environment, which is part of why fewer, larger batches help. However, a very large batch size can actually increase latency if processing the entire batch takes too long.
  • A good batch size aims for a balance between utilizing Lambda’s parallel processing and avoiding excessive initialization overhead. You’ll need to experiment for your specific use case.
  • Lambda scaling is driven by the number of concurrent executions, not requests per second. Because batching packs many messages into each invocation, a larger batch size reduces the concurrency needed for a given queue depth and therefore how far Lambda scales your function out.
  • To compensate, you can configure a higher maximum scaling limit when using a large batch size (for SQS event sources, the maximum concurrency setting on the event source mapping). This ensures Lambda has enough capacity to process all the messages in a timely manner; a configuration sketch follows this list.
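
As a rough illustration of those knobs, the sketch below uses boto3 to set the batch size, batching window, and maximum concurrency on an SQS event source mapping. The queue ARN, function name, and the specific numbers are placeholders rather than recommendations; tune them for your own workload.

```python
import boto3

lambda_client = boto3.client("lambda")

# Minimal sketch: wire an SQS queue to a Lambda function and control batching/scaling.
# The ARN, function name, and values below are hypothetical placeholders.
mapping = lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:sqs:us-east-1:123456789012:my-queue",
    FunctionName="my-function",
    BatchSize=100,                      # batch sizes above 10 require a batching window
    MaximumBatchingWindowInSeconds=5,   # wait up to 5 s to fill a batch before invoking
    ScalingConfig={
        "MaximumConcurrency": 50,       # cap on concurrent invocations for this event source
    },
)
print(mapping["UUID"], mapping["State"])
```

The same parameters can be changed later with update_event_source_mapping, so the batch size and the concurrency cap can be tuned independently while you measure latency and cost.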

In summary, using a larger batch size with Lambda + SQS can:

  • cut the number of invocations, and with it cost and per-invocation overhead,
  • increase per-message latency if batches grow too large, and
  • reduce the concurrency Lambda scales out to, unless you also raise the relevant concurrency limits.
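
For completeness, here is roughly what a batched invoke looks like from inside the function: Lambda delivers the retrieved messages together in the event's Records list. This is a minimal sketch that assumes JSON message bodies, and process_message is a hypothetical stand-in for your own logic.

```python
import json

def process_message(payload):
    """Hypothetical placeholder for per-message business logic."""
    print("processing:", payload)

def handler(event, context):
    # Lambda passes the whole SQS batch in event["Records"];
    # with BatchSize=100 this list can contain up to 100 messages.
    for record in event["Records"]:
        payload = json.loads(record["body"])  # assumes the producer sends JSON bodies
        process_message(payload)
    # If the handler raises, the whole batch becomes visible on the queue again
    # (unless partial batch responses are enabled), so bigger batches also mean
    # more messages are retried after a failure.
```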

Author

I'm Abhay Singh, an architect with 9 years of IT experience and an AWS Certified Solutions Architect.
