AWS – Difference between Containers (Kubernetes) and Serverless (Lambda)

Comparison: Serverless vs Containers (Lambda vs Kubernetes). How do you choose? 

Serverless and containers have some high-level similarities. Both reduce operational overhead and make it easier to deploy and scale applications. 

Serverless works well if you need to perform relatively simple processing of events without maintaining underlying infrastructure. Containers are the ideal choice if you need full control over the application hosting environment. 

What is Serverless?  

Serverless computing allows you to build and run applications and services without thinking about servers. Serverless applications don’t require you to provision, scale, and manage any servers. You can build them for nearly any type of application or backend service, and everything required to run and scale your application with high availability is handled for you. 

Building serverless applications means that you can focus on your core product instead of worrying about managing and operating servers or runtimes. You are only responsible for providing the serverless function and are not exposed to the underlying compute resources. The serverless runtime provisions server resources automatically, and customers are billed according to the number of times their function runs and how long it runs. Serverless is a model of computing that runs code on demand without the need to provision or manage infrastructure. Development teams simply deploy their code on a serverless platform and are charged only when that code runs and consumes server resources. 
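
To make this concrete, here is a minimal sketch of a Python Lambda handler. The handler name, event shape, and field names are illustrative assumptions, not a prescribed layout; the point is that you only supply the function, and the platform invokes it on demand.

    import json

    # The platform calls handler(event, context) on each invocation;
    # you never provision or manage the server it runs on.
    def handler(event, context):
        # 'event' carries the trigger payload (e.g. an API Gateway request body).
        name = event.get("name", "world")   # hypothetical field, for illustration only
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}"}),
        }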

What are Containers? 

Containers provide a standard way to package your application’s code, configurations, and dependencies into a single object. Containers share an operating system installed on the server and run as resource-isolated processes, ensuring quick, reliable, and consistent deployments, regardless of environment. 

Containers are lightweight and provide a consistent, portable software environment for applications to easily run and scale anywhere. Containers make it easier to manage your underlying infrastructure, whether on-premises or in the cloud, so you can focus on innovation and your business needs. 

Container orchestration (such as Kubernetes) automates the scheduling, deployment, networking, scaling, health monitoring, and management of your containers. 
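
As a hedged illustration of orchestration, the sketch below uses the official Kubernetes Python client to scale a Deployment; the scheduler then creates or removes pods to reach the desired count. The Deployment name "web", the namespace, and the availability of a kubeconfig are assumptions.

    from kubernetes import client, config

    # Load cluster credentials (use config.load_incluster_config() from inside a pod).
    config.load_kube_config()

    apps = client.AppsV1Api()
    # Ask the orchestrator for 3 replicas; it schedules or terminates pods to converge on that count.
    apps.patch_namespaced_deployment_scale(
        name="web",                      # hypothetical Deployment
        namespace="default",
        body={"spec": {"replicas": 3}},
    )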

Similarities: Serverless (Lambda) and Containers (Kubernetes) 

Serverless and containers are not identical technologies. However, they provide certain overlapping functionalities: 

  • Both are more efficient than virtual machines. 
  • Both allow you to deploy application code. 
  • Both abstract applications away from the host environment. 

Key Differences: Serverless (Lambda) vs Containers (Kubernetes)

General Difference  

Serverless

  • No servers to provision or manage. 
  • Automatically scale with usage. 
  • Pay as you use (never pay for idle).
  • Highly available. 

Container

  • A standard unit of software that packages up code and all its dependencies. 
  • The application runs quickly and reliably from one computing environment to another. 

Environment

Serverless

  • Underlying infrastructure managed by cloud provider. You cannot choose infrastructure on your own. 
  • No Patching headache. 
  • Can’t install software (e.g., a web server, app server, or custom software) in the underlying environment.

Container  

  • Complete control of the environment. You can choose the underlying infrastructure configuration, such as VM size, OS, AMI, etc. 
  • Requires management (e.g., patching, updates) and orchestration. 
  • Install almost any software. 
  • Prepackaged images with different software are available. 

Resource Configuration

Serverless

  • Memory and CPU are allocated proportionally (a boto3 sketch of tuning these settings follows at the end of this section).
  • Choose memory from 128 MB to 10 GB. 
  • vCPU cores are allocated proportionally to memory, up to 6 cores. 
  • Limited deployment package size: 250 MB unzipped, 50 MB zipped. 
  • No attached hard disk (only ephemeral /tmp storage).

Container 

  • You can configure memory and CPU allocation as needed. 
  • You can choose the underlying EC2 instance type appropriate to the workload. 

Note: AWS Lambda also supports container images as a deployment package. 
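
As a hedged boto3 sketch (the function name is hypothetical), this is roughly how the memory and timeout knobs are tuned on a Lambda function; CPU follows the memory setting, and there is no instance type or disk to pick.

    import boto3

    lambda_client = boto3.client("lambda")
    lambda_client.update_function_configuration(
        FunctionName="my-function",   # hypothetical function name
        MemorySize=1024,              # 128 MB - 10,240 MB; CPU scales proportionally with memory
        Timeout=60,                   # seconds, capped at 900 (15 minutes)
    )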

Scaling

Serverless

  • No scaling configuration is required; Lambda scales automatically. 
  • Usually, each concurrent request invokes a separate Lambda instance. 
  • Control scaling using a concurrency limit (see the sketch below). 
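
A minimal boto3 sketch of that concurrency control, assuming a hypothetical function name: reserving concurrency caps how many execution environments the function can scale out to.

    import boto3

    lambda_client = boto3.client("lambda")
    lambda_client.put_function_concurrency(
        FunctionName="my-function",          # hypothetical function name
        ReservedConcurrentExecutions=100,    # at most 100 concurrent execution environments
    )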

Container

  • One pod can serve more than one connection. 
  • Configure scaling using the HPA (Horizontal Pod Autoscaler); see the sketch below. 
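
A hedged sketch of creating such an HPA with the Kubernetes Python client, targeting a hypothetical Deployment named "web"; the CPU threshold and replica bounds are illustrative.

    from kubernetes import client, config

    config.load_kube_config()

    # autoscaling/v1 HPA: keep average CPU around 70% with 2-10 replicas of "web".
    hpa = client.V1HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name="web-hpa"),
        spec=client.V1HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V1CrossVersionObjectReference(
                api_version="apps/v1", kind="Deployment", name="web"
            ),
            min_replicas=2,
            max_replicas=10,
            target_cpu_utilization_percentage=70,
        ),
    )
    client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
        namespace="default", body=hpa
    )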

Cost

Serverless

  • With serverless computing, you only pay for what you use (pay-as-you-use). 
  • For highly burstable workloads, serverless can lead to significant cost savings. For workloads with consistent demand, serverless may not make much difference (and can sometimes cost more). 

Container

  • With containers, you pay for the underlying compute as long as it is running (pay-as-you-go). Containers are constantly running in most cases. 

Note: AWS Fargate provides serverless compute for containers. 

Run Duration

Serverless 

  • Serverless functions typically run for a short period of time (seconds or minutes) and are shut down as soon as they finish processing the current event. 
  • Maximum runtime: 15 minutes (900 seconds). 

Container 

  • No runtime limit constraints; containers can run for prolonged periods of time. 

Integration with other services

Serverless

  • (AWS Lambda) Natively integrated with S3, SNS, SQS, and many other AWS services; see the sketch below. 
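
For example, here is a hedged boto3 sketch of wiring an SQS queue to a function with an event source mapping, so the service polls the queue and invokes the function for you (the ARN and function name are illustrative):

    import boto3

    lambda_client = boto3.client("lambda")
    lambda_client.create_event_source_mapping(
        EventSourceArn="arn:aws:sqs:us-east-1:123456789012:my-queue",  # hypothetical queue
        FunctionName="my-function",                                    # hypothetical function
        BatchSize=10,                                                  # messages handed to each invocation
    )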

Container

  • (Amazon EKS) Integrated with a few AWS services, such as IAM and ALB (with the help of an ingress controller). 

High Availability  

Serverless

  • Lambda is inherently highly available out of the box (each Lambda function is automatically deployed across multiple AZs).
  • No need for a load balancer.

Container

  • You need to ensure high availability yourself (e.g., multi-AZ node groups, multiple replicas, and a load balancer). 

Logging and Monitoring 

Serverless

  • Integrated with CloudWatch for monitoring. 
  • Logs go to CloudWatch out of the box. 
  • Logs can be sent to other logging systems using Lambda extensions (see the sketch below). 
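
As a hedged sketch, anything the function prints lands in its CloudWatch log group (/aws/lambda/<function-name>), which you can query with boto3; the function name and filter pattern are illustrative.

    import boto3

    logs = boto3.client("logs")
    # Pull the most recent log lines containing "ERROR" from the function's log group.
    response = logs.filter_log_events(
        logGroupName="/aws/lambda/my-function",   # hypothetical function's log group
        filterPattern="ERROR",
        limit=20,
    )
    for event in response["events"]:
        print(event["timestamp"], event["message"])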

Container

  • (For Kubernetes) Fluent Bit/Fluentd can be used to send logs to CloudWatch, S3, Elasticsearch, Splunk, Datadog, etc. For monitoring, you can install an agent that works with Prometheus, Grafana, etc.

Portability/Vendor lock-in 

Serverless

  • With serverless, you are highly dependent on the platform that runs your code. 

Container

  • Containers can run anywhere; you just need a container runtime installed. 

For example, using AWS Lambda functions makes an app more dependent on the AWS platform, while Docker containers can be deployed on any platform that can run Docker. 

Supported Languages

Serverless

  • To run an application in a serverless model, the serverless runtime must explicitly support that language (different platforms support different languages). 

Container 

  • Applications can be containerized as long as the underlying host server supports the language they are written in. 

Development and Testability 

Serverless

  • Serverless is more difficult to run outside a cloud environment; you are limited to the cloud platform for running functions. 
  • You can’t perform the same level of testing against serverless functions. Local serverless frameworks do exist, but they are still complex and not widely adopted (a minimal local-test sketch follows below). 
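
One common workaround, sketched below, is to test the handler as a plain Python function with a hand-built event; this is an assumption-laden stand-in that does not reproduce the IAM, timeout, or concurrency behaviour of the real platform (the module name and event fields are hypothetical).

    # my_function is a hypothetical module containing the Lambda handler shown earlier.
    from my_function import handler

    def test_handler_returns_greeting():
        fake_event = {"name": "Abhay"}                 # hand-built event payload
        response = handler(fake_event, context=None)   # context is unused in this sketch
        assert response["statusCode"] == 200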

Container

  • Containers run the same no matter where they are deployed, so you can easily run and test your applications anywhere. 
  • Containers can easily be run in the cloud, in a local data center, or on a developer’s workstation (see the sketch below). 
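
A hedged sketch of that portability using the Docker SDK for Python: the same image that runs in a cluster can be started on a laptop; the image name and port mapping are illustrative.

    import docker

    client = docker.from_env()
    # Run the (hypothetical) application image locally and expose it on localhost:8080.
    container = client.containers.run(
        "myapp:latest",
        ports={"8080/tcp": 8080},
        detach=True,
    )
    print(container.logs(tail=10))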

Security 

Serverless 

  • Control which services/APIs can invoke Lambda using a resource-based policy (see the sketch below). 
  • Control which services Lambda can invoke using its IAM execution role. 
  • Multiple AuthN/AuthZ methods with API Gateway.
  • Functions using the same security group and subnet combination reuse the same ENI (IP). 
  • FedRAMP (High) compliant. 
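
A hedged boto3 sketch of such a resource-based policy statement: it lets S3, and only notifications from a specific bucket, invoke the function, while what the function itself may call is governed separately by its IAM execution role (names and ARNs are illustrative).

    import boto3

    lambda_client = boto3.client("lambda")
    lambda_client.add_permission(
        FunctionName="my-function",            # hypothetical function
        StatementId="allow-s3-invoke",
        Action="lambda:InvokeFunction",
        Principal="s3.amazonaws.com",
        SourceArn="arn:aws:s3:::my-bucket",    # hypothetical bucket
    )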

Container 

  • Pods support security groups and IAM roles to access other AWS resources. 
  • OPA (Open Policy Agent) to enforce semantic validation of objects during create, update, and delete operations. 
  • Each pod can have its own IP address from the VPC.
  • EKS on EC2 is FedRAMP compliant in GovCloud regions. 

Operational complexity 

Serverless

  • A serverless architecture has no backend servers to manage; there is effectively no infrastructure to operate. 

Container

  • Containers take longer to set up initially than serverless functions because it is necessary to configure system settings, libraries, and so on. It is possible to offload infrastructure management to a provider, but that isn’t always the case. 

Use Cases

When to use Serverless? 

  • If the traffic pattern is unpredictable and you want automatic scaling in response to demand. 
  • If you want to process and analyze data at scale but want to avoid managing the infrastructure and resources your application consumes. 
  • If you don’t want to pay when there is no traffic at all (i.e., you want a pay-as-you-use pricing model). 
  • If you want native integration with other AWS services such as SQS, SNS, S3, etc. For example, trigger a Lambda function when a message is added to an SQS queue. 
  • If you want to develop cloud-native applications based on event-driven architectures (especially for greenfield apps). 
  • Data processing: Serverless can enable data processing from multiple sources using simple functions. 
  • IoT: Serverless computing provides an event-driven and straightforward way for IoT devices and external systems to communicate asynchronously. 

When to use Containers? 

  • If the traffic pattern is unpredictable and you want rapid scaling (for sudden spikes) without any concurrency limit constraints.
  • If you want to use the operating system of your choice and retain full control over the underlying infrastructure. 
  • If you want to use software with specific version requirements (such as a web server or app server) and third-party software/tools, containers are a great place to start. 
  • Faster legacy application migration to the cloud: Containers make it easy to package entire applications and move them to the cloud without needing to make any code changes. 
  • Microservices: Because containers are portable, lightweight, provide process isolation, and are easy to deploy, they are an excellent fit for building loosely coupled microservices. 
  • Batch processing: Package batch processing and ETL jobs into containers to start jobs quickly and scale them dynamically in response to demand. 
  • Machine learning: Use containers to quickly scale machine learning models for training and inference and run them close to your data sources on any platform.
  • Deploy anywhere/hybrid applications: Containers let you standardize how code is deployed, making it easy to build workflows for applications that run on-premises and across multiple cloud environments. 
  • CI/CD: Containers provide DevOps teams with a way to eliminate environment differences between dev, QA, staging, and production deployments. As a result, they are highly useful in CI/CD workflows. 

Summary

Containers and serverless computing are two of the most popular methods for deploying applications. Containers work better for some use cases, while in others, serverless is what you need. Neither is better than the other in an absolute sense; each responds to specific needs. Serverless and containers can compensate for each other’s weaknesses, and they can coexist and be integrated, as needed, in a single project. 

Author

I'm Abhay Singh, an architect with 9 years of IT experience and an AWS Certified Solutions Architect.
