AWS Batch & AWS Beanstalk
Beanstalk deployment models
All at Once, Rolling, Rolling with Additional Batch, Immutable, Traffic Splitting
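The deployment policy is just an environment option setting; below is a minimal boto3 sketch (the environment name is a placeholder) that switches an existing environment to Immutable deployments:

```python
import boto3

eb = boto3.client("elasticbeanstalk")

# Change the deployment policy for an existing environment.
# Valid values: AllAtOnce, Rolling, RollingWithAdditionalBatch,
# Immutable, TrafficSplitting.
eb.update_environment(
    EnvironmentName="example-env",  # placeholder name
    OptionSettings=[
        {
            "Namespace": "aws:elasticbeanstalk:command",
            "OptionName": "DeploymentPolicy",
            "Value": "Immutable",
        }
    ],
)
```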
Types of Environment for AWS Beanstalk
Web Tier Environment and Worker Environment
AWS Elastic Beanstalk
compute service for deploying and scaling applications: you just upload code, and Beanstalk provisions and manages the underlying infrastructure (capacity, load balancing, scaling, health monitoring)
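A minimal sketch of that "just upload code" flow with boto3, assuming the source bundle is already zipped in S3 (application, environment, bucket, and key names are placeholders):

```python
import boto3

eb = boto3.client("elasticbeanstalk")

# Register a new application version from a source bundle in S3.
eb.create_application_version(
    ApplicationName="example-app",
    VersionLabel="v1",
    SourceBundle={"S3Bucket": "example-bucket", "S3Key": "app-v1.zip"},
)

# Point the running environment at the new version;
# Beanstalk handles provisioning and the rollout.
eb.update_environment(EnvironmentName="example-env", VersionLabel="v1")
```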
Worker Environment (Beanstalk)
Beanstalk installs a daemon on each instance that pulls messages from an SQS queue and passes them to the application to execute as tasks
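The daemon delivers each SQS message as an HTTP POST to a local path on the application (the default path is "/"); a 200 response tells it to delete the message. A minimal handler sketch, assuming a Flask app (the header name comes from the daemon's request conventions):

```python
from flask import Flask, request

application = Flask(__name__)  # Beanstalk's Python platform looks for "application"

@application.route("/", methods=["POST"])
def handle_task():
    body = request.get_data(as_text=True)                  # the SQS message body
    msg_id = request.headers.get("X-Aws-Sqsd-Msgid")       # set by the daemon
    print(f"Processing message {msg_id}: {body}")
    # ... do the actual work here ...
    return "", 200  # 200 => the daemon deletes the message from the queue
```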
What does AWS Batch execute on?
EC2 (including spot instances) and AWS Fargate
job definitions
define how a job is supposed to be run (e.g. associated IAM role, vCPU and memory requirements, container properties)
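A minimal boto3 sketch of registering a container job definition (the name, image, and role ARN are placeholders):

```python
import boto3

batch = boto3.client("batch")

response = batch.register_job_definition(
    jobDefinitionName="example-job-def",  # placeholder name
    type="container",
    containerProperties={
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/example:latest",  # placeholder image
        "jobRoleArn": "arn:aws:iam::123456789012:role/ExampleBatchJobRole",      # placeholder role
        "resourceRequirements": [
            {"type": "VCPU", "value": "1"},
            {"type": "MEMORY", "value": "2048"},  # MiB
        ],
        "command": ["python", "run_task.py"],
    },
)
print(response["jobDefinitionArn"])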
types of compute environments
managed (AWS provisions and scales instances within the min/max vCPUs you specify) and unmanaged (you provision and manage the compute resources yourself)
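A sketch of creating a managed EC2 compute environment with boto3 (subnet, security group, and role ARNs are placeholders):

```python
import boto3

batch = boto3.client("batch")

batch.create_compute_environment(
    computeEnvironmentName="example-managed-ce",  # placeholder name
    type="MANAGED",                               # AWS provisions the capacity
    computeResources={
        "type": "EC2",                            # could also be SPOT, FARGATE, FARGATE_SPOT
        "minvCpus": 0,
        "maxvCpus": 64,
        "instanceTypes": ["optimal"],
        "instanceRole": "arn:aws:iam::123456789012:instance-profile/ecsInstanceRole",  # placeholder
        "subnets": ["subnet-0123456789abcdef0"],       # placeholder
        "securityGroupIds": ["sg-0123456789abcdef0"],  # placeholder
    },
    serviceRole="arn:aws:iam::123456789012:role/AWSBatchServiceRole",  # placeholder
)
```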
job queues
jobs are submitted to a job queue, where they reside until they are scheduled onto a compute environment
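A minimal sketch of creating a queue tied to the compute environment above and submitting a job to it (all names are placeholders):

```python
import boto3

batch = boto3.client("batch")

batch.create_job_queue(
    jobQueueName="example-queue",  # placeholder name
    priority=1,
    computeEnvironmentOrder=[
        {"order": 1, "computeEnvironment": "example-managed-ce"},
    ],
)

batch.submit_job(
    jobName="example-job",
    jobQueue="example-queue",
    jobDefinition="example-job-def",
)
```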
What are the two categories of AWS Batch jobs?
simple (single) jobs or array jobs (e.g. parameter sweeps, Monte Carlo simulations)
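An array job is just a normal submission plus arrayProperties; each child job reads its index from the AWS_BATCH_JOB_ARRAY_INDEX environment variable to pick its slice of the sweep (queue and definition names are placeholders):

```python
import boto3

batch = boto3.client("batch")

batch.submit_job(
    jobName="example-sweep",
    jobQueue="example-queue",
    jobDefinition="example-job-def",
    arrayProperties={"size": 100},  # runs 100 child jobs with indices 0-99
)
# Inside the container, AWS_BATCH_JOB_ARRAY_INDEX holds the child's index.
```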
Batch jobs tips
generally use AWS Batch with Fargate; switch to EC2 when the scale of the work is large
Web Tier Environment (Beanstalk)
EC2 instances in an Auto Scaling group (with security groups) within an AZ sit behind an Elastic Load Balancer that exposes the environment's endpoint; the instances can also connect to an external DB
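A sketch of creating a load-balanced web server environment with boto3, assuming the application already exists (names are placeholders; the solution stack must be a real value from list_available_solution_stacks()):

```python
import boto3

eb = boto3.client("elasticbeanstalk")

eb.create_environment(
    ApplicationName="example-app",           # placeholder name
    EnvironmentName="example-web-env",       # placeholder name
    SolutionStackName="<solution-stack-name>",  # pick one from eb.list_available_solution_stacks()
    Tier={"Name": "WebServer", "Type": "Standard"},
    OptionSettings=[
        # Load-balanced environment: EC2 instances in an ASG behind an ELB.
        {"Namespace": "aws:elasticbeanstalk:environment",
         "OptionName": "EnvironmentType", "Value": "LoadBalanced"},
        {"Namespace": "aws:autoscaling:asg", "OptionName": "MinSize", "Value": "2"},
        {"Namespace": "aws:autoscaling:asg", "OptionName": "MaxSize", "Value": "4"},
    ],
)
```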