A typical AWS Batch workload might be triggered by input data being uploaded to S3; the upload then kicks off several stages.
Overview
- The upload event triggers a job submission to a job queue (e.g., via a Lambda function)
- The AWS Batch scheduler runs periodically to check for jobs in the job queue
- Once jobs are queued, the scheduler evaluates the Compute Environments and scales compute capacity so the jobs can run
- The job output is stored externally to the job (e.g., S3, EFS, or FSx for Lustre)