
Introducing Amazon MWAA Serverless | AWS Big Data Blog


Today, AWS announced Amazon Managed Workflows for Apache Airflow (MWAA) Serverless. This is a new deployment option for MWAA that eliminates the operational overhead of managing Apache Airflow environments while optimizing costs through serverless scaling. This new offering addresses key challenges that data engineers and DevOps teams face when orchestrating workflows: operational scalability, cost optimization, and access management.

With MWAA Serverless you can focus on your workflow logic rather than monitoring provisioned capacity. You can now submit your Airflow workflows for execution on a schedule or on demand, paying only for the actual compute time used during each task's execution. The service automatically handles all infrastructure scaling so that your workflows run efficiently regardless of load.

Beyond simplified operations, MWAA Serverless introduces an updated security model for granular control through AWS Identity and Access Management (IAM). Each workflow can now have its own IAM permissions, running in a VPC of your choosing so you can implement precise security controls without creating separate Airflow environments. This approach significantly reduces security management overhead while strengthening your security posture.

In this post, we demonstrate how to use MWAA Serverless to build and deploy scalable workflow automation solutions. We walk through practical examples of creating and deploying workflows, setting up observability through Amazon CloudWatch, and converting existing Apache Airflow DAGs (Directed Acyclic Graphs) to the serverless format. We also explore best practices for managing serverless workflows and show you how to implement monitoring and logging.

How does MWAA Serverless work?

MWAA Serverless processes your workflow definitions and executes them efficiently in service-managed Airflow environments, automatically scaling resources based on workflow demands. MWAA Serverless uses the Amazon Elastic Container Service (Amazon ECS) executor to run each individual task on its own ECS Fargate container, in either your VPC or a service-managed VPC. These containers then communicate back to their assigned Airflow cluster using the Airflow 3 Task API.



Figure 1: Amazon MWAA architecture

MWAA Serverless uses declarative YAML configuration files based on the popular open source DAG Factory format to enhance security through task isolation. You have two options for creating these workflow definitions: write the YAML directly, or convert existing Python DAGs with the conversion tool described later in this post.

This declarative approach provides two key benefits. First, because MWAA Serverless reads workflow definitions from YAML, it can determine task scheduling without running any workflow code. Second, it allows MWAA Serverless to grant execution permissions only when tasks run, rather than requiring broad permissions at the workflow level. The result is a more secure environment where task permissions are precisely scoped and time limited.

Service considerations for MWAA Serverless

MWAA Serverless has the following limitations that you should consider when deciding between serverless and provisioned MWAA deployments:

  • Operator support
    • MWAA Serverless only supports operators from the Amazon Provider Package.
    • To execute custom code or scripts, you need to invoke other AWS services through those operators.
  • User interface
    • MWAA Serverless operates without the Airflow web interface.
    • For workflow monitoring and management, we provide integration with Amazon CloudWatch and AWS CloudTrail.

Working with MWAA Serverless

Complete the following prerequisites and steps to use MWAA Serverless.

Prerequisites

Before you begin, verify you have the following requirements in place:

  • Access and permissions
    • An AWS account
    • AWS Command Line Interface (AWS CLI) version 2.31.38 or later installed and configured
    • The appropriate permissions to create and modify IAM roles and policies, including the following required IAM permissions:
      • airflow-serverless:CreateWorkflow
      • airflow-serverless:DeleteWorkflow
      • airflow-serverless:GetTaskInstance
      • airflow-serverless:GetWorkflowRun
      • airflow-serverless:ListTaskInstances
      • airflow-serverless:ListWorkflowRuns
      • airflow-serverless:ListWorkflows
      • airflow-serverless:StartWorkflowRun
      • airflow-serverless:UpdateWorkflow
      • iam:CreateRole
      • iam:DeleteRole
      • iam:DeleteRolePolicy
      • iam:GetRole
      • iam:PutRolePolicy
      • iam:UpdateAssumeRolePolicy
      • logs:CreateLogGroup
      • logs:CreateLogStream
      • logs:PutLogEvents
      • airflow:GetEnvironment
      • airflow:ListEnvironments
      • s3:DeleteObject
      • s3:GetObject
      • s3:ListBucket
      • s3:PutObject
      • s3:Sync
    • Access to an Amazon Virtual Private Cloud (VPC) with internet connectivity
  • Required AWS services – In addition to MWAA Serverless, you need access to the following AWS services:
    • Amazon MWAA to access your existing Airflow environment(s)
    • Amazon CloudWatch to view logs
    • Amazon S3 for DAG and YAML file management
    • AWS IAM to manage permissions
  • Development environment
  • Additional requirements
    • Basic familiarity with Apache Airflow concepts
    • Understanding of YAML syntax
    • Knowledge of AWS CLI commands

Note: Throughout this post, we use example values that you'll need to replace with your own:

  • Replace amzn-s3-demo-bucket with your S3 bucket name
  • Replace 111122223333 with your AWS account number
  • Replace us-east-2 with your AWS Region. MWAA Serverless is available in multiple AWS Regions. Check the List of AWS Services Available by Region for current availability.

Creating your first serverless workflow

Let's start by defining a simple workflow that gets a list of S3 objects and writes that list to a file in the same bucket. Create a new file called simple_s3_test.yaml with the following content:

simples3test:
  dag_id: simples3test
  schedule: 0 0 * * *
  tasks:
    list_objects:
      operator: airflow.providers.amazon.aws.operators.s3.S3ListOperator
      bucket: 'amzn-s3-demo-bucket'
      prefix: ''
      retries: 0
    create_object_list:
      operator: airflow.providers.amazon.aws.operators.s3.S3CreateObjectOperator
      data: '{{ ts_nodash }}'
      s3_bucket: 'amzn-s3-demo-bucket'
      s3_key: 'filelist.txt'
      dependencies: [list_objects]
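To see why this declarative format lets the service plan execution without running user code, here is a minimal Python sketch (not the service's actual implementation) that reads the schedule and derives a task order from the definition above, mirrored as the dict a YAML parser would load:

```python
# The simple_s3_test workflow definition, mirrored as a Python dict.
workflow = {
    "simples3test": {
        "dag_id": "simples3test",
        "schedule": "0 0 * * *",
        "tasks": {
            "list_objects": {
                "operator": "airflow.providers.amazon.aws.operators.s3.S3ListOperator",
            },
            "create_object_list": {
                "operator": "airflow.providers.amazon.aws.operators.s3.S3CreateObjectOperator",
                "dependencies": ["list_objects"],
            },
        },
    }
}

def execution_order(tasks):
    """Order tasks by their declared dependencies (Kahn's algorithm)."""
    remaining = {name: set(spec.get("dependencies", [])) for name, spec in tasks.items()}
    order = []
    while remaining:
        # Tasks whose dependencies are all satisfied can run next.
        ready = sorted(name for name, deps in remaining.items() if not deps)
        if not ready:
            raise ValueError("dependency cycle in workflow definition")
        for name in ready:
            order.append(name)
            del remaining[name]
        for deps in remaining.values():
            deps.difference_update(ready)
    return order

dag = workflow["simples3test"]
# Both the schedule string and the full task graph are available here
# without importing or executing any operator code.
```

Calling execution_order(dag["tasks"]) yields ['list_objects', 'create_object_list'], the same ordering the service must honor, and dag["schedule"] gives the cron expression, all without touching operator code.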

For this workflow to run, you need to create an execution role that has permissions to list and write to the above bucket. The role also needs to be assumable by MWAA Serverless. The following CLI commands create this role and its associated policy:

aws iam create-role \
--role-name mwaa-serverless-access-role \
--assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "AllowAirflowServerlessAssumeRole",
        "Effect": "Allow",
        "Principal": {
          "Service": "airflow-serverless.amazonaws.com"
        },
        "Action": "sts:AssumeRole",
        "Condition": {
          "StringEquals": {
            "aws:SourceAccount": "111122223333"
          }
        }
      }
    ]
  }'
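If you script role creation, it can be useful to sanity-check the trust policy document locally before calling the API. The following sketch (a purely local check, not an AWS API call; the service principal name is taken from the migration section later in this post) verifies that a policy lets airflow-serverless.amazonaws.com call sts:AssumeRole:

```python
import json

def allows_serverless_assume(policy_json):
    """Return True if the trust policy lets the MWAA Serverless
    service principal call sts:AssumeRole."""
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        # Principal.Service and Action may each be a string or a list.
        services = stmt.get("Principal", {}).get("Service", [])
        if isinstance(services, str):
            services = [services]
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        if ("airflow-serverless.amazonaws.com" in services
                and "sts:AssumeRole" in actions):
            return True
    return False
```

A check like this catches a missing statement before iam create-role or update-assume-role-policy is ever invoked.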

aws iam put-role-policy \
  --role-name mwaa-serverless-access-role \
  --policy-name mwaa-serverless-policy \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": [
          "s3:ListBucket",
          "s3:GetObject",
          "s3:PutObject"
        ],
        "Resource": [
          "arn:aws:s3:::amzn-s3-demo-bucket",
          "arn:aws:s3:::amzn-s3-demo-bucket/*"
        ]
      }
    ]
  }'

You then copy your YAML DAG to the same S3 bucket and create your workflow based on the Arn response from the above command.

aws s3 cp "simple_s3_test.yaml" \
s3://amzn-s3-demo-bucket/yaml/simple_s3_test.yaml

aws mwaa-serverless create-workflow \
--name simple_s3_test \
--definition-s3-location '{"Bucket": "amzn-s3-demo-bucket", "ObjectKey": "yaml/simple_s3_test.yaml"}' \
--role-arn arn:aws:iam::111122223333:role/mwaa-serverless-access-role \
--region us-east-2

The output of the last command returns a WorkflowArn value, which you then use to run the workflow:

aws mwaa-serverless start-workflow-run \
--workflow-arn arn:aws:airflow-serverless:us-east-2:111122223333:workflow/simple_s3_test-abc1234def \
--region us-east-2

The output returns a RunId value, which you then use to check the status of the workflow run that you just executed.

aws mwaa-serverless get-workflow-run \
--workflow-arn arn:aws:airflow-serverless:us-east-2:111122223333:workflow/simple_s3_test-abc1234def \
--run-id ABC123456789def \
--region us-east-2
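Because start-workflow-run is asynchronous, scripts typically poll get-workflow-run until the run reaches a terminal state. Here is a small sketch of such a loop; fetch_run stands in for whatever call you use (the AWS CLI via subprocess, or an SDK client), and the terminal RunState names follow the example responses shown later in this post:

```python
import time

# RunState values observed in GetWorkflowRun responses.
TERMINAL_STATES = {"SUCCESS", "FAILED"}

def wait_for_run(fetch_run, poll_seconds=10, timeout_seconds=600):
    """Poll a GetWorkflowRun-style callable until the run finishes or we time out."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        run = fetch_run()  # expected to return the GetWorkflowRun response as a dict
        state = run.get("RunDetail", {}).get("RunState")
        if state in TERMINAL_STATES:
            return state
        time.sleep(poll_seconds)
    raise TimeoutError("workflow run did not reach a terminal state in time")
```

Wiring fetch_run to your actual client call keeps the polling logic independent of how the API is invoked.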

If you need to make a change to your YAML, you can copy it back to S3 and run the update-workflow command.

aws s3 cp "simple_s3_test.yaml" \
s3://amzn-s3-demo-bucket/yaml/simple_s3_test.yaml

aws mwaa-serverless update-workflow \
--workflow-arn arn:aws:airflow-serverless:us-east-2:111122223333:workflow/simple_s3_test-abc1234def \
--definition-s3-location '{"Bucket": "amzn-s3-demo-bucket", "ObjectKey": "yaml/simple_s3_test.yaml"}' \
--role-arn arn:aws:iam::111122223333:role/mwaa-serverless-access-role \
--region us-east-2

Converting Python DAGs to YAML format

AWS has published a conversion tool that uses the open source Airflow DAG processor to serialize Python DAGs into the YAML DAG factory format. To install it, run the following:

pip3 install python-to-yaml-dag-converter-mwaa-serverless
dag-converter convert source_dag.py --output output_yaml_folder

For example, create the following DAG and name it create_s3_objects.py:

from datetime import datetime
from airflow import DAG
from airflow.models.param import Param
from airflow.providers.amazon.aws.operators.s3 import S3CreateObjectOperator

default_args = {
    'start_date': datetime(2024, 1, 1),
    'retries': 0
}

dag = DAG(
    'create_s3_objects',
    default_args=default_args,
    description='Create multiple S3 objects in a loop',
    schedule=None
)

# Set number of files to create
LOOP_COUNT = 3
s3_bucket = 'md-workflows-mwaa-bucket'
s3_prefix = 'test-files'

# Create multiple S3 objects using a loop
last_task = None
for i in range(1, LOOP_COUNT + 1):
    create_object = S3CreateObjectOperator(
        task_id=f'create_object_{i}',
        s3_bucket=s3_bucket,
        s3_key=f'{s3_prefix}/{i}.txt',
        data="{{ ds_nodash }}-{ lower }",
        replace=True,
        dag=dag
    )
    if last_task:
        last_task >> create_object
    last_task = create_object

Once you have installed python-to-yaml-dag-converter-mwaa-serverless, you run:

dag-converter convert "/path_to/create_s3_objects.py" --output "/path_to/yaml/"

Where the output will end with:

YAML validation successful, no errors found

YAML written to /path_to/yaml/create_s3_objects.yaml

And the resulting YAML will look like:

create_s3_objects:
  dag_id: create_s3_objects
  params: {}
  default_args:
    start_date: '2024-01-01'
    retries: 0
  schedule: None
  tasks:
    create_object_1:
      operator: airflow.providers.amazon.aws.operators.s3.S3CreateObjectOperator
      aws_conn_id: aws_default
      data: '{{ ds_nodash }}-{ lower }'
      encrypt: false
      outlets: []
      params: {}
      priority_weight: 1
      replace: true
      retries: 0
      retry_delay: 300.0
      retry_exponential_backoff: false
      s3_bucket: md-workflows-mwaa-bucket
      s3_key: test-files/1.txt
      task_id: create_object_1
      trigger_rule: all_success
      wait_for_downstream: false
      dependencies: []
    create_object_2:
      operator: airflow.providers.amazon.aws.operators.s3.S3CreateObjectOperator
      aws_conn_id: aws_default
      data: '{{ ds_nodash }}-{ lower }'
      encrypt: false
      outlets: []
      params: {}
      priority_weight: 1
      replace: true
      retries: 0
      retry_delay: 300.0
      retry_exponential_backoff: false
      s3_bucket: md-workflows-mwaa-bucket
      s3_key: test-files/2.txt
      task_id: create_object_2
      trigger_rule: all_success
      wait_for_downstream: false
      dependencies: [create_object_1]
    create_object_3:
      operator: airflow.providers.amazon.aws.operators.s3.S3CreateObjectOperator
      aws_conn_id: aws_default
      data: '{{ ds_nodash }}-{ lower }'
      encrypt: false
      outlets: []
      params: {}
      priority_weight: 1
      replace: true
      retries: 0
      retry_delay: 300.0
      retry_exponential_backoff: false
      s3_bucket: md-workflows-mwaa-bucket
      s3_key: test-files/3.txt
      task_id: create_object_3
      trigger_rule: all_success
      wait_for_downstream: false
      dependencies: [create_object_2]
  catchup: false
  description: Create multiple S3 objects in a loop
  max_active_runs: 16
  max_active_tasks: 16
  max_consecutive_failed_dag_runs: 0

Note that, because the YAML conversion happens after DAG parsing, the loop that creates the tasks runs first, and the resulting static list of tasks is written to the YAML document with their dependencies.
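You can see the same unrolling in plain Python: executing the loop at parse time leaves behind only a static chain of task records, which is what the converter serializes. A minimal sketch of that effect:

```python
# Mimic what parsing create_s3_objects.py produces: the loop runs once
# at parse time, leaving a static, linearly chained set of tasks behind.
LOOP_COUNT = 3

tasks = {}
last_task = None
for i in range(1, LOOP_COUNT + 1):
    task_id = f"create_object_{i}"
    # `last_task >> create_object` in the DAG becomes a recorded dependency here.
    tasks[task_id] = {"dependencies": [last_task] if last_task else []}
    last_task = task_id
```

After the loop, tasks holds exactly the three entries and the create_object_1 → create_object_2 → create_object_3 chain seen in the generated YAML; no loop construct survives into the serialized form.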

Migrating an MWAA environment's DAGs to MWAA Serverless

You can take advantage of a provisioned MWAA environment to develop and test your workflows and then move them to serverless to run efficiently at scale. Further, if your MWAA environment uses compatible MWAA Serverless operators, you can convert the entire environment's DAGs at once. The first step is to allow MWAA Serverless to assume the MWAA execution role via a trust relationship. This is a one-time operation for each MWAA execution role, and can be performed manually in the IAM console or with an AWS CLI command as follows:

MWAA_ENVIRONMENT_NAME="MyAirflowEnvironment"
MWAA_REGION=us-east-2

MWAA_EXECUTION_ROLE_ARN=$(aws mwaa get-environment --region $MWAA_REGION --name $MWAA_ENVIRONMENT_NAME --query 'Environment.ExecutionRoleArn' --output text)
MWAA_EXECUTION_ROLE_NAME=$(echo $MWAA_EXECUTION_ROLE_ARN | xargs basename)
MWAA_EXECUTION_ROLE_POLICY=$(aws iam get-role --role-name $MWAA_EXECUTION_ROLE_NAME --query 'Role.AssumeRolePolicyDocument' --output json | jq '.Statement[0].Principal.Service += ["airflow-serverless.amazonaws.com"] | .Statement[0].Principal.Service |= unique | .Statement += [{"Sid": "AllowAirflowServerlessAssumeRole", "Effect": "Allow", "Principal": {"Service": "airflow-serverless.amazonaws.com"}, "Action": "sts:AssumeRole", "Condition": {"StringEquals": {"aws:SourceAccount": "${aws:PrincipalAccount}"}, "ArnLike": {"aws:SourceArn": "arn:aws:*:*:${aws:PrincipalAccount}:workflow/*"}}}]')

aws iam update-assume-role-policy --role-name $MWAA_EXECUTION_ROLE_NAME --policy-document "$MWAA_EXECUTION_ROLE_POLICY"

Now we can loop through each successfully converted DAG and create a serverless workflow for each.

S3_BUCKET=$(aws mwaa get-environment --name $MWAA_ENVIRONMENT_NAME --query 'Environment.SourceBucketArn' --output text --region us-east-2 | cut -d':' -f6)

for file in /tmp/yaml/*.yaml; do MWAA_WORKFLOW_NAME=$(basename "$file" .yaml); \
      aws s3 cp "$file" s3://$S3_BUCKET/yaml/$MWAA_WORKFLOW_NAME.yaml --region us-east-2; \
      aws mwaa-serverless create-workflow --name $MWAA_WORKFLOW_NAME \
      --definition-s3-location "{\"Bucket\": \"$S3_BUCKET\", \"ObjectKey\": \"yaml/$MWAA_WORKFLOW_NAME.yaml\"}" --role-arn $MWAA_EXECUTION_ROLE_ARN \
      --region us-east-2; \
      done

To see a list of your created workflows, run:

aws mwaa-serverless list-workflows --region us-east-2

Monitoring and observability

MWAA Serverless workflow execution status is returned via the GetWorkflowRun function. The results return details for that particular run. If there are errors in the workflow definition, they are returned under RunDetail in the ErrorMessage field, as in the following example:

{
  "WorkflowVersion": "7bcd36ce4d42f5cf23bfee67a0f816c6",
  "RunId": "d58cxqdClpTVjeN",
  "RunType": "SCHEDULE",
  "RunDetail": {
    "ModifiedAt": "2025-11-03T08:02:47.625851+00:00",
    "ErrorMessage": "expected token ',', got 'create_test_table'",
    "TaskInstances": [],
    "RunState": "FAILED"
  }
}

Workflows that are properly defined, but whose tasks fail, will return "ErrorMessage": "Workflow execution failed":

{
  "WorkflowVersion": "0ad517eb5e33deca45a2514c0569079d",
  "RunId": "ABC123456789def",
  "RunType": "SCHEDULE",
  "RunDetail": {
    "StartedOn": "2025-11-03T13:12:09.904466+00:00",
    "CompletedOn": "2025-11-03T13:13:57.620605+00:00",
    "ModifiedAt": "2025-11-03T13:16:08.888182+00:00",
    "Duration": 107,
    "ErrorMessage": "Workflow execution failed",
    "TaskInstances": [
      "ex_5496697b-900d-4008-8d6f-5e43767d6e36_create_bucket_1"
    ],
    "RunState": "FAILED"
  }
}
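The two failure shapes above can be told apart programmatically when building automation around run status. A small sketch (field names are taken from the example responses; treating the exact "Workflow execution failed" string as the task-failure marker is an assumption based on those examples):

```python
def classify_failed_run(run):
    """Distinguish a definition error from a task failure in a
    GetWorkflowRun-style response dict."""
    detail = run.get("RunDetail", {})
    if detail.get("RunState") != "FAILED":
        return "not_failed"
    if detail.get("ErrorMessage") == "Workflow execution failed":
        # The workflow parsed fine; inspect TaskInstances and CloudWatch logs.
        return "task_failure"
    # The YAML itself failed to parse or validate; fix the definition.
    return "definition_error"
```

A dispatcher like this lets alerting route definition errors to whoever owns the YAML and task failures to the on-call runbook.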

MWAA Serverless task logs are stored in the CloudWatch log group /aws/mwaa-serverless/<workflow id>/ (where <workflow id> is the same string as the unique workflow id in the ARN of the workflow). For specific task log streams, you need to list the tasks for the workflow run and then get each task's information. You can combine these operations into a single CLI command.

aws mwaa-serverless list-task-instances \
  --workflow-arn arn:aws:airflow-serverless:us-east-2:111122223333:workflow/simple_s3_test-abc1234def \
  --run-id ABC123456789def \
  --region us-east-2 \
  --query 'TaskInstances[].TaskInstanceId' \
  --output text | xargs -n 1 -I {} aws mwaa-serverless get-task-instance \
  --workflow-arn arn:aws:airflow-serverless:us-east-2:111122223333:workflow/simple_s3_test-abc1234def \
  --run-id ABC123456789def \
  --task-instance-id {} \
  --region us-east-2 \
  --query '{Status: Status, StartedAt: StartedAt, LogStream: LogStream}'

Which would result in the following:

{
    "Status": "SUCCESS",
    "StartedAt": "2025-10-28T21:21:31.753447+00:00",
    "LogStream": "/aws/mwaa-serverless/simple_s3_test-abc1234def/workflow_id=simple_s3_test-abc1234def/run_id=ABC123456789def/task_id=list_objects/attempt=1.log"
}
{
    "Status": "FAILED",
    "StartedAt": "2025-10-28T21:23:13.446256+00:00",
    "LogStream": "/aws/mwaa-serverless/simple_s3_test-abc1234def/workflow_id=simple_s3_test-abc1234def/run_id=ABC123456789def/task_id=create_object_list/attempt=1.log"
}
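The LogStream value encodes the run context as key=value path segments, so tooling can split it apart rather than treating it as an opaque string. A sketch, assuming the naming convention shown above holds:

```python
def parse_log_stream(stream):
    """Extract workflow_id, run_id, task_id, and attempt from a
    task log stream name of the key=value path form shown above."""
    fields = {}
    for part in stream.split("/"):
        if "=" in part:
            key, _, value = part.partition("=")
            # The final segment carries a .log suffix; strip it.
            fields[key] = value.removesuffix(".log")
    return fields
```

Grouping get-task-instance results by these fields makes it easy to, say, fetch all attempts of one task or all streams of one run.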

At which point, you can use the CloudWatch LogStream output to debug your workflow.

You can view and manage your workflows in the Amazon MWAA Serverless console.

For an example that creates detailed metrics and a monitoring dashboard using AWS Lambda, Amazon CloudWatch, Amazon DynamoDB, and Amazon EventBridge, review the example in this GitHub repository.

Clean up resources

To avoid incurring ongoing costs, follow these steps to clean up all resources created during this tutorial:

  1. Delete MWAA Serverless workflows – Run this AWS CLI command to delete all workflows:
    aws mwaa-serverless list-workflows --query 'Workflows[*].WorkflowArn' --output text | while read -r workflow; do aws mwaa-serverless delete-workflow --workflow-arn $workflow; done

  2. Remove the IAM roles and policies created for this tutorial:
    aws iam delete-role-policy --role-name mwaa-serverless-access-role --policy-name mwaa-serverless-policy
    aws iam delete-role --role-name mwaa-serverless-access-role

  3. Remove the YAML workflow definitions from your S3 bucket:
    aws s3 rm s3://amzn-s3-demo-bucket/yaml/ --recursive

After completing these steps, verify in the AWS Management Console that all resources have been properly removed. Remember that CloudWatch Logs are retained by default and may need to be deleted separately if you want to remove all traces of your workflow executions.

If you encounter any errors during cleanup, verify that you have the necessary permissions and that the resources exist before attempting to delete them. Some resources may have dependencies that require them to be deleted in a specific order.

Conclusion

In this post, we explored Amazon MWAA Serverless, a new deployment option that simplifies Apache Airflow workflow management. We demonstrated how to create workflows using YAML definitions, convert existing Python DAGs to the serverless format, and monitor your workflows.

MWAA Serverless offers several key advantages:

  • No provisioning overhead
  • Pay-per-use pricing model
  • Automatic scaling based on workflow demands
  • Enhanced security through granular IAM permissions
  • Simplified workflow definitions using YAML

To learn more about MWAA Serverless, review the documentation.


About the authors

John Jackson


John has over 25 years of software experience as a developer, systems architect, and product manager at both startups and large companies, and is the AWS Principal Product Manager responsible for Amazon MWAA.
