amazon-mwaa-mg-001
amazon-mwaa-mg.pdf
1
Amazon Managed Workflows for Apache Airflow: Migration Guide

Copyright © 2025 Amazon Web Services, Inc. and/or its affiliates. All rights reserved.

Amazon's trademarks and trade dress may not be used in connection with any product or service that is not Amazon's, in any manner that is likely to cause confusion among customers, or in any manner that disparages or discredits Amazon. All other trademarks not owned by Amazon are the property of their respective owners, who may or may not be affiliated with, connected to, or sponsored by Amazon.

Table of Contents

What is the migration guide? ................................................... 1
Network architecture ........................................................... 2
    Amazon MWAA components ..................................................... 2
    Connectivity ............................................................... 4
Key considerations ............................................................. 5
    Authentication ............................................................. 5
    Execution role ............................................................. 5
Migrate to a new Amazon MWAA environment ....................................... 7
    Prerequisites .............................................................. 7
    Step one: Create a new environment ......................................... 7
    Step two: Migrate your workflow resources ................................. 14
    Step three: exporting the metadata ........................................ 15
    Step four: importing the metadata ......................................... 17
    Next steps ................................................................ 19
    Related resources ......................................................... 20
Migrate workloads from AWS Data Pipeline to Amazon MWAA ....................... 21
    Choosing Amazon MWAA ...................................................... 21
    Architecture and concept mapping ..........................................
22
    Example implementations ................................................... 24
    Pricing comparison ........................................................ 24
    Related resources ......................................................... 25
Document History .............................................................. 26

What is the Amazon MWAA migration guide?

Amazon Managed Workflows for Apache Airflow is a managed orchestration service for Apache Airflow that allows you to operate data pipelines in the cloud at scale. Amazon MWAA manages the provisioning and ongoing maintenance of Apache Airflow so you no longer need to worry about patching, scaling, or securing instances. Amazon MWAA automatically scales the compute resources that execute tasks to provide consistent performance on demand.

Amazon MWAA secures your data by default. Your workloads run in your own isolated and secure cloud environment using Amazon Virtual Private Cloud, and data is automatically encrypted using AWS Key Management Service.

Use this guide to migrate your self-managed Apache Airflow workflows to Amazon MWAA, or to upgrade an existing Amazon MWAA environment to a new Apache Airflow version. The migration tutorial describes how you can create or clone a new Amazon MWAA environment, migrate your workflow resources, and transfer your workflow metadata and logs to your new environment. Before you attempt the migration tutorial, we recommend reviewing the following topics.

• Network architecture
• Key considerations

Explore Amazon MWAA network architecture

The following section describes the main components that make up an Amazon MWAA environment, and the set of AWS services that each environment integrates with to manage its resources, keep your data secure, and provide monitoring and visibility for your workflows.

Topics
• Amazon MWAA components
• Connectivity

Amazon MWAA components

Amazon MWAA environments consist of the following four main components:

1. Scheduler — Parses and monitors all of your DAGs, and queues tasks for execution when a DAG's dependencies are met. Amazon MWAA deploys the scheduler as an AWS Fargate cluster with a minimum of 2 schedulers. You can increase the scheduler count up to five, depending on your workload. For more information about Amazon MWAA environment classes, see Amazon MWAA environment class.

2. Workers — One or more Fargate tasks that run your scheduled tasks. The number of workers for your environment is determined by a range between a minimum and maximum number that you specify. Amazon MWAA starts auto-scaling workers when the number of queued and running tasks is more than your existing workers can handle. When running and queued tasks sum to zero for more than two minutes, Amazon MWAA scales back the number of workers to its minimum. For more information about how Amazon MWAA handles auto-scaling workers, see Amazon MWAA automatic scaling.

3. Web server — Runs the Apache Airflow web UI. You can configure the web server with private or public network access. In both cases, access for your Apache Airflow users is controlled by the access control policy you define in AWS Identity and Access Management (IAM). For more information about configuring IAM access policies for your environment, see Accessing an Amazon MWAA environment.

4. Database — Stores metadata about the Apache Airflow environment and your workflows, including DAG run history. The database is a single-tenant Aurora PostgreSQL database managed by AWS, and accessible to the Scheduler and Workers' Fargate containers via a privately-secured Amazon VPC endpoint.
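If you want to see how these components are currently sized in an environment you plan to migrate from, you can query the service API. The following is a minimal sketch using the AWS SDK for Python (Boto3); the environment name and Region are placeholders, and the fields shown are only a small subset of what get_environment returns.

import boto3

# Inspect the component configuration of an existing environment.
mwaa = boto3.client("mwaa", region_name="us-east-1")
env = mwaa.get_environment(Name="MyAirflowEnvironment")["Environment"]

print("Airflow version   :", env["AirflowVersion"])
print("Environment class :", env["EnvironmentClass"])
print("Schedulers        :", env.get("Schedulers"))
print("Workers (min/max) :", env["MinWorkers"], "/", env["MaxWorkers"])
print("Web server access :", env["WebserverAccessMode"])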
amazon-mwaa-mg-002
amazon-mwaa-mg.pdf
2
Every Amazon MWAA environment also interacts with a set of AWS services to handle a variety of tasks, including storing and accessing DAGs and task dependencies, securing your data at rest, and logging and monitoring your environment. The following diagram demonstrates the different components of an Amazon MWAA environment.

Note: The service Amazon VPC is not a shared VPC. Amazon MWAA creates an AWS-owned VPC for every environment you create.

• Amazon S3 — Amazon MWAA stores all of your workflow resources, such as DAGs, requirements, and plugin files, in an Amazon S3 bucket. For more information about creating the bucket as part of environment creation, and uploading your Amazon MWAA resources, see Create an Amazon S3 bucket for Amazon MWAA in the Amazon MWAA User Guide.
• Amazon SQS — Amazon MWAA uses Amazon SQS for queueing your workflow tasks with a Celery executor.
• Amazon ECR — Amazon ECR hosts all Apache Airflow images. Amazon MWAA only supports AWS managed Apache Airflow images.
• AWS KMS — Amazon MWAA uses AWS KMS to ensure your data is secure at rest. By default, Amazon MWAA uses AWS managed AWS KMS keys, but you can configure your environment to use your own customer-managed AWS KMS key. For more information, see Customer managed keys for Data Encryption in the Amazon MWAA User Guide.
• CloudWatch — Amazon MWAA integrates with CloudWatch and delivers Apache Airflow logs and environment metrics to CloudWatch, allowing you to monitor your Amazon MWAA resources and troubleshoot issues.

Connectivity

Your Amazon MWAA environment needs access to all AWS services it integrates with. The Amazon MWAA execution role controls how access is granted to Amazon MWAA to connect to other AWS services on your behalf. For network connectivity, you can either provide public internet access to your Amazon VPC or create Amazon VPC endpoints. For more information on configuring Amazon VPC endpoints (AWS PrivateLink) for your environment, see Managing access to VPC endpoints on Amazon MWAA in the Amazon MWAA User Guide.

Amazon MWAA installs requirements on the scheduler and workers. If your requirements are sourced from a public PyPI repository, your environment needs connectivity to the internet to download the required libraries. For private environments, you can either use a private PyPI repository, or bundle the libraries in .whl files as custom plugins for your environment. When you configure the Apache Airflow web server in private mode, the Apache Airflow UI can only be accessed from your Amazon VPC through Amazon VPC endpoints. For more information about networking, see Networking in the Amazon MWAA User Guide.
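For a private environment, one common way to provide dependencies without internet access is to bundle the libraries as .whl files in plugins.zip, upload them together with a requirements.txt that references the local wheels, and then point the environment at the new object versions. The following is a hedged Boto3 sketch; the bucket, file names, and environment name are placeholders, and it assumes versioning is enabled on the source bucket.

import boto3

BUCKET = "my-mwaa-source-bucket"   # placeholder: the environment's Amazon S3 source bucket
ENV_NAME = "MyAirflowEnvironment"  # placeholder

s3 = boto3.client("s3")
mwaa = boto3.client("mwaa")

# Upload the dependency artifacts. With bucket versioning enabled,
# each upload returns a VersionId that the environment can pin to.
with open("plugins.zip", "rb") as f:
    plugins = s3.put_object(Bucket=BUCKET, Key="plugins.zip", Body=f)
with open("requirements.txt", "rb") as f:
    reqs = s3.put_object(Bucket=BUCKET, Key="requirements.txt", Body=f)

# Point the environment at the new versions so the schedulers and workers
# install the bundled wheels instead of reaching out to public PyPI.
mwaa.update_environment(
    Name=ENV_NAME,
    PluginsS3Path="plugins.zip",
    PluginsS3ObjectVersion=plugins["VersionId"],
    RequirementsS3Path="requirements.txt",
    RequirementsS3ObjectVersion=reqs["VersionId"],
)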
Connectivity 4 Amazon Managed Workflows for Apache Airflow Migration Guide Key considerations for migrating to a new MWAA environment Learn more about key considerations, such as authentication and the Amazon MWAA execution role, as you plan to migrate your Apache Airflow workloads to Amazon MWAA. Topics • Authentication • Execution role Authentication Amazon MWAA uses AWS Identity and Access Management (IAM) to control access to the Apache Airflow UI. You must create and manage IAM policies that grant your Apache Airflow users permission to access the web server and manage DAGs. You can manage both authentication and authorization for Apache Airflow's default roles using IAM across different accounts. You can further manage and restrict Apache Airflow users to access only a subset of your workflow DAGs by creating custom Airflow roles and mapping them to your IAM principals. For more information and a step-by-step tutorial, see Tutorial: Restricting an Amazon MWAA user's access to a subset of DAGs. You can also configure federated identities to access Amazon MWAA. For more information see the following. • Amazon MWAA environment with public access — Using Okta as an identity provider with Amazon MWAA on the AWS Compute Blog. • Amazon MWAA environment with private access — Accessing a private Amazon MWAA environment using federated identities. Execution role Amazon MWAA uses an execution role that grants permissions to your environment to access other AWS services. You can provide your workflow with access to AWS services by adding the relevant permissions to the role. If you choose the default option to create a new execution role when
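To illustrate the IAM-based access model described under Authentication above: an IAM principal that is allowed to call the Amazon MWAA API can exchange its credentials for a short-lived Apache Airflow web login token. The following is a minimal Boto3 sketch; the environment name and Region are placeholders, and the sign-in URL format is the one commonly shown in AWS examples.

import boto3

mwaa = boto3.client("mwaa", region_name="us-east-1")
token = mwaa.create_web_login_token(Name="MyAirflowEnvironment")

# The token is only valid for a short time; open the URL while it is fresh.
login_url = (
    f"https://{token['WebServerHostname']}/aws_mwaa/aws-console-sso"
    f"?login=true#{token['WebToken']}"
)
print(login_url)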
amazon-mwaa-mg-003
amazon-mwaa-mg.pdf
3
identities to access Amazon MWAA. For more information see the following. • Amazon MWAA environment with public access — Using Okta as an identity provider with Amazon MWAA on the AWS Compute Blog. • Amazon MWAA environment with private access — Accessing a private Amazon MWAA environment using federated identities. Execution role Amazon MWAA uses an execution role that grants permissions to your environment to access other AWS services. You can provide your workflow with access to AWS services by adding the relevant permissions to the role. If you choose the default option to create a new execution role when you Authentication 5 Amazon Managed Workflows for Apache Airflow Migration Guide first create the environment, Amazon MWAA attaches the minimal permissions needed to the role, except in the case of CloudWatch Logs for which Amazon MWAA adds all log groups automatically. Once the execution role is created, Amazon MWAA cannot manage its permission policies on your behalf. To update the execution role, you must edit the policy to add and remove permissions as needed. For example, you can integrate your Amazon MWAA environment with AWS Secrets Manager as a backend to securely store secrets and connection strings to use in your Apache Airflow workflows. To do so, attach the following permission policy to your environment's execution role. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "secretsmanager:GetResourcePolicy", "secretsmanager:GetSecretValue", "secretsmanager:DescribeSecret", "secretsmanager:ListSecretVersionIds" ], "Resource": "arn:aws:secretsmanager:us-west-2:012345678910:secret:*" }, { "Effect": "Allow", "Action": "secretsmanager:ListSecrets", "Resource": "*" } ] } Integrating with other AWS services follows a similar pattern: you add the relevant permission policy to your Amazon MWAA execution role, granting permission to Amazon MWAA to access the service. For more information about managing the Amazon MWAA execution role, and to see additional examples, visit Amazon MWAA execution role in the Amazon MWAA User Guide. Execution role 6 Amazon Managed Workflows for Apache Airflow Migration Guide Migrate to a new Amazon MWAA environment Explore the following steps to migrate your existing Apache Airflow workload to a new Amazon MWAA environment. You can use these steps to migrate from an older version of Amazon MWAA to a new version release, or migrate your self-managed Apache Airflow deployment to Amazon MWAA. This tutorial assumes you are migrating from an existing Apache Airflow v1.10.12 to a new Amazon MWAA running Apache Airflow v2.5.1, but you can use the same procedures to migrate from, or to different Apache Airflow versions. Topics • Prerequisites • Step one: Create a new Amazon MWAA environment running the latest supported Apache Airflow version • Step two: Migrate your workflow resources • Step three: Exporting the metadata from your existing environment • Step four: Importing the metadata to your new environment • Next steps • Related resources Prerequisites To be able to complete the steps and migrate your environment, you'll need the following: • An Apache Airflow deployment. This can be a self-managed or existing Amazon MWAA environment. • Docker installed for your local operating system. • AWS Command Line Interface version 2 installed. 
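One related note before you begin: the Secrets Manager policy shown earlier under Execution role only grants permissions. To make Apache Airflow actually read connections and variables from Secrets Manager, the environment also needs the matching configuration options. The following is a hedged Boto3 sketch, assuming the commonly used airflow/connections and airflow/variables prefixes; the environment name is a placeholder.

import boto3

mwaa = boto3.client("mwaa", region_name="us-west-2")

# Configure the Secrets Manager backend on the environment.
# The update puts the environment into an UPDATING state for a while.
mwaa.update_environment(
    Name="MyAirflowEnvironment",
    AirflowConfigurationOptions={
        "secrets.backend":
            "airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend",
        "secrets.backend_kwargs":
            '{"connections_prefix": "airflow/connections", '
            '"variables_prefix": "airflow/variables"}',
    },
)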
Step one: Create a new Amazon MWAA environment running the latest supported Apache Airflow version

You can create an environment using the detailed steps in Getting started with Amazon MWAA in the Amazon MWAA User Guide, or by using an AWS CloudFormation template. If you're migrating from an existing Amazon MWAA environment, and used an AWS CloudFormation template to create your old environment, you can change the AirflowVersion property to specify the new version.

MwaaEnvironment:
  Type: AWS::MWAA::Environment
  DependsOn: MwaaExecutionPolicy
  Properties:
    Name: !Sub "${AWS::StackName}-MwaaEnvironment"
    SourceBucketArn: !GetAtt EnvironmentBucket.Arn
    ExecutionRoleArn: !GetAtt MwaaExecutionRole.Arn
    AirflowVersion: 2.5.1
    DagS3Path: dags
    NetworkConfiguration:
      SecurityGroupIds:
        - !GetAtt SecurityGroup.GroupId
      SubnetIds:
        - !Ref PrivateSubnet1
        - !Ref PrivateSubnet2
    WebserverAccessMode: PUBLIC_ONLY
    MaxWorkers: !Ref MaxWorkerNodes
    LoggingConfiguration:
      DagProcessingLogs:
        LogLevel: !Ref DagProcessingLogs
        Enabled: true
      SchedulerLogs:
        LogLevel: !Ref SchedulerLogsLevel
        Enabled: true
      TaskLogs:
        LogLevel: !Ref TaskLogsLevel
        Enabled: true
      WorkerLogs:
        LogLevel: !Ref WorkerLogsLevel
        Enabled: true
      WebserverLogs:
        LogLevel: !Ref WebserverLogsLevel
        Enabled: true

Alternatively, if migrating from an existing Amazon MWAA environment, you can copy the following Python script that uses the AWS SDK for Python (Boto3) to clone your environment. You can also download the script.
amazon-mwaa-mg-004
amazon-mwaa-mg.pdf
4
Guide Python Script # This Python file uses the following encoding: utf-8 ''' Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. SPDX-License-Identifier: MIT-0 Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ''' from __future__ import print_function import argparse import json import socket import time import re import sys from datetime import timedelta from datetime import datetime import boto3 from botocore.exceptions import ClientError, ProfileNotFound from boto3.session import Session ENV_NAME = "" REGION = "" def verify_boto3(boto3_current_version): ''' check if boto3 version is valid, must be 1.17.80 and up return true if all dependenceis are valid, false otherwise ''' valid_starting_version = '1.17.80' if boto3_current_version == valid_starting_version: return True Step one: Create a new environment 9 Amazon Managed Workflows for Apache Airflow Migration Guide ver1 = boto3_current_version.split('.') ver2 = valid_starting_version.split('.') for i in range(max(len(ver1), len(ver2))): num1 = int(ver1[i]) if i < len(ver1) else 0 num2 = int(ver2[i]) if i < len(ver2) else 0 if num1 > num2: return True elif num1 < num2: return False return False def get_account_id(env_info): ''' Given the environment metadata, fetch the account id from the environment ARN ''' return env_info['Arn'].split(":")[4] def validate_envname(env_name): ''' verify environment name doesn't have path to files or unexpected input ''' if re.match(r"^[a-zA-Z][0-9a-zA-Z-_]*$", env_name): return env_name raise argparse.ArgumentTypeError("%s is an invalid environment name value" % env_name) def validation_region(input_region): ''' verify environment name doesn't have path to files or unexpected input REGION: example is us-east-1 ''' session = Session() mwaa_regions = session.get_available_regions('mwaa') if input_region in mwaa_regions: return input_region raise argparse.ArgumentTypeError("%s is an invalid REGION value" % input_region) def validation_profile(profile_name): ''' Step one: Create a new environment 10 Amazon Managed Workflows for Apache Airflow Migration Guide verify profile name doesn't have path to files or unexpected input ''' if re.match(r"^[a-zA-Z0-9]*$", profile_name): return profile_name raise argparse.ArgumentTypeError("%s is an invalid profile name value" % profile_name) def validation_version(version_name): ''' verify profile name doesn't have path to files or unexpected input ''' if re.match(r"[1-2].\d.\d", version_name): return version_name raise argparse.ArgumentTypeError("%s is an invalid version name value" % version_name) def validation_execution_role(execution_role_arn): ''' verify profile name doesn't have path to files or unexpected input ''' if 
re.match(r'(?i)\b((?:[a-z][\w-]+:(?:/{1,3}|[a-z0-9%])|www\d{0,3}[.]|[a-z0-9. \-]+[.][a-z]{2,4}/)(?:[^\s()<>]+|\(([^\s()<>]+|(\([^\s()<>]+\)))*\))+(?:\(([^\s()<>]+| (\([^\s()<>]+\)))*\)|[^\s`!()\[\]{};:\'".,<>?«»“”‘’]))', execution_role_arn): return execution_role_arn raise argparse.ArgumentTypeError("%s is an invalid execution role ARN" % execution_role_arn) def create_new_env(env): ''' method to duplicate env ''' mwaa = boto3.client('mwaa', region_name=REGION) print('Source Environment') print(env) if (env['AirflowVersion']=="1.10.12") and (VERSION=="2.2.2"): if env['AirflowConfigurationOptions'] ['secrets.backend']=='airflow.contrib.secrets.aws_secrets_manager.SecretsManagerBackend': print('swapping',env['AirflowConfigurationOptions']['secrets.backend']) env['AirflowConfigurationOptions'] ['secrets.backend']='airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend' env['LoggingConfiguration']['DagProcessingLogs'].pop('CloudWatchLogGroupArn') env['LoggingConfiguration']['SchedulerLogs'].pop('CloudWatchLogGroupArn') env['LoggingConfiguration']['TaskLogs'].pop('CloudWatchLogGroupArn') Step one: Create a new environment 11 Amazon Managed Workflows for Apache Airflow Migration Guide env['LoggingConfiguration']['WebserverLogs'].pop('CloudWatchLogGroupArn') env['LoggingConfiguration']['WorkerLogs'].pop('CloudWatchLogGroupArn') env['AirflowVersion']=VERSION env['ExecutionRoleArn']=EXECUTION_ROLE_ARN env['Name']=ENV_NAME_NEW env.pop('Arn') env.pop('CreatedAt') env.pop('LastUpdate') env.pop('ServiceRoleArn') env.pop('Status') env.pop('WebserverUrl') if not env['Tags']: env.pop('Tags') print('Destination Environment') print(env) return mwaa.create_environment(**env) def get_mwaa_env(input_env_name): # https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ mwaa.html#MWAA.Client.get_environment mwaa = boto3.client('mwaa', region_name=REGION) environment = mwaa.get_environment( Name=input_env_name )['Environment'] return environment def print_err_msg(c_err): '''short method to handle printing an error message if there is one''' print('Error Message: {}'.format(c_err.response['Error']['Message'])) print('Request ID: {}'.format(c_err.response['ResponseMetadata']['RequestId'])) print('Http code: {}'.format(c_err.response['ResponseMetadata']['HTTPStatusCode'])) # # Main # # Usage: # python3 clone_environment.py --envname MySourceEnv --envnamenew MyDestEnv --region us-west-2 --execution_role AmazonMWAA-MyDestEnv-ExecutionRole --version 2.2.2 # # based on https://github.com/awslabs/aws-support-tools/blob/master/MWAA/verify_env/ verify_env.py Step one: Create a new environment 12 Amazon Managed Workflows for Apache Airflow Migration Guide # if __name__ == '__main__': if sys.version_info[0] < 3: print("python2 detected, please use python3. Will try to run anyway") if not verify_boto3(boto3.__version__): print("boto3 version ", boto3.__version__, "is not valid for this script. 
Need 1.17.80 or higher") print("please run pip install boto3 --upgrade --user") sys.exit(1) parser = argparse.ArgumentParser() parser.add_argument('--envname', type=validate_envname, required=True, help="name of the source MWAA environment") parser.add_argument('--region', type=validation_region, default=boto3.session.Session().region_name, required=False, help="region, Ex: us-east-1") parser.add_argument('--profile', type=validation_profile, default=None, required=False, help="AWS CLI profile, Ex: dev") parser.add_argument('--version', type=validation_version, default="2.2.2", required=False, help="Airflow destination version, Ex: 2.2.2") parser.add_argument('--execution_role', type=validation_execution_role, default=None, required=True, help="New environment execution role ARN, Ex: arn:aws:iam::112233445566:role/service-role/AmazonMWAA-MyEnvironment-ExecutionRole") parser.add_argument('--envnamenew', type=validate_envname, required=True, help="name of the destination MWAA environment") args, _ = parser.parse_known_args() ENV_NAME = args.envname REGION = args.region PROFILE = args.profile VERSION = args.version EXECUTION_ROLE_ARN = args.execution_role ENV_NAME_NEW = args.envnamenew try: print("PROFILE",PROFILE) if PROFILE: boto3.setup_default_session(profile_name=PROFILE) env = get_mwaa_env(ENV_NAME) response = create_new_env(env) print(response) except ClientError as client_error: if client_error.response['Error']['Code'] == 'LimitExceededException': Step one: Create a new environment 13 Amazon Managed Workflows for Apache Airflow Migration Guide print_err_msg(client_error) print('please retry the script') elif client_error.response['Error']['Code'] in ['AccessDeniedException', 'NotAuthorized']: print_err_msg(client_error)
amazon-mwaa-mg-005
amazon-mwaa-mg.pdf
5
profile, Ex: dev") parser.add_argument('--version', type=validation_version, default="2.2.2", required=False, help="Airflow destination version, Ex: 2.2.2") parser.add_argument('--execution_role', type=validation_execution_role, default=None, required=True, help="New environment execution role ARN, Ex: arn:aws:iam::112233445566:role/service-role/AmazonMWAA-MyEnvironment-ExecutionRole") parser.add_argument('--envnamenew', type=validate_envname, required=True, help="name of the destination MWAA environment") args, _ = parser.parse_known_args() ENV_NAME = args.envname REGION = args.region PROFILE = args.profile VERSION = args.version EXECUTION_ROLE_ARN = args.execution_role ENV_NAME_NEW = args.envnamenew try: print("PROFILE",PROFILE) if PROFILE: boto3.setup_default_session(profile_name=PROFILE) env = get_mwaa_env(ENV_NAME) response = create_new_env(env) print(response) except ClientError as client_error: if client_error.response['Error']['Code'] == 'LimitExceededException': Step one: Create a new environment 13 Amazon Managed Workflows for Apache Airflow Migration Guide print_err_msg(client_error) print('please retry the script') elif client_error.response['Error']['Code'] in ['AccessDeniedException', 'NotAuthorized']: print_err_msg(client_error) print('please verify permissions used have permissions documented in readme') elif client_error.response['Error']['Code'] == 'InternalFailure': print_err_msg(client_error) print('please retry the script') else: print_err_msg(client_error) except ProfileNotFound as profile_not_found: print('profile', PROFILE, 'does not exist, please doublecheck the profile name') except IndexError as error: print("Error:", error) Step two: Migrate your workflow resources Apache Airflow v2 is a major version release. If you are migrating from Apache Airflow v1, you must prepare your workflow resources and verify the changes you make to your DAGs, requirements, and plugins. To do so, we recommend configuring a bridge version of Apache Airflow on your local operating system using Docker and the Amazon MWAA local runner. The Amazon MWAA local runner provides a command line interface (CLI) utility that replicates an Amazon MWAA environment locally. Whenever you're changing Apache Airflow versions, ensure that you reference the correct -- constraint URL in your requirements.txt. To migrate your workflow resources 1. Create a fork of the aws-mwaa-local-runner repository, and clone a copy of the Amazon MWAA local runner. 2. Checkout the v1.10.15 branch of the aws-mwaa-local-runner repository. Apache Airflow released v1.10.15 as a bridge release to assist in migrating to Apache Airflow v2, and although Amazon MWAA does not support v1.10.15, you can use the Amazon MWAA local runner to test your resources. Step two: Migrate your workflow resources 14 Amazon Managed Workflows for Apache Airflow Migration Guide 3. Use the Amazon MWAA local runner CLI tool to build the Docker image and run Apache Airflow locally. For more information, see the local runner README in the GitHub repository. 4. Using Apache Airflow running locally, follow the steps described in Upgrading from 1.10 to 2 in the Apache Airflow documentation website. a. To update your requirements.txt, follow the best practices we recommend in Managing Python dependencies, in the Amazon MWAA User Guide. b. If you have bundled your custom operators and sensors with your plugins for your existing Apache Airflow v1.10.12 environment, move them to your DAG folder. 
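To illustrate step 4.b above: in Apache Airflow v2 a custom operator no longer needs to be registered through the plugin manager. It can live as a regular Python module inside the DAG folder and be imported directly. The following is a minimal sketch; the module path, class name, and import are made up for the example.

# dags/custom/greeting_operator.py -- hypothetical module placed in the DAG folder
from airflow.models.baseoperator import BaseOperator

class GreetingOperator(BaseOperator):
    """A trivial custom operator that used to ship in plugins.zip."""

    def __init__(self, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        self.log.info("Hello, %s", self.name)

# In a DAG file next to it, the operator is imported like any other module:
#     from custom.greeting_operator import GreetingOperator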
For more information on module management best practices for Apache Airflow v2+, see Module Management in the Apache Airflow documentation website.

5. After you have made the required changes to your workflow resources, check out the v2.5.1 branch of the aws-mwaa-local-runner repository, and test your updated workflow DAGs, requirements, and custom plugins locally. If you're migrating to a different Apache Airflow version, you can use the appropriate local runner branch for your version instead.

6. After you have successfully tested your workflow resources, copy your DAGs, requirements.txt, and plugins to the Amazon S3 bucket you configured with your new Amazon MWAA environment.

Step three: Exporting the metadata from your existing environment

Apache Airflow metadata tables such as dag, dag_tag, and dag_code automatically populate when you copy the updated DAG files to your environment's Amazon S3 bucket and the scheduler parses them. Permission-related tables also populate automatically based on your IAM execution role permissions. You do not need to migrate them.

You can migrate data related to DAG history, variable, slot_pool, sla_miss, and, if needed, the xcom, job, and log tables. Task instance logs are stored in CloudWatch Logs under the airflow-{environment_name} log group. If you want to see the task instance logs for older runs, those logs must be copied over to the new environment's log group. We recommend that you move only a few days' worth of logs in order to reduce associated costs.

If you're migrating from an existing Amazon MWAA environment, there is no direct access to the metadata database. You must run a DAG to export the metadata from your existing Amazon MWAA environment to an Amazon S3 bucket of your choice. The following steps can also be used to export Apache Airflow metadata if you're migrating from a self-managed environment. After the data is exported, you can then run a DAG in your new environment to import the data. During the export and the import process, all other DAGs are paused.

To export the metadata from your existing environment

1. Create an Amazon S3 bucket using the AWS CLI to store the exported data. Replace the UUID and region with your information.
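If you prefer the AWS SDK over the AWS CLI for this step, the following Boto3 sketch creates the export bucket and also turns on default encryption, which is recommended when the exported metadata contains sensitive values. The bucket name and Region are placeholders.

import boto3

BUCKET = "mwaa-migration-example-uuid"  # placeholder; bucket names must be globally unique
REGION = "us-west-2"                    # placeholder

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket that will hold the exported metadata.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Encrypt exported metadata at rest by default.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}]
    },
)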
amazon-mwaa-mg-006
amazon-mwaa-mg.pdf
6
Workflows for Apache Airflow Migration Guide environment to an Amazon S3 bucket of your choice. The following steps can also be used to export Apache Airflow metadata if you're migrating from a self-managed environment. After the data is exported, you can then run a DAG in your new environment to import the data. During the export and the import process, all other DAGs are paused. To export the metadata from your existing environment 1. Create an Amazon S3 bucket using the AWS CLI to store the exported data. Replace the UUID and region with your information. $ aws s3api create-bucket \ --bucket mwaa-migration-{UUID}\ --region {region} Note If you are migrating sensitive data, such as connections you store in variables, we recommend that you enable default encryption for the Amazon S3 bucket. 2. Note Does not apply to migration from a self-managed environment. Modify the execution role of the existing environment and add the following policy to grant write access to the bucket you created in step one. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:PutObject*" ], "Resource": [ "arn:aws:s3:::mwaa-migration-{UUID}/*" ] } Step three: exporting the metadata 16 Amazon Managed Workflows for Apache Airflow Migration Guide ] } 3. Clone the amazon-mwaa-examples repository, and navigate to the metadata-migration subdirectory for your migration scenario. $ git clone https://github.com/aws-samples/amazon-mwaa-examples.git $ cd amazon-mwaa-examples/usecases/metadata-migration/existing-version-new-version/ 4. In export_data.py, replace the string value for S3_BUCKET with the Amazon S3 bucket you created to store exported metadata. S3_BUCKET = 'mwaa-migration-{UUID}' 5. Locate the requirements.txt file in the metadata-migration directory. If you already have a requirements file for your existing environment, add the additional requirements specified in requirements.txt to your file. If you do not have an existing requirements file, you can simply use the one provided in the metadata-migration directory. 6. Copy export_data.py to the DAG directory of the Amazon S3 bucket associated with your existing environment. If migrating from a self-managed environment, copy export_data.py to your /dags folder. 7. Copy your updated requirements.txt to the Amazon S3 bucket associated with your existing environment, then edit the environment to specify the new requirements.txt version. 8. After the environment is updated, access the Apache Airflow UI, unpause the db_export DAG, and trigger the workflow to run. 9. Verify that the metadata is exported to data/migration/existing-version_to_new- version/export/ in the mwaa-migration-{UUID} Amazon S3 bucket, with each table in it's own dedicated file. Step four: Importing the metadata to your new environment To import the metadata to your new environment 1. In import_data.py, replace the string values for the following with your information. • For migration from an existing Amazon MWAA environment: Step four: importing the metadata 17 Amazon Managed Workflows for Apache Airflow Migration Guide S3_BUCKET = 'mwaa-migration-{UUID}' OLD_ENV_NAME='{old_environment_name}' NEW_ENV_NAME='{new_environment_name}' TI_LOG_MAX_DAYS = {number_of_days} MAX_DAYS controls how many days worth of log files the workflow copies over to the new environment. • For migration from a self-managed environment: S3_BUCKET = 'mwaa-migration-{UUID}' NEW_ENV_NAME='{new_environment_name}' 2. (Optional) import_data.py copies only failed task logs. 
If you want to copy all task logs, modify the getDagTasks function, and remove ti.state = 'failed' as shown in the following code snippet. def getDagTasks(): session = settings.Session() dagTasks = session.execute(f"select distinct ti.dag_id, ti.task_id, date(r.execution_date) as ed \ from task_instance ti, dag_run r where r.execution_date > current_date - {TI_LOG_MAX_DAYS} and \ ti.dag_id=r.dag_id and ti.run_id = r.run_id order by ti.dag_id, date(r.execution_date);").fetchall() return dagTasks 3. Modify the execution role of your new environment and add the following policy. The permission policy allows Amazon MWAA to read from the Amazon S3 bucket where you exported the Apache Airflow metadata, and to copy task instance logs from existing log groups. Replace all placeholders with your information. Note If you are migrating from a self-managed environment, you must remove CloudWatch Logs related permissions from the policy. { Step four: importing the metadata 18 Amazon Managed Workflows for Apache Airflow Migration Guide "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "logs:GetLogEvents", "logs:DescribeLogStreams" ], "Resource": [ "arn:aws:logs:{region}:{account_number}:log- group:airflow-{old_environment_name}*" ] }, { "Effect": "Allow", "Action": [ "s3:GetObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::mwaa-migration-{UUID}", "arn:aws:s3:::mwaa-migration-{UUID}/*" ] } ] } 4. Copy import_data.py to the DAG directory of the Amazon S3 bucket associated with your new environment, then access the Apache Airflow UI to unpause the db_import DAG and trigger the workflow. The new DAG will appear in the Apache Airflow UI in a few minutes. 5. After the DAG run completes, verify that your DAG run history is copied over by accessing each individual DAG. Next steps • For more information about available Amazon MWAA environment classes and capabilities, see Amazon MWAA environment class in the Amazon MWAA User Guide. • For more information about how Amazon MWAA handles autoscaling workers, see Amazon MWAA automatic scaling in the Amazon MWAA User Guide. • For more information about the Amazon MWAA REST API, see the Amazon MWAA REST
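As an aside to step four: before you trigger the db_import DAG, you can get a rough idea of how much task-log history the workflow would copy from the old environment's CloudWatch log groups. The following is a Boto3 sketch; the old environment name and Region are placeholders.

import boto3

OLD_ENV_NAME = "MyOldAirflowEnvironment"  # placeholder
logs = boto3.client("logs", region_name="us-west-2")

# Sum the stored bytes across the old environment's Apache Airflow log groups.
total_bytes = 0
paginator = logs.get_paginator("describe_log_groups")
for page in paginator.paginate(logGroupNamePrefix=f"airflow-{OLD_ENV_NAME}"):
    for group in page["logGroups"]:
        total_bytes += group.get("storedBytes", 0)

print(f"About {total_bytes / (1024 ** 3):.2f} GiB stored in airflow-{OLD_ENV_NAME}* log groups")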
amazon-mwaa-mg-007
amazon-mwaa-mg.pdf
7
and trigger the workflow. The new DAG will appear in the Apache Airflow UI in a few minutes.

5. After the DAG run completes, verify that your DAG run history is copied over by accessing each individual DAG.

Next steps

• For more information about available Amazon MWAA environment classes and capabilities, see Amazon MWAA environment class in the Amazon MWAA User Guide.
• For more information about how Amazon MWAA handles autoscaling workers, see Amazon MWAA automatic scaling in the Amazon MWAA User Guide.
• For more information about the Amazon MWAA REST API, see the Amazon MWAA REST API.

Related resources

• Apache Airflow models (Apache Airflow Documentation) – Learn more about Apache Airflow metadata database models.

Migrate workloads from AWS Data Pipeline to Amazon MWAA

AWS launched the AWS Data Pipeline service in 2012. At that time, customers wanted a service that let them use a variety of compute options to move data between different data sources. As data transfer needs changed over time, so have the solutions to those needs. You now have the option to choose the solution that most closely meets your business requirements. You can migrate your workloads to any of the following AWS services:

• Use Amazon Managed Workflows for Apache Airflow (Amazon MWAA) to manage workflow orchestration for Apache Airflow.
• Use Step Functions to orchestrate workflows between multiple AWS services.
• Use AWS Glue to run and orchestrate Apache Spark applications.

The option you choose depends on your current workload on AWS Data Pipeline. This topic explains how to migrate from AWS Data Pipeline to Amazon MWAA.

Topics
• Choosing Amazon MWAA
• Architecture and concept mapping
• Example implementations
• Pricing comparison
• Related resources

Choosing Amazon MWAA

Amazon Managed Workflows for Apache Airflow (Amazon MWAA) is a managed orchestration service for Apache Airflow that lets you set up and operate end-to-end data pipelines in the cloud at scale. Apache Airflow is an open-source tool used to programmatically author, schedule, and monitor sequences of processes and tasks referred to as workflows. With Amazon MWAA, you can use Apache Airflow and the Python programming language to create workflows without having to manage the underlying infrastructure for scalability, availability, and security. Amazon MWAA automatically scales its workflow capacity to meet your needs, and is integrated with AWS security services to help provide you with fast and secure access to your data.

The following highlights some of the benefits of migrating from AWS Data Pipeline to Amazon MWAA:

• Enhanced scalability and performance – Amazon MWAA provides a flexible and scalable framework for defining and executing workflows. This allows users to handle large and complex workflows with ease, and take advantage of features such as dynamic task scheduling, data-driven workflows, and parallelism.
• Improved monitoring and logging – Amazon MWAA integrates with Amazon CloudWatch to enhance monitoring and logging of your workflows. Amazon MWAA automatically sends system metrics and logs to CloudWatch. This means you can track the progress and performance of your workflows in real time, and identify any issues that arise.
• Better integrations with AWS services and third-party software – Amazon MWAA integrates with a variety of other AWS services, such as Amazon S3, AWS Glue, and Amazon Redshift, as well as third-party software such as DBT, Snowflake, and Databricks. This lets you process, and transfer, data across different environments and services. • Open-source data pipeline tool – Amazon MWAA leverages the same open-source Apache Airflow product you are familiar with. Apache Airflow is a purpose-built tool designed to handle all aspects of data pipeline management, including ingestion, processing, transferring, integrity testing, quality checks, and ensuring data lineage. • Modern and flexible architecture – Amazon MWAA leverages containerization and cloud- native, serverless technologies. This means for more flexibility and portability, as well as easier deployment and management of your workflow environments. Architecture and concept mapping AWS Data Pipeline and Amazon MWAA have different architectures and components, which can affect the migration process and the way workflows are defined and executed. This section overviews architecture and components for both services, and highlights some of the key differences. Both AWS Data Pipeline and Amazon MWAA are fully managed services. When you migrate your workloads to Amazon MWAA you might need to learn new concepts to model your existing workflows using Apache Airflow. However, you will not need to manage infrastructure, patch workers, and manage operating system updates. The following table associates key concepts in AWS Data Pipeline with those in Amazon MWAA. Use this information as a starting point to design a migration plan. Architecture and concept mapping
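Before the table, a minimal Apache Airflow DAG sketch shows what several of these concepts look like on the Amazon MWAA side: a Python pipeline definition, an operator for an activity, a sensor for a precondition, a cron-based schedule, and task-level retries. The bucket, key, and callable are placeholders, and the S3 sensor assumes the Amazon provider package is available.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor

def process_data():
    print("processing the day's data")  # stand-in for the real activity

with DAG(
    dag_id="example_migrated_pipeline",        # pipeline definition, expressed in Python
    start_date=datetime(2023, 1, 1),
    schedule="0 6 * * *",                      # cron-based schedule
    catchup=False,
) as dag:
    wait_for_input = S3KeySensor(              # precondition -> sensor
        task_id="wait_for_input",
        bucket_name="my-input-bucket",         # placeholder
        bucket_key="input/{{ ds }}/data.csv",  # placeholder
    )

    transform = PythonOperator(                # activity -> operator
        task_id="transform",
        python_callable=process_data,
        retries=2,                             # attempt -> retry
        retry_delay=timedelta(minutes=5),
    )

    wait_for_input >> transform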
amazon-mwaa-mg-008
amazon-mwaa-mg.pdf
8
Pipeline definition
• AWS Data Pipeline: uses a JSON-based configuration file that defines the workflow.
• Amazon MWAA: uses Python-based Directed Acyclic Graphs (DAGs) that define the workflow.

Pipeline execution environment
• AWS Data Pipeline: workflows run on Amazon EC2 instances. AWS Data Pipeline provisions and manages these instances on your behalf.
• Amazon MWAA: uses Amazon ECS containerized environments to run tasks.

Pipeline components
• AWS Data Pipeline: Activities are processing tasks that run as part of the workflow. Preconditions contain conditional statements that must be true before an activity can run. A resource in AWS Data Pipeline refers to the AWS compute resource that performs the work that a pipeline activity specifies; Amazon EC2 and Amazon EMR are two available resources.
• Amazon MWAA: Operators (tasks) are the fundamental processing units of a workflow. Sensors (tasks) represent conditional statements that can wait for a resource or task to be completed before running. Using tasks in a DAG, you can define a variety of compute resources, including Amazon ECS, Amazon EMR, and Amazon EKS; Amazon MWAA executes Python operations on workers that run on Amazon ECS.

Pipeline execution
• AWS Data Pipeline: supports scheduling runs with regular rate-based and cron-based patterns. An instance refers to each run of the pipeline. An attempt refers to a retry of a failed operation.
• Amazon MWAA: supports scheduling with cron expressions and presets, as well as custom timetables. A DAG run refers to each run of an Apache Airflow workflow. Amazon MWAA supports retries that you define either at the DAG level or at the task level.

Example implementations

In many cases you will be able to reuse resources you are currently orchestrating with AWS Data Pipeline after migrating to Amazon MWAA. The following list contains example implementations using Amazon MWAA for the most common AWS Data Pipeline use cases.

• Running an Amazon EMR job (AWS workshop)
• Creating a custom plugin for Apache Hive and Hadoop (Amazon MWAA User Guide)
• Copying data from S3 to Redshift (AWS workshop)
• Executing a shell script on a remote Amazon ECS instance (Amazon MWAA User Guide)
• Orchestrating hybrid (on-prem) workflows (Blog post)

For additional tutorials and examples, see the following:
• Amazon MWAA tutorials
• Amazon MWAA code examples

Pricing comparison

Pricing for AWS Data Pipeline is based on the number of pipelines, as well as how much you use each pipeline. Activities that you run more than once a day (high frequency) cost $1 per month per activity. Activities that you run once a day or less (low frequency) cost $0.60 per month per activity. Inactive Pipelines are priced at $1 per pipeline.
For more information, see the AWS Data Pipeline pricing page.

Pricing for Amazon MWAA is based on the duration of time that your managed Apache Airflow environment exists, and any additional auto scaling required to provide more worker or scheduler capacity. You pay for your Amazon MWAA environment usage on an hourly basis (billed at one-second resolution), with varying fees depending on the size of the environment. Amazon MWAA auto-scales the number of workers based on your environment configuration. AWS calculates the cost of additional workers separately. For more information on the hourly cost of using various Amazon MWAA environment sizes, see the Amazon MWAA pricing page.

Related resources

For more information and best practices for using Amazon MWAA, see the following resources:
• The Amazon MWAA API reference
• Monitoring dashboards and alarms on Amazon MWAA
• Performance tuning for Apache Airflow on Amazon MWAA
amazon-mwaa-mg-009
amazon-mwaa-mg.pdf
9
Amazon MWAA Document History

The following table describes important additions to the Amazon MWAA migration guide, beginning in March 2022.

New topic on migrating workloads from AWS Data Pipeline to Amazon MWAA (April 14, 2023)
Added new information and guidance on migrating existing workloads from AWS Data Pipeline to Amazon MWAA. Use this information to help you design a migration plan.
• Migrate workloads from AWS Data Pipeline to Amazon MWAA

Amazon MWAA Migration Guide launch (March 7, 2022)
Amazon MWAA now offers detailed guidance on migrating to a new Amazon MWAA environment. The steps described in the Amazon MWAA Migration Guide apply to migrating from an existing Amazon MWAA environment, or from a self-managed Apache Airflow deployment.
• About the Amazon MWAA migration guide
amazon-mwaa-user-guide-001
amazon-mwaa-user-guide.pdf
1
User Guide Amazon Managed Workflows for Apache Airflow Copyright © 2025 Amazon Web Services, Inc. and/or its affiliates. All rights reserved. Amazon Managed Workflows for Apache Airflow User Guide Amazon Managed Workflows for Apache Airflow: User Guide Copyright © 2025 Amazon Web Services, Inc. and/or its affiliates. All rights reserved. Amazon's trademarks and trade dress may not be used in connection with any product or service that is not Amazon's, in any manner that is likely to cause confusion among customers, or in any manner that disparages or discredits Amazon. All other trademarks not owned by Amazon are the property of their respective owners, who may or may not be affiliated with, connected to, or sponsored by Amazon. Amazon Managed Workflows for Apache Airflow User Guide Table of Contents What Is Amazon MWAA? ................................................................................................................. 1 Features .......................................................................................................................................................... 1 Architecture .................................................................................................................................................... 2 Integration ...................................................................................................................................................... 4 Supported versions ...................................................................................................................................... 4 What's next? .................................................................................................................................................. 4 Quick start ....................................................................................................................................... 5 In this tutorial ............................................................................................................................................... 5 Prerequisites .................................................................................................................................................. 6 Step one: Save the AWS CloudFormation template locally ................................................................. 6 Step two: Create the stack using the AWS CLI .................................................................................... 16 Step three: Upload a DAG to Amazon S3 and run in the Apache Airflow UI ................................. 17 Step four: View logs in CloudWatch Logs ............................................................................................ 18 What's next? ................................................................................................................................................ 18 Get started ..................................................................................................................................... 19 Prerequisites ................................................................................................................................................ 19 About this guide ......................................................................................................................................... 19 Before you begin ........................................................................................................................................ 
20 Available regions ........................................................................................................................................ 20 Create a bucket .......................................................................................................................................... 21 Before you begin .................................................................................................................................. 21 Create the bucket ................................................................................................................................. 22 What's next? ........................................................................................................................................... 23 Create the VPC network ........................................................................................................................... 24 Prerequisites ........................................................................................................................................... 24 Before you begin .................................................................................................................................. 25 Options to create the Amazon VPC network .................................................................................. 25 What's next? ........................................................................................................................................... 37 Create an environment ............................................................................................................................. 37 Before you begin .................................................................................................................................. 38 Apache Airflow versions ...................................................................................................................... 38 Create an environment ........................................................................................................................ 39 What's next? ................................................................................................................................................ 23 Managing access ............................................................................................................................ 44 iii Amazon Managed Workflows for Apache Airflow User Guide Accessing an Amazon MWAA environment .......................................................................................... 44 How it works .......................................................................................................................................... 45 Full console access ................................................................................................................................ 46 Full API access ....................................................................................................................................... 53 Read-only console access .................................................................................................................... 57 Apache Airflow UI access .................................................................................................................... 57 Apache Airflow Rest API access ......................................................................................................... 
58 Apache Airflow CLI access .................................................................................................................. 59 Creating a JSON policy ........................................................................................................................ 60 Example use case .................................................................................................................................. 60 What's next? ........................................................................................................................................... 62 Service-linked role ..................................................................................................................................... 63 Service-linked role permissions for Amazon MWAA ...................................................................... 63 Creating a service-linked role for Amazon MWAA ......................................................................... 66 Editing a service-linked role for Amazon MWAA ........................................................................... 67 Deleting a service-linked role for Amazon MWAA ......................................................................... 67 Supported regions for Amazon MWAA service-linked roles ........................................................ 67 Policy updates ....................................................................................................................................... 67 Execution role ............................................................................................................................................. 68 Execution role overview ...................................................................................................................... 69 Create a new role ................................................................................................................................. 71 View and update an execution role policy ...................................................................................... 71 Grant access to Amazon S3 bucket with account-level public access block .............................. 73 Use Apache Airflow connections ....................................................................................................... 74 Sample policies ...................................................................................................................................... 74 What's next? ........................................................................................................................................... 80 Cross-service confused deputy prevention ........................................................................................... 80 Apache Airflow access modes ................................................................................................................. 81 Apache Airflow access modes ............................................................................................................ 82 Access modes overview ....................................................................................................................... 84 Setup for private and public access modes .................................................................................... 
85 Accessing the VPC endpoint for your Apache Airflow Web server (private network access) ........ 87 Accessing Apache Airflow ........ 88 Prerequisites ........ 88 Access ........ 88 AWS CLI ........ 89 Open the Apache Airflow UI ........ 89 Logging into Apache Airflow ........ 89 Create a web server access token ........ 89 Prerequisites ........ 90 Using the AWS CLI ........ 90 Using a bash script ........ 91 Using a Python script ........ 91 What's next? ........ 92 Setting up a custom domain ........ 92 Configure the custom domain ........ 93 Set up the networking infrastructure ........ 94 Apache Airflow CLI token ........ 98 Prerequisites ........ 99 Using the AWS CLI ........ 100 Using a curl script ........ 100 Using a bash script ........ 102 Using a Python script ........ 103 What's next?
........ 106 Using the Apache Airflow REST API ........ 106 Granting access to the Apache Airflow REST API: airflow:InvokeRestApi ........ 108 Calling the Apache Airflow REST API ........ 109 Creating a web server session token and calling the Apache Airflow REST API ........ 110 Apache Airflow CLI command reference ........ 113 Prerequisites ........ 113 What's changed in v2 ........ 114 Supported CLI commands ........ 114 Sample code ........ 117 Managing connections ........ 120 Overview ........ 120 Apache Airflow packages ........ 120 Provider packages for Apache Airflow v2.10.1 connections ........ 121 Provider packages for Apache Airflow v2.9.2 connections ........ 122 Provider packages for Apache Airflow v2.8.1 connections ........ 123 Provider packages for Apache Airflow v2.7.2 connections ........ 124 Provider packages for Apache Airflow v2.6.3 connections ........ 125 Provider packages for Apache Airflow v2.5.1 connections ........ 126 Provider
packages for Apache Airflow v2.4.3 connections ........ 127 Provider packages for Apache Airflow v2.2.2 connections ........ 127 Provider packages for Apache Airflow v2.0.2 connections ........ 128 Specifying newer provider packages ........ 128 Connection types ........ 129 Example connection URI string ........ 130 Example connection template ........ 130 Example using an HTTP connection template for a Jdbc connection ........ 132 Configuring Secrets Manager ........ 134 Step one: Provide Amazon MWAA with permission to access Secrets Manager secret keys ........
135 Step two: Create the Secrets Manager backend as an Apache Airflow configuration option ........ 136 Step three: Generate an Apache Airflow AWS connection URI string ........ 137 Step four: Add the variables in Secrets Manager ........ 140 Step five: Add the connection in Secrets Manager ........ 141 Sample code ........ 142 Resources ........ 143 What's next? ........ 143 Managing environments ........ 144 Configuring the environment class ........ 144 Environment capabilities ........ 144 Apache Airflow Schedulers ........ 147 Configuring worker auto scaling ........ 147 How worker scaling works ........ 148 Using the Amazon MWAA console ........ 148 Example high performance use case ........ 149 Troubleshooting tasks stuck in the running state ........ 150 What's next? ........ 151 Configuring web server auto scaling ........ 151 How web server scaling works ........ 151 Using the Amazon MWAA console ........ 151 Using configuration options ........ 152 Prerequisites ........ 153 How it works ........ 153 Using configuration options to load plugins in Apache Airflow v2 ........
154 Configuration options overview ...................................................................................................... 154 Configuration reference .................................................................................................................... 155 Examples and sample code .............................................................................................................. 161 What's next? ........................................................................................................................................ 163 Upgrading the version ............................................................................................................................ 163 Upgrade your workflow resources .................................................................................................. 164 Specify the new version .................................................................................................................... 165 Using a startup script ............................................................................................................................. 166 Configure a startup script ................................................................................................................. 166 Install Linux runtimes ........................................................................................................................ 170 Set environment variables ................................................................................................................ 171 Working with DAGs ..................................................................................................................... 175 Amazon S3 bucket overview ................................................................................................................. 175 Adding or updating DAGs ...................................................................................................................... 176 Prerequisites ........................................................................................................................................ 176 How it works ....................................................................................................................................... 177 What's changed in v2 ........................................................................................................................ 177 Testing DAGs using the Amazon MWAA CLI utility ..................................................................... 178 Uploading DAG code to Amazon S3 ............................................................................................... 178 Specifying the path to a DAGs folder ............................................................................................ 179 Viewing changes on your Apache Airflow UI ............................................................................... 180 What's next? ........................................................................................................................................ 180 Installing custom plugins ....................................................................................................................... 180 Prerequisites ........................................................................................................................................ 
181 How it works ........ 182 When to use the plugins ........ 182 Custom plugins overview ........ 183 Examples of custom plugins ........ 183 Creating a plugins.zip file ........ 193 Uploading plugins.zip to Amazon S3 ........ 194 Installing custom plugins on your environment ........ 195 Example use cases for plugins.zip ........ 196 What's next? ........ 196 Installing Python dependencies ........ 196 Prerequisites ........ 197 How it works ........ 198 Python dependencies overview ........ 198 Creating a requirements.txt file ........ 199 Uploading requirements.txt to Amazon S3 ........ 202 Installing Python dependencies on your environment ........ 203 Viewing logs for your requirements.txt ........ 204 What's next? ........ 205 Deleting files on Amazon S3 ........ 205 Prerequisites ........ 206 Versioning overview ........ 206 How it works ........ 206 Deleting a DAG on Amazon S3 ........ 207 Removing "current" plugins.zip or requirements.txt ........
207 Delete "non-current" plugins.zip or requirements.txt ................................................................. 208 Deleting files with lifecycles ............................................................................................................ 208 Example lifecycle policy .................................................................................................................... 208 What's next? ........................................................................................................................................ 209 Networking .................................................................................................................................. 210 About networking .................................................................................................................................... 210 Terms ..................................................................................................................................................... 211 What's supported ............................................................................................................................... 211 VPC infrastructure overview ............................................................................................................. 211 Example use cases for an Amazon VPC and Apache Airflow access mode ............................. 214 Security in your VPC ............................................................................................................................... 216 Terms ..................................................................................................................................................... 217 Security overview ............................................................................................................................... 217 Network access control lists (ACLs) ................................................................................................ 218 VPC security groups ........................................................................................................................... 218 VPC endpoint policies (private routing only) ................................................................................ 220 Managing access to VPC endpoints ..................................................................................................... 221 Pricing ................................................................................................................................................... 222 VPC endpoint overview ..................................................................................................................... 222 Permission to use other AWS services ........................................................................................... 223 Viewing VPC endpoints ..................................................................................................................... 223 Accessing the VPC endpoint for your Apache Airflow Web server (private network access) ................................................................................................................................................... 225 viii Amazon Managed Workflows for Apache Airflow User Guide VPC service endpoints in private Amazon VPCs ............................................................................... 
227 Pricing ................................................................................................................................................... 227 Private network and private routing .............................................................................................. 228 (Required) VPC endpoints ................................................................................................................. 229 Attaching
the required VPC endpoints ........ 229 (Optional) Enable private IP addresses for your Amazon S3 VPC interface endpoint ........ 233 Managing your own Amazon VPC endpoints ........ 234 Creating an environment in a shared Amazon VPC ........ 234 Tutorials ........ 244 Tutorial: AWS Client VPN ........ 244 Private network ........ 245 Use cases ........ 246 Before you begin ........ 246 Objectives ........
246 (Optional) Step one: Identify your VPC, CIDR rules, and VPC security(s) ........ 247 Step two: Create the server and client certificates ........ 248 Step three: Save the AWS CloudFormation template locally ........ 249 Step four: Create the Client VPN AWS CloudFormation stack ........ 251 Step five: Associate subnets to your Client VPN ........ 251 Step six: Add an authorization ingress rule to your Client VPN ........ 252 Step seven: Download the Client VPN endpoint configuration file ........ 252 Step eight: Connect to the AWS Client VPN ........ 254 What's next? ........ 255 Tutorial: Linux Bastion Host ........ 255 Private network ........ 255 Use cases ........ 256 Before you begin ........ 257 Objectives ........ 257 Step one: Create the bastion instance ........ 257 Step two: Create the ssh tunnel ........ 259 Step three: Configure the bastion security group as an inbound rule ........ 260 Step four: Copy the Apache Airflow URL ........ 261 Step five: Configure proxy settings ........ 261 Step six: Open the Apache Airflow UI ........ 264 What's next? ........ 264 Tutorial: Restricting users to a subset of DAGs ........ 264 Prerequisites ........ 265 Step one: Provide Amazon MWAA web server access to your IAM principal with the default Public Apache Airflow role. ........ 265 Step two: Create a new Apache Airflow custom role ........
266 Step three: Assign the role you created to your Amazon MWAA user ..................................... 267 Next steps ............................................................................................................................................ 268 Related resources ................................................................................................................................ 268 Tutorial: Automate managing your own environment endpoints .................................................. 268 Prerequisites ........................................................................................................................................ 269 Create the Amazon VPC .................................................................................................................... 269 Create the Lambda function ............................................................................................................ 270 Create the EventBridge rule ............................................................................................................. 270 Create the environment .................................................................................................................... 271 Code examples ............................................................................................................................. 273 Import variables DAG .............................................................................................................................. 274 Version .................................................................................................................................................. 274 Prerequisites ........................................................................................................................................ 274 Permissions .......................................................................................................................................... 274 Dependencies ....................................................................................................................................... 274 Code sample ........................................................................................................................................ 275 What's next? ........................................................................................................................................ 276 Using the SSHOperator ........................................................................................................................ 276 Version .................................................................................................................................................. 277 Prerequisites ........................................................................................................................................ 277 Permissions .......................................................................................................................................... 277 Requirements ....................................................................................................................................... 278 Copy your secret key to Amazon S3 .............................................................................................. 278 Create a new Apache Airflow connection ..................................................................................... 
278 Code sample ........ 279 Apache Airflow Snowflake connection in Secrets Manager ........ 281 Version ........ 281 Prerequisites ........ 281 Permissions ........ 281 Requirements ........ 282 Code sample ........ 282 What's next? ........ 283 Using a DAG to write custom metrics ........ 283 Version ........ 284 Prerequisites ........ 284 Permissions ........ 284 Dependencies ........ 284 Code example ........ 284 Aurora PostgreSQL database cleanup ........ 287 Version ........ 288 Prerequisites ........ 288 Dependencies ........ 288 Code sample ........ 288 Exporting environment metadata to Amazon S3 ........ 291 Version ........ 292 Prerequisites ........ 292 Permissions ........
292 Requirements ........ 293 Code sample ........ 293 Using an Apache Airflow variable in Secrets Manager ........ 295 Version ........ 296 Prerequisites ........ 296 Permissions ........ 296 Requirements ........ 296 Code sample ........ 297 What's next? ........ 298 Using an Apache Airflow connection in Secrets Manager ........ 298 Version ........ 298 Prerequisites ........ 298 Permissions ........ 299 Requirements ........ 299 Code sample ........ 299 What's next? ........ 302 Custom plugin with Oracle ........ 302 Version ........ 303 Prerequisites ........ 303 Permissions ........ 303 Requirements ........ 303 Code sample ........ 304 Create the custom plugin ........
305 Airflow configuration options .......................................................................................................... 308 What's next? ........................................................................................................................................ 308 Custom plugin with environment variables ....................................................................................... 308 Version .................................................................................................................................................. 309 Prerequisites ........................................................................................................................................ 309 Permissions .......................................................................................................................................... 309 Requirements ....................................................................................................................................... 309 Custom plugin ..................................................................................................................................... 309 Plugins.zip ............................................................................................................................................ 310 Airflow configuration options .......................................................................................................... 310 What's next? ........................................................................................................................................ 310 Changing a DAG's timezone .................................................................................................................. 311 Version .................................................................................................................................................. 311 Prerequisites ........................................................................................................................................ 311 Permissions .......................................................................................................................................... 311 Create a plugin to change the timezone in Airflow logs ........................................................... 312 Create a plugins.zip ..................................................................................................................... 312 Code sample ........................................................................................................................................ 313 What's next? ........................................................................................................................................ 314
Refreshing an AWS CodeArtifact token at runtime ........ 314 Version ........
315 Prerequisites ........ 315 Permissions ........ 315 Code sample ........ 316 What's next? ........ 317 Custom plugin with Apache Hive and Hadoop ........ 317 Version ........ 318 Prerequisites ........ 318 Permissions ........ 318 Requirements ........ 296 Download dependencies ........ 319 Custom plugin ........ 320 Plugins.zip ........ 320 Code sample ........ 321 Airflow configuration options ........ 321 What's next? ........ 321 Custom plugin to patch PythonVirtualenvOperator ........ 322 Version ........ 322 Prerequisites ........ 322 Permissions ........ 323 Requirements ........ 323 Custom plugin sample code ........ 323 Plugins.zip ........ 325 Code sample ........
325 Airflow configuration options ........ 327 What's next? ........ 328 Invoking DAGs with Lambda ........ 328 Version ........ 328 Prerequisites ........ 328 Permissions ........ 329 Dependencies ........ 329 Code example ........ 329 Invoking DAGs in different environments ........ 331 Version ........ 331 Prerequisites ........ 331 Permissions ........ 332 Dependencies ........ 332 Code example ........ 332 Amazon RDS server ........ 334 Version ........ 334 Prerequisites ........ 335 Dependencies ........ 335 Apache Airflow v2 connection ........ 335 Code sample ........ 336 What's next? ........ 338 Amazon EMR integration ........ 339 Version ........
339 Code sample ........................................................................................................................................ 339 Amazon EKS (eksctl) ................................................................................................................................ 342 Version .................................................................................................................................................. 342 xiii Amazon Managed Workflows for Apache Airflow User Guide Prerequisites ........................................................................................................................................ 342 Create a public key for Amazon EC2 .............................................................................................. 343 Create the cluster ............................................................................................................................... 343 Create a mwaa namespace ................................................................................................................ 344 Create a role for the mwaa namespace .......................................................................................... 344 Create and attach an IAM role for the Amazon EKS cluster ...................................................... 345 Create the requirements.txt file ...................................................................................................... 349 Create an identity mapping for Amazon EKS ............................................................................... 349 Create the kubeconfig ................................................................................................................... 349 Create a DAG ....................................................................................................................................... 350 Add the DAG and kube_config.yaml to the Amazon S3 bucket ......................................... 352 Enable and trigger the example ...................................................................................................... 352 Using the ECSOperator ........................................................................................................................ 353 Version .................................................................................................................................................. 353 Prerequisites ........................................................................................................................................ 353 Permissions .......................................................................................................................................... 354 Create an Amazon ECS cluster ........................................................................................................ 355 Code sample ........................................................................................................................................ 360 Using dbt with Amazon MWAA ............................................................................................................ 363 Version .................................................................................................................................................. 363 Prerequisites ........................................................................................................................................ 
363 Dependencies ....................................................................................................................................... 364 Upload a dbt project to Amazon S3 .............................................................................................. 365 Use a DAG to verify dbt dependency installation ....................................................................... 366 Use a DAG to run a dbt project ....................................................................................................... 366 AWS blogs and tutorials ......................................................................................................................... 367 Best practices ............................................................................................................................... 368 Performance tuning for Apache Airflow ............................................................................................. 368 Adding an Apache Airflow configuration option ......................................................................... 368 Apache Airflow scheduler ................................................................................................................. 369 DAG folders .......................................................................................................................................... 374 DAG files ............................................................................................................................................... 376 Tasks ...................................................................................................................................................... 380 Managing Python dependencies .......................................................................................................... 385 Testing DAGs using the Amazon MWAA CLI utility ..................................................................... 385 Installing Python dependencies using PyPi.org Requirements File Format ............................ 386 xiv Amazon Managed Workflows for Apache Airflow User Guide Enabling logs on the Amazon MWAA console ............................................................................. 393 Viewing logs on the CloudWatch Logs console ........................................................................... 393 Viewing errors in the Apache Airflow UI ....................................................................................... 394 Example requirements.txt scenarios ....................................................................................... 395 Monitoring and metrics ............................................................................................................... 396 Overview .................................................................................................................................................... 396 Amazon CloudWatch overview ........................................................................................................ 397 AWS CloudTrail overview .................................................................................................................. 397 Viewing audit logs ................................................................................................................................... 397 Creating a trail in CloudTrail ............................................................................................................ 
398 Viewing events with CloudTrail Event History ............................................................................. 398 Example trail for CreateEnvironment ....................................................................................... 398 What's next? ........................................................................................................................................ 400 Viewing Airflow logs ............................................................................................................................... 400 Pricing ................................................................................................................................................... 400 Before you begin ................................................................................................................................ 401 Log types .............................................................................................................................................. 401 Enabling Apache Airflow logs .......................................................................................................... 401 Viewing Apache Airflow logs ........................................................................................................... 402 Example scheduler logs ..................................................................................................................... 402 What's next? ........................................................................................................................................ 403 Monitoring dashboards and alarms ..................................................................................................... 403 Metrics ................................................................................................................................................... 404 Alarm states overview ....................................................................................................................... 404 Example custom dashboards and alarms ...................................................................................... 404 Deleting metrics and dashboards ................................................................................................... 410 What's next? ........................................................................................................................................ 410 Apache Airflow v2 environment metrics ............................................................................................ 410 Terms ..................................................................................................................................................... 411 Dimensions ........................................................................................................................................... 411 Accessing metrics in the CloudWatch console .............................................................................. 412 Apache Airflow metrics available in CloudWatch ........................................................................ 413 Choosing which metrics are reported ............................................................................................ 429 What's next? ........................................................................................................................................ 
430 Container, queue, and database metrics ............................................................................................. 430 Terms ..................................................................................................................................................... 431 xv Amazon Managed Workflows for Apache Airflow User Guide Dimensions ........................................................................................................................................... 431 Accessing metrics ................................................................................................................................ 432 List of metrics ..................................................................................................................................... 433 Security ........................................................................................................................................ 436 Data Protection ........................................................................................................................................ 437 Encryption ............................................................................................................................................ 437 Using customer managed keys ........................................................................................................ 439 AWS Identity and Access Management ............................................................................................... 443
Audience ............................................................................................................................... 444 Authenticating With Identities ......................................................................................... 444 Managing Access Using Policies ...................................................................................... 447 Allowing users to view their own permissions ............................................................. 450 Troubleshooting Amazon Managed Workflows for Apache Airflow identity and access ......
451 How Amazon MWAA works with IAM ............................................................................................ 452 Compliance Validation ............................................................................................................................ 457 Resilience ................................................................................................................................................... 458 Infrastructure Security ............................................................................................................................ 458 Configuration and Vulnerability Analysis ........................................................................................... 459 Best practices ............................................................................................................................................ 459 Security best practices in Apache Airflow ..................................................................................... 460 Versions ........................................................................................................................................ 462 About Amazon MWAA versions ............................................................................................................ 462 Latest version ............................................................................................................................................ 462 Apache Airflow versions ......................................................................................................................... 462 Apache Airflow components ................................................................................................................. 464 Schedulers ............................................................................................................................................ 464 Workers ................................................................................................................................................. 464 Upgrading the Apache Airflow version ............................................................................................... 465 Apache Airflow deprecated versions ................................................................................................... 465 Apache Airflow version support and FAQ .......................................................................................... 465 Frequently asked questions .............................................................................................................. 466 Endpoints and quotas ................................................................................................................. 468 Service endpoints ..................................................................................................................................... 468 Service quotas .......................................................................................................................................... 468 Increasing quotas ..................................................................................................................................... 469 FAQs .............................................................................................................................................. 
470 xvi Amazon Managed Workflows for Apache Airflow User Guide Supported versions .................................................................................................................................. 471 Apache Airflow support .................................................................................................................... 471 Apache Airflow versions .................................................................................................................... 471 Python version .................................................................................................................................... 471 Use cases ................................................................................................................................................... 473 When should I use AWS Step Functions vs. Amazon MWAA? ................................................... 473 Environment specifications .................................................................................................................... 473 How much task storage is available to each environment? ....................................................... 473 Default OS ............................................................................................................................................ 473 Custom images .................................................................................................................................... 473 HIPAA compliance .............................................................................................................................. 473 Does Amazon MWAA support Spot Instances? ............................................................................ 474 Custom domain ................................................................................................................................... 474 SSH access ............................................................................................................................................ 474 Self-referencing rule .......................................................................................................................... 475 Custom metrics ................................................................................................................................... 475 Store data ............................................................................................................................................ 475 Worker quota ....................................................................................................................................... 475 Shared Amazon VPCs ........................................................................................................................ 475 Shared Amazon VPCs ........................................................................................................................ 476 Metrics ........................................................................................................................................................ 476 Worker metrics .................................................................................................................................... 476 Custom metrics ................................................................................................................................... 
476 DAGs, Operators, Connections, and other questions ........................................................................ 476 PythonVirtualenvOperator ............................................................................................................... 476 How long does it take Amazon MWAA to recognize a new DAG file? ...................................... 476 Why is my DAG file not picked up by Apache Airflow? .............................................................. 477 Remove plugins.zip or requirements.txt ........................................................................................ 477 Remove plugins.zip or requirements.txt ........................................................................................ 477 Can I use AWS Database Migration Service (DMS) Operators? ................................................. 477 When I access the Airflow REST API using the AWS credentials, can I increase the throttling limit to more than 10 transactions per second (TPS)? ............................................................... 478 Troubleshooting ........................................................................................................................... 479 Apache Airflow v2 ................................................................................................................................... 482 Connections ......................................................................................................................................... 482 Web server ........................................................................................................................................... 485 xvii Amazon Managed Workflows for Apache Airflow User Guide Tasks ...................................................................................................................................................... 486 CLI .......................................................................................................................................................... 488 Operators .............................................................................................................................................. 489 Apache Airflow v1 ................................................................................................................................... 491 Updating requirements.txt ............................................................................................................... 492 Broken DAG .......................................................................................................................................... 492 Operators .............................................................................................................................................. 494 Connections ......................................................................................................................................... 495 Web server ........................................................................................................................................... 497 Tasks ...................................................................................................................................................... 498 CLI .......................................................................................................................................................... 
500 Amazon MWAA Create/Update ............................................................................................................. 501 Updating requirements.txt ....................................................................................................... 502 Plugins ................................................................................................................................................... 503 Create bucket ...................................................................................................................................... 503 Create environment ........................................................................................................................... 504 Update environment .......................................................................................................................... 506 Access environment ............................................................................................................................ 507 CloudWatch Logs and CloudTrail ......................................................................................................... 508 Logs ....................................................................................................................................................... 508 Document History ........................................................................................................................ 513 xviii Amazon Managed Workflows for Apache Airflow User Guide What Is Amazon Managed Workflows for Apache Airflow? Use Amazon Managed Workflows for Apache Airflow, a managed orchestration service for Apache Airflow, to setup and operate data pipelines in the cloud at scale. Apache Airflow is an open-source tool used to programmatically author, schedule, and monitor sequences of processes and tasks referred to as workflows. With Amazon MWAA, you can use Apache Airflow and Python to create workflows without having to manage the underlying infrastructure for scalability, availability, and security. Amazon MWAA automatically scales its workflow execution capacity to meet your needs, and integrates with AWS security services to help provide you with fast and secure access to your data. Content • Features • Architecture • Integration • Supported versions • What's next? Features Review the following features to learn how Amazon MWAA can simplify the management of your Apache Airflow workflows. • Automatic Airflow setup – Quickly setup Apache Airflow by choosing an Apache Airflow version when you create an Amazon MWAA environment. Amazon MWAA sets up Apache Airflow for you using the same Apache Airflow user interface and open-source code that you can download on the Internet. • Automatic scaling – Automatically scale Apache Airflow Workers by setting the minimum and maximum number of Workers that run in your environment. Amazon MWAA monitors the Workers in your environment and uses its autoscaling component to add Workers to
meet demand, until it reaches the maximum number of Workers that you defined.
• Built-in authentication – Enable role-based authentication and authorization for your Apache Airflow Web server by defining the access control policies in AWS Identity and Access Management (IAM). The Apache Airflow Workers assume these policies for secure access to AWS services.
• Built-in security – The Apache Airflow Workers and Schedulers run in Amazon MWAA's Amazon VPC. Data is also automatically encrypted using AWS Key Management Service, so your environment is secure by default.
• Public or private access modes – Access your Apache Airflow Web server using a private or public access mode. The Public network access mode uses a VPC endpoint for your Apache Airflow Web server that is accessible over the Internet. The Private network access mode uses a VPC endpoint for your Apache Airflow Web server that is accessible in your VPC. In both cases, access for your Apache Airflow users is controlled by the access control policy you define in AWS Identity and Access Management (IAM) and AWS SSO.
• Streamlined upgrades and patches – Amazon MWAA provides new versions of Apache Airflow periodically. The Amazon MWAA team will update and patch the images for these versions.
• Workflow monitoring – View Apache Airflow logs and Apache Airflow metrics in Amazon CloudWatch to identify Apache Airflow task delays or workflow errors without the need for additional third-party tools. Amazon MWAA automatically sends environment metrics and, if enabled, Apache Airflow logs to CloudWatch.
• AWS integration – Amazon MWAA supports open-source integrations with Amazon Athena, AWS Batch, Amazon CloudWatch, Amazon DynamoDB, AWS DataSync, Amazon EMR, AWS Fargate, Amazon EKS, Amazon Data Firehose, AWS Glue, AWS Lambda, Amazon Redshift, Amazon SQS, Amazon SNS, Amazon SageMaker AI, and Amazon S3, as well as hundreds of built-in and community-created operators and sensors.
• Worker fleets – Amazon MWAA offers support for using containers to scale the worker fleet on demand and reduce scheduler outages using Amazon ECS on AWS Fargate. Operators that invoke tasks on Amazon ECS containers, and Kubernetes operators that create and run pods on a Kubernetes cluster, are supported.
Architecture
All of the components contained in the outer box of the architecture diagram appear as a single Amazon MWAA environment in your account. The Apache Airflow Scheduler and Workers are AWS Fargate containers that connect to the private subnets in the Amazon VPC for your environment. Each environment has its own Apache Airflow metadatabase managed by AWS that is accessible to the Scheduler and Workers Fargate containers via a privately-secured VPC endpoint.
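If you want to see how these pieces come together for a specific environment, you can inspect it from the AWS CLI. The following is a minimal sketch, assuming a configured AWS CLI and an existing environment; the name MyAirflowEnvironment is a placeholder for your environment's name.

# Show the environment's status, Apache Airflow version, web server URL, and
# the VPC subnets and security groups used by its Scheduler and Workers.
aws mwaa get-environment --name MyAirflowEnvironment \
  --query 'Environment.{Status: Status, AirflowVersion: AirflowVersion, WebserverUrl: WebserverUrl, NetworkConfiguration: NetworkConfiguration}'

The NetworkConfiguration in the output lists the private subnets and security group described above.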
Amazon CloudWatch, Amazon S3, Amazon SQS, and AWS KMS are separate from Amazon MWAA and need to be accessible from the Apache Airflow Scheduler(s) and Workers in the Fargate containers.
The Apache Airflow Web server can be accessed either over the Internet by selecting the Public network Apache Airflow access mode, or within your VPC by selecting the Private network Apache Airflow access mode. In both cases, access for your Apache Airflow users is controlled by the access control policy you define in AWS Identity and Access Management (IAM).
Note
Multiple Apache Airflow Schedulers are only available with Apache Airflow v2 and above. Learn more about the Apache Airflow task lifecycle at Concepts in the Apache Airflow reference guide.
Integration
The active and growing Apache Airflow open-source community provides operators (plugins that simplify connections to services) for Apache Airflow to integrate with AWS services. This includes services such as Amazon S3, Amazon Redshift, Amazon EMR, AWS Batch, and Amazon SageMaker AI, as well as services on other cloud platforms. Using Apache Airflow with Amazon MWAA fully supports integration with AWS services and popular third-party tools such as Apache Hadoop, Presto, Hive, and Spark to perform data processing tasks. Amazon MWAA is committed to maintaining compatibility with the Apache Airflow API, and intends to provide reliable integrations to AWS services, make them available to the community, and be involved in community feature development. For sample code, see Code examples for Amazon Managed Workflows for Apache Airflow.
Supported versions
Amazon MWAA supports multiple versions of Apache Airflow. For more information about the Apache Airflow versions we support and the Apache Airflow components included with each version, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow.
What's next?
• Get started with a single AWS CloudFormation template that creates an Amazon S3 bucket for your Airflow DAGs and supporting files, an Amazon VPC with public routing, and an Amazon MWAA environment in Quick start tutorial for Amazon Managed Workflows for Apache Airflow.
• Get started incrementally by creating an Amazon S3 bucket for your Airflow DAGs and supporting files, choosing from one of three Amazon VPC networking options, and creating an Amazon MWAA environment in Get started with Amazon Managed Workflows for Apache Airflow.
Quick start tutorial for Amazon Managed Workflows for Apache Airflow
This quick start tutorial uses an AWS CloudFormation template that creates the Amazon VPC infrastructure, an Amazon S3 bucket with a dags folder, and an Amazon Managed Workflows for Apache Airflow environment at the same time.
Topics
• In this tutorial
• Prerequisites
• Step one: Save the AWS CloudFormation template locally
• Step two: Create the stack using the AWS CLI
• Step three: Upload a DAG to Amazon S3 and run in the Apache Airflow UI
• Step four: View logs in CloudWatch Logs
• What's next?
In this tutorial
This tutorial walks you through three AWS Command Line Interface (AWS CLI) commands to upload a DAG to Amazon S3, run the DAG in Apache Airflow, and view logs in CloudWatch. It concludes by walking you through the steps to create an IAM policy for an Apache Airflow development team.
Note
The AWS CloudFormation template on this page creates an Amazon Managed Workflows for Apache Airflow environment for the latest version of Apache Airflow available in AWS CloudFormation. The latest version available is Apache Airflow v2.10.3.
The AWS CloudFormation template on this page creates the following:
• VPC infrastructure. The template uses Public routing over the Internet. It uses the Public network access mode for the Apache Airflow Web server in WebserverAccessMode: PUBLIC_ONLY.
• Amazon S3 bucket. The template creates an Amazon S3 bucket with a dags folder. It's configured to Block all public access, with Bucket Versioning enabled, as defined in Create an Amazon S3 bucket for Amazon MWAA.
• Amazon MWAA environment.
The template creates an Amazon MWAA environment that's associated with the dags folder on the Amazon S3 bucket, an execution role with permission to AWS services used by Amazon MWAA, and the default for encryption using an AWS owned key, as defined in Create an Amazon MWAA environment.
• CloudWatch Logs. The template enables Apache Airflow logs in CloudWatch at the "INFO" level and up for the Airflow scheduler log group, Airflow web server log group, Airflow worker log group, Airflow DAG processing log group, and the Airflow task log group, as defined in Viewing Airflow logs in Amazon CloudWatch.
In this tutorial, you'll complete the following tasks:
• Upload and run a DAG. Upload Apache Airflow's tutorial DAG for the latest Amazon MWAA supported Apache Airflow version to Amazon S3, and then run it in the Apache Airflow UI, as defined in Adding or updating DAGs.
• View logs. View the Airflow web server log group in CloudWatch Logs, as defined in Viewing Airflow logs in Amazon CloudWatch.
• Create an access control policy. Create an access control policy in IAM for your Apache Airflow development team, as defined in Accessing an Amazon MWAA environment.
Prerequisites
The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following:
• AWS CLI – Install version 2.
• AWS CLI – Quick configuration with aws configure.
Step one: Save the AWS CloudFormation template locally
• Copy the contents of the following template and save it locally as mwaa-public-network.yml. You can also download the template.
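Optionally, once the file is saved you can ask CloudFormation to check it before you create the stack. This is a quick syntax check only, and assumes the template is saved as mwaa-public-network.yml in your current directory.

# Confirm that the saved template is syntactically valid CloudFormation.
aws cloudformation validate-template --template-body file://mwaa-public-network.yml

If the command returns the template's parameters without an error, the file saved correctly.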
control policy in IAM for your Apache Airflow development team, as defined in Accessing an Amazon MWAA environment. Prerequisites The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. Step one: Save the AWS CloudFormation template locally • Copy the contents of the following template and save locally as mwaa-public- network.yml. You can also download the template. Prerequisites 6 Amazon Managed Workflows for Apache Airflow User Guide AWSTemplateFormatVersion: "2010-09-09" Parameters: EnvironmentName: Description: An environment name that is prefixed to resource names Type: String Default: MWAAEnvironment VpcCIDR: Description: The IP range (CIDR notation) for this VPC Type: String Default: 10.192.0.0/16 PublicSubnet1CIDR: Description: The IP range (CIDR notation) for the public subnet in the first Availability Zone Type: String Default: 10.192.10.0/24 PublicSubnet2CIDR: Description: The IP range (CIDR notation) for the public subnet in the second Availability Zone Type: String Default: 10.192.11.0/24 PrivateSubnet1CIDR: Description: The IP range (CIDR notation) for the private subnet in the first Availability Zone Type: String Default: 10.192.20.0/24 PrivateSubnet2CIDR: Description: The IP range (CIDR notation) for the private subnet in the second Availability Zone Type: String Default: 10.192.21.0/24 MaxWorkerNodes: Description: The maximum number of workers that can run in the environment Type: Number Default: 2 DagProcessingLogs: Description: Log level for DagProcessing Type: String Step one: Save the AWS CloudFormation template locally 7 Amazon Managed Workflows for Apache Airflow User Guide Default: INFO SchedulerLogsLevel: Description: Log level for SchedulerLogs Type: String Default: INFO TaskLogsLevel: Description: Log level for TaskLogs Type: String Default: INFO WorkerLogsLevel: Description: Log level for WorkerLogs Type: String Default: INFO WebserverLogsLevel: Description: Log level for WebserverLogs Type: String Default: INFO Resources: ##################################################################################################################### # CREATE VPC ##################################################################################################################### VPC: Type: AWS::EC2::VPC Properties: CidrBlock: !Ref VpcCIDR EnableDnsSupport: true EnableDnsHostnames: true Tags: - Key: Name Value: MWAAEnvironment InternetGateway: Type: AWS::EC2::InternetGateway Properties: Tags: - Key: Name Value: MWAAEnvironment InternetGatewayAttachment: Type: AWS::EC2::VPCGatewayAttachment Step one: Save the AWS CloudFormation template locally 8 Amazon Managed Workflows for Apache Airflow User Guide Properties: InternetGatewayId: !Ref InternetGateway VpcId: !Ref VPC PublicSubnet1: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 0, !GetAZs '' ] CidrBlock: !Ref PublicSubnet1CIDR MapPublicIpOnLaunch: true Tags: - Key: Name Value: !Sub ${EnvironmentName} Public Subnet (AZ1) PublicSubnet2: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 1, !GetAZs '' ] CidrBlock: !Ref PublicSubnet2CIDR MapPublicIpOnLaunch: true Tags: - Key: Name Value: !Sub ${EnvironmentName} Public Subnet (AZ2) PrivateSubnet1: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 0, !GetAZs 
'' ] CidrBlock: !Ref PrivateSubnet1CIDR MapPublicIpOnLaunch: false Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Subnet (AZ1) PrivateSubnet2: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 1, !GetAZs '' ] CidrBlock: !Ref PrivateSubnet2CIDR MapPublicIpOnLaunch: false Step one: Save the AWS CloudFormation template locally 9 Amazon Managed Workflows for Apache Airflow User Guide Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Subnet (AZ2) NatGateway1EIP: Type: AWS::EC2::EIP DependsOn: InternetGatewayAttachment Properties: Domain: vpc NatGateway2EIP: Type: AWS::EC2::EIP DependsOn: InternetGatewayAttachment Properties: Domain: vpc NatGateway1: Type: AWS::EC2::NatGateway Properties: AllocationId: !GetAtt NatGateway1EIP.AllocationId SubnetId: !Ref PublicSubnet1 NatGateway2: Type: AWS::EC2::NatGateway Properties: AllocationId: !GetAtt NatGateway2EIP.AllocationId SubnetId: !Ref PublicSubnet2 PublicRouteTable: Type: AWS::EC2::RouteTable Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub ${EnvironmentName} Public Routes DefaultPublicRoute: Type: AWS::EC2::Route DependsOn: InternetGatewayAttachment Properties: RouteTableId: !Ref PublicRouteTable DestinationCidrBlock: 0.0.0.0/0 GatewayId: !Ref InternetGateway Step one: Save the AWS CloudFormation template locally 10 Amazon Managed Workflows for Apache Airflow User Guide PublicSubnet1RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref PublicRouteTable SubnetId: !Ref PublicSubnet1 PublicSubnet2RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref PublicRouteTable SubnetId: !Ref PublicSubnet2 PrivateRouteTable1: Type: AWS::EC2::RouteTable Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Routes (AZ1) DefaultPrivateRoute1: Type: AWS::EC2::Route Properties: RouteTableId: !Ref PrivateRouteTable1 DestinationCidrBlock: 0.0.0.0/0 NatGatewayId: !Ref NatGateway1 PrivateSubnet1RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref PrivateRouteTable1 SubnetId: !Ref PrivateSubnet1 PrivateRouteTable2: Type: AWS::EC2::RouteTable Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Routes (AZ2) DefaultPrivateRoute2: Type: AWS::EC2::Route Step one: Save the AWS CloudFormation template locally 11 Amazon Managed Workflows for Apache Airflow User Guide Properties: RouteTableId: !Ref PrivateRouteTable2 DestinationCidrBlock: 0.0.0.0/0 NatGatewayId: !Ref NatGateway2 PrivateSubnet2RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref PrivateRouteTable2 SubnetId: !Ref PrivateSubnet2 SecurityGroup: Type: AWS::EC2::SecurityGroup Properties: GroupName: "mwaa-security-group" GroupDescription: "Security group with a self-referencing inbound rule." 
VpcId: !Ref VPC SecurityGroupIngress: Type: AWS::EC2::SecurityGroupIngress Properties: GroupId: !Ref SecurityGroup IpProtocol: "-1" SourceSecurityGroupId: !Ref SecurityGroup EnvironmentBucket: Type: AWS::S3::Bucket Properties: VersioningConfiguration: Status: Enabled PublicAccessBlockConfiguration: BlockPublicAcls: true BlockPublicPolicy: true IgnorePublicAcls: true RestrictPublicBuckets: true ##################################################################################################################### # CREATE MWAA ##################################################################################################################### MwaaEnvironment: Type: AWS::MWAA::Environment Step one: Save the AWS CloudFormation template locally 12 Amazon Managed Workflows for Apache Airflow User Guide DependsOn: MwaaExecutionPolicy Properties: Name: !Sub "${AWS::StackName}-MwaaEnvironment" SourceBucketArn: !GetAtt EnvironmentBucket.Arn ExecutionRoleArn: !GetAtt MwaaExecutionRole.Arn DagS3Path: dags/ NetworkConfiguration: SecurityGroupIds: - !GetAtt SecurityGroup.GroupId SubnetIds: - !Ref PrivateSubnet1 - !Ref PrivateSubnet2 WebserverAccessMode: PUBLIC_ONLY MaxWorkers: !Ref MaxWorkerNodes LoggingConfiguration: DagProcessingLogs:
Properties: GroupName: "mwaa-security-group" GroupDescription: "Security group with a self-referencing inbound rule." VpcId: !Ref VPC SecurityGroupIngress: Type: AWS::EC2::SecurityGroupIngress Properties: GroupId: !Ref SecurityGroup IpProtocol: "-1" SourceSecurityGroupId: !Ref SecurityGroup EnvironmentBucket: Type: AWS::S3::Bucket Properties: VersioningConfiguration: Status: Enabled PublicAccessBlockConfiguration: BlockPublicAcls: true BlockPublicPolicy: true IgnorePublicAcls: true RestrictPublicBuckets: true ##################################################################################################################### # CREATE MWAA ##################################################################################################################### MwaaEnvironment: Type: AWS::MWAA::Environment Step one: Save the AWS CloudFormation template locally 12 Amazon Managed Workflows for Apache Airflow User Guide DependsOn: MwaaExecutionPolicy Properties: Name: !Sub "${AWS::StackName}-MwaaEnvironment" SourceBucketArn: !GetAtt EnvironmentBucket.Arn ExecutionRoleArn: !GetAtt MwaaExecutionRole.Arn DagS3Path: dags/ NetworkConfiguration: SecurityGroupIds: - !GetAtt SecurityGroup.GroupId SubnetIds: - !Ref PrivateSubnet1 - !Ref PrivateSubnet2 WebserverAccessMode: PUBLIC_ONLY MaxWorkers: !Ref MaxWorkerNodes LoggingConfiguration: DagProcessingLogs: LogLevel: !Ref DagProcessingLogs Enabled: true SchedulerLogs: LogLevel: !Ref SchedulerLogsLevel Enabled: true TaskLogs: LogLevel: !Ref TaskLogsLevel Enabled: true WorkerLogs: LogLevel: !Ref WorkerLogsLevel Enabled: true WebserverLogs: LogLevel: !Ref WebserverLogsLevel Enabled: true MwaaExecutionRole: Type: AWS::IAM::Role Properties: AssumeRolePolicyDocument: Version: 2012-10-17 Statement: - Effect: Allow Principal: Service: - airflow-env.amazonaws.com - airflow.amazonaws.com Action: - "sts:AssumeRole" Step one: Save the AWS CloudFormation template locally 13 Amazon Managed Workflows for Apache Airflow User Guide Path: "/service-role/" MwaaExecutionPolicy: DependsOn: EnvironmentBucket Type: AWS::IAM::ManagedPolicy Properties: Roles: - !Ref MwaaExecutionRole PolicyDocument: Version: 2012-10-17 Statement: - Effect: Allow Action: airflow:PublishMetrics Resource: - !Sub "arn:aws:airflow:${AWS::Region}:${AWS::AccountId}:environment/ ${EnvironmentName}" - Effect: Deny Action: s3:ListAllMyBuckets Resource: - !Sub "${EnvironmentBucket.Arn}" - !Sub "${EnvironmentBucket.Arn}/*" - Effect: Allow Action: - "s3:GetObject*" - "s3:GetBucket*" - "s3:List*" Resource: - !Sub "${EnvironmentBucket.Arn}" - !Sub "${EnvironmentBucket.Arn}/*" - Effect: Allow Action: - logs:DescribeLogGroups Resource: "*" - Effect: Allow Action: - logs:CreateLogStream - logs:CreateLogGroup - logs:PutLogEvents - logs:GetLogEvents - logs:GetLogRecord - logs:GetLogGroupFields - logs:GetQueryResults Step one: Save the AWS CloudFormation template locally 14 Amazon Managed Workflows for Apache Airflow User Guide - logs:DescribeLogGroups Resource: - !Sub "arn:aws:logs:${AWS::Region}:${AWS::AccountId}:log- group:airflow-${AWS::StackName}*" - Effect: Allow Action: cloudwatch:PutMetricData Resource: "*" - Effect: Allow Action: - sqs:ChangeMessageVisibility - sqs:DeleteMessage - sqs:GetQueueAttributes - sqs:GetQueueUrl - sqs:ReceiveMessage - sqs:SendMessage Resource: - !Sub "arn:aws:sqs:${AWS::Region}:*:airflow-celery-*" - Effect: Allow Action: - kms:Decrypt - kms:DescribeKey - "kms:GenerateDataKey*" - kms:Encrypt NotResource: !Sub "arn:aws:kms:*:${AWS::AccountId}:key/*" Condition: StringLike: "kms:ViaService": 
- !Sub "sqs.${AWS::Region}.amazonaws.com" Outputs: VPC: Description: A reference to the created VPC Value: !Ref VPC PublicSubnets: Description: A list of the public subnets Value: !Join [ ",", [ !Ref PublicSubnet1, !Ref PublicSubnet2 ]] PrivateSubnets: Description: A list of the private subnets Value: !Join [ ",", [ !Ref PrivateSubnet1, !Ref PrivateSubnet2 ]] PublicSubnet1: Description: A reference to the public subnet in the 1st Availability Zone Value: !Ref PublicSubnet1 Step one: Save the AWS CloudFormation template locally 15 Amazon Managed Workflows for Apache Airflow User Guide PublicSubnet2: Description: A reference to the public subnet in the 2nd Availability Zone Value: !Ref PublicSubnet2 PrivateSubnet1: Description: A reference to the private subnet in the 1st Availability Zone Value: !Ref PrivateSubnet1 PrivateSubnet2: Description: A reference to the private subnet in the 2nd Availability Zone Value: !Ref PrivateSubnet2 SecurityGroupIngress: Description: Security group with self-referencing inbound rule Value: !Ref SecurityGroupIngress MwaaApacheAirflowUI: Description: MWAA Environment Value: !Sub "https://${MwaaEnvironment.WebserverUrl}" Step two: Create the stack using the AWS CLI 1. In your command prompt, navigate to the directory where mwaa-public-network.yml is stored. For example: cd mwaaproject 2. Use the aws cloudformation create-stack command to create the stack using the AWS CLI. aws cloudformation create-stack --stack-name mwaa-environment-public-network -- template-body file://mwaa-public-network.yml --capabilities CAPABILITY_IAM Note It takes over 30 minutes to create the Amazon VPC infrastructure, Amazon S3 bucket, and Amazon MWAA environment. Step two: Create the stack using the AWS CLI 16 Amazon Managed Workflows for Apache Airflow User Guide Step three: Upload a DAG to Amazon S3 and run in the Apache Airflow UI 1. Copy the contents of the tutorial.py file for the latest supported Apache Airflow version and save locally as tutorial.py. 2. In your command prompt, navigate to the directory where tutorial.py is stored. For example: cd mwaaproject 3. Use the following command to list all of your Amazon S3 buckets. aws s3 ls 4. Use the following command to list the files and folders in the Amazon S3 bucket for your environment. aws s3 ls s3://YOUR_S3_BUCKET_NAME 5. Use the following script to upload the tutorial.py file to your dags folder. Substitute the sample value in YOUR_S3_BUCKET_NAME. aws s3 cp tutorial.py s3://YOUR_S3_BUCKET_NAME/dags/ 6. Open the Environments page on the Amazon MWAA console. 7. Choose an environment. 8. Choose Open Airflow UI. 9. On the Apache Airflow UI, from the list of available DAGs, choose the tutorial DAG. 10. On the DAG details page, choose the Pause/Unpause DAG toggle next to your DAG name to unpause the DAG. 11. Choose Trigger DAG. Step three: Upload a DAG to Amazon S3 and run in the Apache Airflow UI 17 Amazon Managed Workflows for Apache Airflow User Guide Step four: View logs in CloudWatch Logs You can view Apache Airflow logs in the CloudWatch console for all of the Apache Airflow logs that were enabled by the AWS CloudFormation stack. The following section shows how to view logs for the Airflow web server log group.
DAGs, choose the tutorial DAG. 10. On the DAG details page, choose the Pause/Unpause DAG toggle next to your DAG name to unpause the DAG. 11. Choose Trigger DAG. Step three: Upload a DAG to Amazon S3 and run in the Apache Airflow UI 17 Amazon Managed Workflows for Apache Airflow User Guide Step four: View logs in CloudWatch Logs You can view Apache Airflow logs in the CloudWatch console for all of the Apache Airflow logs that were enabled by the AWS CloudFormation stack. The following section shows how to view logs for the Airflow web server log group. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose the Airflow web server log group on the Monitoring pane. 4. Choose the webserver_console_ip log in Log streams. What's next? • Learn more about how to upload DAGs, specify Python dependencies in a requirements.txt and custom plugins in a plugins.zip in Working with DAGs on Amazon MWAA. • Learn more about the best practices we recommend to tune the performance of your environment in Performance tuning for Apache Airflow on Amazon MWAA. • Create a monitoring dashboard for your environment in Monitoring dashboards and alarms on Amazon MWAA. • Run some of the DAG code samples in Code examples for Amazon Managed Workflows for Apache Airflow. Step four: View logs in CloudWatch Logs 18 Amazon Managed Workflows for Apache Airflow User Guide Get started with Amazon Managed Workflows for Apache Airflow Amazon Managed Workflows for Apache Airflow uses the Amazon VPC, DAG code and supporting files in your Amazon S3 storage bucket to create an environment. This chapter describes the prerequisites and AWS resources needed to get started with Amazon MWAA. Topics • Prerequisites • About this guide • Before you begin • Available regions • Create an Amazon S3 bucket for Amazon MWAA • Create the VPC network • Create an Amazon MWAA environment • What's next? Prerequisites To create an Amazon MWAA environment, you may want to take additional steps to ensure you have permission to the AWS resources you need to create. • AWS account – An AWS account with permission to use Amazon MWAA and the AWS services and resources used by your environment. About this guide This section describes the AWS infrastructure and resources you'll create in this guide. • Amazon VPC – The Amazon VPC networking components required by an Amazon MWAA environment. You can configure an existing VPC that meets these requirements (advanced) as seen in About networking on Amazon MWAA, or create the VPC and networking components, as defined in the section called “Create the VPC network”. Prerequisites 19 Amazon Managed Workflows for Apache Airflow User Guide • Amazon S3 bucket – An Amazon S3 bucket to store your DAGs and associated files, such as plugins.zip and requirements.txt. Your Amazon S3 bucket must be configured to Block all public access, with Bucket Versioning enabled, as defined in Create an Amazon S3 bucket for Amazon MWAA. • Amazon MWAA environment – An Amazon MWAA environment configured with the location of your Amazon S3 bucket, the path to your DAG code and any custom plugins or Python dependencies, and your Amazon VPC and its security group, as defined in Create an Amazon MWAA environment. Before you begin To create an Amazon MWAA environment, you may want to take additional steps to create and configure other AWS resources before you create your environment. To create an environment, you need the following: • AWS KMS key – An AWS KMS key for data encryption on your environment. 
You can choose the default option on the Amazon MWAA console to create an AWS owned key when you create an environment, or specify an existing Customer managed key with permissions to other AWS services used by your environment configured (advanced). To learn more, see Using customer managed keys for encryption. • Execution role – An execution role that allows Amazon MWAA to access AWS resources in your environment. You can choose the default option on the Amazon MWAA console to create an execution role when you create an environment. To learn more, see Amazon MWAA execution role. • VPC security group –
A VPC security group that allows Amazon MWAA to access other AWS resources in your VPC network. You can choose the default option on the Amazon MWAA console to create a security group when you create an environment, or provide a security group with the appropriate inbound and outbound rules (advanced). To learn more, see Security in your VPC on Amazon MWAA. Available regions Amazon MWAA is available in the following AWS Regions. • Europe (Stockholm) - eu-north-1 Before you begin 20 Amazon Managed Workflows for Apache Airflow User Guide • Europe (Frankfurt) - eu-central-1 • Europe (Ireland) - eu-west-1 • Europe (London) - eu-west-2 • Europe (Paris) - eu-west-3 • Asia Pacific (Mumbai) - ap-south-1 • Asia Pacific (Singapore) - ap-southeast-1 • Asia Pacific (Sydney) - ap-southeast-2 • Asia Pacific (Tokyo) - ap-northeast-1 • Asia Pacific (Seoul) - ap-northeast-2 • US East (N. Virginia) - us-east-1 • US East (Ohio) - us-east-2 • US West (Oregon) - us-west-2 • Canada (Central) - ca-central-1 • South America (São Paulo) - sa-east-1 Create an Amazon S3 bucket for Amazon MWAA This guide describes the steps to create an Amazon S3 bucket to store your Apache Airflow Directed Acyclic Graphs (DAGs), custom plugins in a plugins.zip file, and Python dependencies in a requirements.txt file. Contents • Before you begin • Create the bucket • What's next? Before you begin • The Amazon S3 bucket name can't be changed after you create the bucket. To learn more, see Rules for bucket naming in the Amazon Simple Storage Service User Guide. • An Amazon S3 bucket used for an Amazon MWAA environment must be configured to Block all public access, with Bucket Versioning enabled. Create a bucket 21 Amazon Managed Workflows for Apache Airflow User Guide • An Amazon S3 bucket used for an Amazon MWAA environment must be located in the same AWS Region as an Amazon MWAA environment. To view a list of AWS Regions for Amazon MWAA, see Amazon MWAA endpoints and quotas in the AWS General Reference. Create the bucket This section describes the steps to create the Amazon S3 bucket for your environment. To create a bucket 1. Sign in to the AWS Management Console and open the Amazon S3 console at https:// console.aws.amazon.com/s3/. 2. Choose Create bucket. 3. In Bucket name, enter a DNS-compliant name for your bucket. The bucket name must: • Be unique across all of Amazon S3. • Be between 3 and 63 characters long. • Not contain uppercase characters. • Start with a lowercase letter or number. Important Avoid including sensitive information, such as account numbers, in the bucket name. The bucket name is visible in the URLs that point to the objects in the bucket. 4. Choose an AWS Region in Region. This must be the same AWS Region as your Amazon MWAA environment. • We recommend choosing a region close to you to minimize latency and costs and address regulatory requirements. 5. Choose Block all public access. 6. Choose Enable in Bucket Versioning. 7. Optional - Tags. Add key-value tag pairs to identify your Amazon S3 bucket in Tags. For example, Bucket : Staging. Create the bucket 22 Amazon Managed Workflows for Apache Airflow User Guide 8. Optional - Server-side encryption. You can optionally Enable one of the following encryption options on your Amazon S3 bucket. a. Choose Amazon S3 key (SSE-S3) in Server-side encryption to enable server-side encryption for the bucket. b. Choose AWS Key Management Service key (SSE-KMS) to use an AWS KMS key for encryption on your Amazon S3 bucket: i. ii. 
AWS managed key (aws/s3) - If you choose this option, you can either use an AWS owned key managed by Amazon MWAA, or specify a Customer managed key for encryption of your Amazon MWAA environment. Choose from your AWS KMS keys or Enter AWS KMS key ARN - If you choose to specify a Customer managed key in this step, you must specify an AWS KMS key ID or ARN. AWS KMS aliases and multi-region keys are not supported by Amazon MWAA. The AWS KMS key you specify must also be used for encryption on your Amazon MWAA environment. 9. Optional - Advanced settings. If you want
to enable Amazon S3 Object Lock: a. Choose Advanced settings, Enable. Important Enabling Object Lock will permanently allow objects in this bucket to be locked. To learn more, see Locking Objects Using Amazon S3 Object Lock in the Amazon Simple Storage Service User Guide. b. Choose the acknowledgement. 10. Choose Create bucket. What's next? • Learn how to create the required Amazon VPC network for an environment in Create the VPC network. • Learn how to how to manage access permissions in How do I set ACL bucket permissions? • Learn how to delete a storage bucket in How do I delete an S3 Bucket?. What's next? 23 Amazon Managed Workflows for Apache Airflow User Guide Create the VPC network Amazon Managed Workflows for Apache Airflow requires an Amazon VPC and specific networking components to support an environment. This guide describes the different options to create the Amazon VPC network for an Amazon Managed Workflows for Apache Airflow environment. Note Apache Airflow works best in a low-latency network environment. If you are using an existing Amazon VPC which routes traffic to another region or to an on-premise environment, we recommended adding AWS PrivateLink endpoints for Amazon SQS, CloudWatch, Amazon S3, and AWS KMS. For more information about configuring AWS PrivateLink for Amazon MWAA, see Creating an Amazon VPC network without internet access. Contents • Prerequisites • Before you begin • Options to create the Amazon VPC network • Option one: Creating the VPC network on the Amazon MWAA console • Option two: Creating an Amazon VPC network with Internet access • Option three: Creating an Amazon VPC network without Internet access • What's next? Prerequisites The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. Create the VPC network 24 Amazon Managed Workflows for Apache Airflow User Guide Before you begin • The VPC network you specify for your environment can't be changed after the environment is created. • You can use private or public routing for your Amazon VPC and Apache Airflow Web server. To view a list of options, see the section called “Example use cases for an Amazon VPC and Apache Airflow access mode”. Options to create the Amazon VPC network The following section describes the options available to create the Amazon VPC network for an environment. Note Amazon MWAA does not support the use of use1-az3 Availability Zone (AZ) in the US East (N. Virginia) Region. When creating the VPC for Amazon MWAA in the US East (N. Virginia) region, you must explicitly assign the AvailabilityZone in the AWS CloudFormation (CFN) template. The assigned availability zone name must not be mapped to use1-az3. You can retrieve the detailed mapping of AZ names to their corresponding AZ IDs by running the following command: aws ec2 describe-availability-zones --region us-east-1 Option one: Creating the VPC network on the Amazon MWAA console The following section shows how to create an Amazon VPC network on the Amazon MWAA console. This option uses Public routing over the Internet. It can be used for an Apache Airflow Web server with the Private network or Public network access modes. The following image shows where you can find the Create MWAA VPC button on the Amazon MWAA console. 
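After the VPC is created with this option, you can optionally confirm the networking before you create your environment by listing the subnets in the new VPC with the AWS CLI. This is a sketch, not part of the console procedure; the VPC ID below is a placeholder for the ID of the VPC created for you.

# List the subnets in the new VPC; private subnets report MapPublicIpOnLaunch as false.
aws ec2 describe-subnets \
    --filters "Name=vpc-id,Values=vpc-0123456789abcdef0" \
    --query "Subnets[].{SubnetId:SubnetId,AZ:AvailabilityZone,AutoAssignPublicIp:MapPublicIpOnLaunch}" \
    --output table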
Option two: Creating an Amazon VPC network with Internet access The following AWS CloudFormation template creates an Amazon VPC network with Internet access in your default AWS Region. This option uses Public routing over the Internet. This template can be used for an Apache Airflow Web server with the Private network or Public network access modes. 1. Copy the contents of the following template and save locally as cfn-vpc-public-private.yaml. You can also download the template. Description: This template deploys a VPC, with a pair of public and private subnets spread across two Availability Zones. It deploys an internet gateway, with a default route on the public subnets. It deploys a pair
of NAT gateways (one in each AZ), and default routes for them in the private subnets. Parameters: EnvironmentName: Description: An environment name that is prefixed to resource names Type: String Default: mwaa- VpcCIDR: Description: Please enter the IP range (CIDR notation) for this VPC Type: String Default: 10.192.0.0/16 PublicSubnet1CIDR: Description: Please enter the IP range (CIDR notation) for the public subnet in the first Availability Zone Options to create the Amazon VPC network 26 Amazon Managed Workflows for Apache Airflow User Guide Type: String Default: 10.192.10.0/24 PublicSubnet2CIDR: Description: Please enter the IP range (CIDR notation) for the public subnet in the second Availability Zone Type: String Default: 10.192.11.0/24 PrivateSubnet1CIDR: Description: Please enter the IP range (CIDR notation) for the private subnet in the first Availability Zone Type: String Default: 10.192.20.0/24 PrivateSubnet2CIDR: Description: Please enter the IP range (CIDR notation) for the private subnet in the second Availability Zone Type: String Default: 10.192.21.0/24 Resources: VPC: Type: AWS::EC2::VPC Properties: CidrBlock: !Ref VpcCIDR EnableDnsSupport: true EnableDnsHostnames: true Tags: - Key: Name Value: !Ref EnvironmentName InternetGateway: Type: AWS::EC2::InternetGateway Properties: Tags: - Key: Name Value: !Ref EnvironmentName InternetGatewayAttachment: Type: AWS::EC2::VPCGatewayAttachment Properties: InternetGatewayId: !Ref InternetGateway VpcId: !Ref VPC Options to create the Amazon VPC network 27 Amazon Managed Workflows for Apache Airflow User Guide PublicSubnet1: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 0, !GetAZs '' ] CidrBlock: !Ref PublicSubnet1CIDR MapPublicIpOnLaunch: true Tags: - Key: Name Value: !Sub ${EnvironmentName} Public Subnet (AZ1) PublicSubnet2: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 1, !GetAZs '' ] CidrBlock: !Ref PublicSubnet2CIDR MapPublicIpOnLaunch: true Tags: - Key: Name Value: !Sub ${EnvironmentName} Public Subnet (AZ2) PrivateSubnet1: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 0, !GetAZs '' ] CidrBlock: !Ref PrivateSubnet1CIDR MapPublicIpOnLaunch: false Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Subnet (AZ1) PrivateSubnet2: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 1, !GetAZs '' ] CidrBlock: !Ref PrivateSubnet2CIDR MapPublicIpOnLaunch: false Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Subnet (AZ2) Options to create the Amazon VPC network 28 Amazon Managed Workflows for Apache Airflow User Guide NatGateway1EIP: Type: AWS::EC2::EIP DependsOn: InternetGatewayAttachment Properties: Domain: vpc NatGateway2EIP: Type: AWS::EC2::EIP DependsOn: InternetGatewayAttachment Properties: Domain: vpc NatGateway1: Type: AWS::EC2::NatGateway Properties: AllocationId: !GetAtt NatGateway1EIP.AllocationId SubnetId: !Ref PublicSubnet1 NatGateway2: Type: AWS::EC2::NatGateway Properties: AllocationId: !GetAtt NatGateway2EIP.AllocationId SubnetId: !Ref PublicSubnet2 PublicRouteTable: Type: AWS::EC2::RouteTable Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub ${EnvironmentName} Public Routes DefaultPublicRoute: Type: AWS::EC2::Route DependsOn: InternetGatewayAttachment Properties: RouteTableId: !Ref PublicRouteTable DestinationCidrBlock: 0.0.0.0/0 GatewayId: !Ref InternetGateway PublicSubnet1RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: Options to create the 
Amazon VPC network 29 Amazon Managed Workflows for Apache Airflow User Guide RouteTableId: !Ref PublicRouteTable SubnetId: !Ref PublicSubnet1 PublicSubnet2RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref PublicRouteTable SubnetId: !Ref PublicSubnet2 PrivateRouteTable1: Type: AWS::EC2::RouteTable Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Routes (AZ1) DefaultPrivateRoute1: Type: AWS::EC2::Route Properties: RouteTableId: !Ref PrivateRouteTable1 DestinationCidrBlock: 0.0.0.0/0 NatGatewayId: !Ref NatGateway1 PrivateSubnet1RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref PrivateRouteTable1 SubnetId: !Ref PrivateSubnet1 PrivateRouteTable2: Type: AWS::EC2::RouteTable Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub ${EnvironmentName} Private Routes (AZ2) DefaultPrivateRoute2: Type: AWS::EC2::Route Properties: RouteTableId: !Ref PrivateRouteTable2 DestinationCidrBlock: 0.0.0.0/0 Options to create the Amazon VPC network 30 Amazon Managed Workflows for Apache Airflow User Guide NatGatewayId: !Ref NatGateway2 PrivateSubnet2RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref PrivateRouteTable2 SubnetId: !Ref PrivateSubnet2 SecurityGroup: Type: AWS::EC2::SecurityGroup Properties: GroupName: "mwaa-security-group" GroupDescription: "Security group with a self-referencing inbound rule." VpcId: !Ref VPC SecurityGroupIngress: Type: AWS::EC2::SecurityGroupIngress Properties: GroupId: !Ref SecurityGroup IpProtocol: "-1" SourceSecurityGroupId: !Ref SecurityGroup Outputs: VPC: Description: A reference to the created VPC Value: !Ref VPC PublicSubnets: Description: A list of the public subnets Value: !Join [ ",", [ !Ref PublicSubnet1, !Ref PublicSubnet2 ]] PrivateSubnets: Description: A list of the private subnets Value: !Join [ ",", [ !Ref PrivateSubnet1, !Ref PrivateSubnet2 ]] PublicSubnet1: Description: A reference to the public subnet in the 1st Availability Zone Value: !Ref PublicSubnet1 PublicSubnet2: Description: A reference to the public subnet in the 2nd Availability Zone Value: !Ref PublicSubnet2 PrivateSubnet1: Options to create the Amazon VPC network 31 Amazon Managed Workflows for Apache Airflow User Guide Description: A reference to the private subnet in the 1st Availability Zone Value: !Ref PrivateSubnet1 PrivateSubnet2: Description: A reference to the private subnet in the 2nd Availability Zone Value: !Ref PrivateSubnet2 SecurityGroupIngress: Description: Security group with self-referencing inbound rule Value: !Ref SecurityGroupIngress 2. In your command prompt, navigate to the directory where cfn-vpc-public-private.yaml is stored. For example: cd mwaaproject 3. Use the aws cloudformation create-stack command to create the stack using the AWS CLI. aws cloudformation create-stack --stack-name mwaa-environment --template-body file://cfn-vpc-public-private.yaml Note It takes about 30 minutes to create the Amazon VPC infrastructure. Option three: Creating an Amazon VPC network without Internet access The following AWS CloudFormation template creates an Amazon VPC network without Internet access in your default AWS region. This option uses Private routing without Internet access. This template can be used for an Apache Airflow
rule Value: !Ref SecurityGroupIngress 2. In your command prompt, navigate to the directory where cfn-vpc-public-private.yaml is stored. For example: cd mwaaproject 3. Use the aws cloudformation create-stack command to create the stack using the AWS CLI. aws cloudformation create-stack --stack-name mwaa-environment --template-body file://cfn-vpc-public-private.yaml Note It takes about 30 minutes to create the Amazon VPC infrastructure. Option three: Creating an Amazon VPC network without Internet access The following AWS CloudFormation template creates an Amazon VPC network without Internet access in your default AWS region. This option uses Private routing without Internet access. This template can be used for an Apache Airflow Web server with the Private network access mode only. It creates the required VPC endpoints for the AWS services used by an environment. 1. Copy the contents of the following template and save locally as cfn-vpc-private.yaml. You can also download the template. AWSTemplateFormatVersion: "2010-09-09" Options to create the Amazon VPC network 32 Amazon Managed Workflows for Apache Airflow User Guide Parameters: VpcCIDR: Description: The IP range (CIDR notation) for this VPC Type: String Default: 10.192.0.0/16 PrivateSubnet1CIDR: Description: The IP range (CIDR notation) for the private subnet in the first Availability Zone Type: String Default: 10.192.10.0/24 PrivateSubnet2CIDR: Description: The IP range (CIDR notation) for the private subnet in the second Availability Zone Type: String Default: 10.192.11.0/24 Resources: VPC: Type: AWS::EC2::VPC Properties: CidrBlock: !Ref VpcCIDR EnableDnsSupport: true EnableDnsHostnames: true Tags: - Key: Name Value: !Ref AWS::StackName RouteTable: Type: AWS::EC2::RouteTable Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub "${AWS::StackName}-route-table" PrivateSubnet1: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 0, !GetAZs '' ] CidrBlock: !Ref PrivateSubnet1CIDR MapPublicIpOnLaunch: false Options to create the Amazon VPC network 33 Amazon Managed Workflows for Apache Airflow User Guide Tags: - Key: Name Value: !Sub "${AWS::StackName} Private Subnet (AZ1)" PrivateSubnet2: Type: AWS::EC2::Subnet Properties: VpcId: !Ref VPC AvailabilityZone: !Select [ 1, !GetAZs '' ] CidrBlock: !Ref PrivateSubnet2CIDR MapPublicIpOnLaunch: false Tags: - Key: Name Value: !Sub "${AWS::StackName} Private Subnet (AZ2)" PrivateSubnet1RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref RouteTable SubnetId: !Ref PrivateSubnet1 PrivateSubnet2RouteTableAssociation: Type: AWS::EC2::SubnetRouteTableAssociation Properties: RouteTableId: !Ref RouteTable SubnetId: !Ref PrivateSubnet2 S3VpcEndoint: Type: AWS::EC2::VPCEndpoint Properties: ServiceName: !Sub "com.amazonaws.${AWS::Region}.s3" VpcEndpointType: Gateway VpcId: !Ref VPC RouteTableIds: - !Ref RouteTable SecurityGroup: Type: AWS::EC2::SecurityGroup Properties: VpcId: !Ref VPC GroupDescription: Security Group for Amazon MWAA Environments to access VPC endpoints GroupName: !Sub "${AWS::StackName}-mwaa-vpc-endpoints" Options to create the Amazon VPC network 34 Amazon Managed Workflows for Apache Airflow User Guide SecurityGroupIngress: Type: AWS::EC2::SecurityGroupIngress Properties: GroupId: !Ref SecurityGroup IpProtocol: "-1" SourceSecurityGroupId: !Ref SecurityGroup SqsVpcEndoint: Type: AWS::EC2::VPCEndpoint Properties: ServiceName: !Sub "com.amazonaws.${AWS::Region}.sqs" VpcEndpointType: Interface VpcId: !Ref VPC 
PrivateDnsEnabled: true SubnetIds: - !Ref PrivateSubnet1 - !Ref PrivateSubnet2 SecurityGroupIds: - !Ref SecurityGroup CloudWatchLogsVpcEndoint: Type: AWS::EC2::VPCEndpoint Properties: ServiceName: !Sub "com.amazonaws.${AWS::Region}.logs" VpcEndpointType: Interface VpcId: !Ref VPC PrivateDnsEnabled: true SubnetIds: - !Ref PrivateSubnet1 - !Ref PrivateSubnet2 SecurityGroupIds: - !Ref SecurityGroup CloudWatchMonitoringVpcEndoint: Type: AWS::EC2::VPCEndpoint Properties: ServiceName: !Sub "com.amazonaws.${AWS::Region}.monitoring" VpcEndpointType: Interface VpcId: !Ref VPC PrivateDnsEnabled: true SubnetIds: - !Ref PrivateSubnet1 - !Ref PrivateSubnet2 SecurityGroupIds: Options to create the Amazon VPC network 35 Amazon Managed Workflows for Apache Airflow User Guide - !Ref SecurityGroup KmsVpcEndoint: Type: AWS::EC2::VPCEndpoint Properties: ServiceName: !Sub "com.amazonaws.${AWS::Region}.kms" VpcEndpointType: Interface VpcId: !Ref VPC PrivateDnsEnabled: true SubnetIds: - !Ref PrivateSubnet1 - !Ref PrivateSubnet2 SecurityGroupIds: - !Ref SecurityGroup Outputs: VPC: Description: A reference to the created VPC Value: !Ref VPC MwaaSecurityGroupId: Description: Associates the Security Group to the environment to allow access to the VPC endpoints Value: !Ref SecurityGroup PrivateSubnets: Description: A list of the private subnets Value: !Join [ ",", [ !Ref PrivateSubnet1, !Ref PrivateSubnet2 ]] PrivateSubnet1: Description: A reference to the private subnet in the 1st Availability Zone Value: !Ref PrivateSubnet1 PrivateSubnet2: Description: A reference to the private subnet in the 2nd Availability Zone Value: !Ref PrivateSubnet2 2. In your command prompt, navigate to the directory where cfn-vpc-private.yml is stored. For example: cd mwaaproject Options to create the Amazon VPC network 36 Amazon Managed Workflows for Apache Airflow User Guide 3. Use the aws cloudformation create-stack command to create the stack using the AWS CLI. aws cloudformation create-stack --stack-name mwaa-private-environment --template- body file://cfn-vpc-private.yml Note It takes about 30 minutes to create the Amazon VPC infrastructure. 4. You'll need to create a mechanism to access these VPC endpoints from your computer. To learn more, see Managing access to service-specific Amazon VPC endpoints on Amazon MWAA. Note You can further restrict outbound access in the CIDR of your Amazon MWAA security group. For example, you can restrict to itself by adding a self-referencing outbound rule, the prefix list for Amazon S3, and the CIDR of your Amazon VPC. What's next? • Learn how to create an Amazon MWAA environment in Create an Amazon MWAA environment. • Learn how to create a VPN tunnel from your computer to your Amazon VPC with private routing in Tutorial: Configuring private network access using an AWS Client VPN. Create an Amazon MWAA environment Amazon Managed Workflows for Apache Airflow
MWAA. Note You can further restrict outbound access in the CIDR of your Amazon MWAA security group. For example, you can restrict to itself by adding a self-referencing outbound rule, the prefix list for Amazon S3, and the CIDR of your Amazon VPC. What's next? • Learn how to create an Amazon MWAA environment in Create an Amazon MWAA environment. • Learn how to create a VPN tunnel from your computer to your Amazon VPC with private routing in Tutorial: Configuring private network access using an AWS Client VPN. Create an Amazon MWAA environment Amazon Managed Workflows for Apache Airflow sets up Apache Airflow on an environment in your chosen version using the same open-source Apache Airflow and user interface available from Apache. This guide describes the steps to create an Amazon MWAA environment. Contents • Before you begin • Apache Airflow versions • Create an environment What's next? 37 Amazon Managed Workflows for Apache Airflow User Guide • Step one: Specify details • Step two: Configure advanced settings • Step three: Review and create Before you begin • The VPC network you specify for your environment cannot be modified after the environment is created. • You need an Amazon S3 bucket configured to Block all public access, with Bucket Versioning enabled. • You need an AWS account with permissions to use Amazon MWAA, and permission in AWS Identity and Access Management (IAM) to create IAM roles. If you choose the Private network access mode for the Apache Airflow web server, which limits Apache Airflow access within your Amazon VPC, you'll need permission in IAM to create Amazon VPC endpoints. Apache Airflow versions The following Apache Airflow versions are supported on Amazon Managed Workflows for Apache Airflow. Note • Beginning with Apache Airflow v2.2.2, Amazon MWAA supports installing Python requirements, provider packages, and custom plugins directly on the Apache Airflow web server. • Beginning with Apache Airflow v2.7.2, your requirements file must include a -- constraint statement. If you do not provide a constraint, Amazon MWAA will specify one for you to ensure the packages listed in your requirements are compatible with the version of Apache Airflow you are using. For more information on setting up constraints in your requirements file, see Installing Python dependencies. Before you begin 38 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow version Apache Airflow guide Apache Airflow constraints Python version v2.10.3 v2.10.1 v2.9.2 v2.8.1 v2.7.2 v2.6.3 v2.5.1 v2.4.3 Apache Airflow v2.10.3 reference Apache Airflow v2.10.3 constraints Python 3.11 guide file Apache Airflow v2.10.1 reference Apache Airflow v2.10.1 constraints Python 3.11 guide file Apache Airflow v2.9.2 reference guide Apache Airflow v2.9.2 constraints file Python 3.11 Apache Airflow v2.8.1 reference guide Apache Airflow v2.8.1 constraints file Python 3.11 Apache Airflow v2.7.2 reference guide Apache Airflow v2.7.2 constraints file Python 3.11 Apache Airflow v2.6.3 reference guide Apache Airflow v2.6.3 constraints file Python 3.10 Apache Airflow v2.5.1 reference guide Apache Airflow v2.5.1 constraints file Python 3.10 Apache Airflow v2.4.3 reference guide Apache Airflow v2.4.3 constraints file Python 3.10 For more information about migrating your self-managed Apache Airflow deployments, or migrating an existing Amazon MWAA environment, including instructions for backing up your metadata database, see the Amazon MWAA Migration Guide. 
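If you are unsure how to add the --constraint statement described in the note above, the following sketch shows one way to write a requirements.txt that references the constraints file for Apache Airflow v2.10.3 with Python 3.11, and then upload it to your environment's Amazon S3 bucket. The bucket name and the provider package are placeholders; substitute your own values.

# Write a requirements.txt that references the matching constraints file.
cat > requirements.txt <<'EOF'
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.11.txt"
apache-airflow-providers-snowflake
EOF

# Upload the file to the Amazon S3 bucket configured for your environment.
aws s3 cp requirements.txt s3://amzn-s3-demo-bucket/requirements.txt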
Create an environment The following section describes the steps to create an Amazon MWAA environment. Step one: Specify details To specify details for the environment 1. Open the Amazon MWAA console. 2. Use the AWS Region selector to select your region. 3. Choose Create environment. 4. On the Specify details page, under Environment details: a. Type a unique name for your environment in Name. b. Choose the Apache Airflow version in Airflow version. Note If no value is specified, defaults to the latest Apache Airflow version. The latest version available is Apache Airflow v2.10.3. 5. Under DAG code in Amazon S3 specify the following: a. S3 Bucket. Choose Browse S3 and select your Amazon S3 bucket, or enter the Amazon S3 URI. b. DAGs folder. Choose Browse S3 and select the dags folder in your Amazon S3 bucket, or enter the Amazon S3 URI. c. Plugins file - optional. Choose Browse S3 and select the
plugins.zip file on your Amazon S3 bucket, or enter the Amazon S3 URI. d. Requirements file - optional. Choose Browse S3 and select the requirements.txt file on your Amazon S3 bucket, or enter the Amazon S3 URI. e. Startup script file - optional, Choose Browse S3 and select the script file on your Amazon S3 bucket, or enter the Amazon S3 URI. 6. Choose Next. Step two: Configure advanced settings To configure advanced settings 1. On the Configure advanced settings page, under Networking: Create an environment 40 Amazon Managed Workflows for Apache Airflow User Guide • Choose your Amazon VPC. This step populates two of the private subnets in your Amazon VPC. 2. Under Web server access, select your preferred Apache Airflow access mode: a. Private network. This limits access of the Apache Airflow UI to users within your Amazon VPC that have been granted access to the IAM policy for your environment. You need permission to create Amazon VPC endpoints for this step. Note Choose the Private network option if your Apache Airflow UI is only accessed within a corporate network, and you do not require access to public repositories for web server requirements installation. If you choose this access mode option, you need to create a mechanism to access your Apache Airflow Web server in your Amazon VPC. For more information, see Accessing the VPC endpoint for your Apache Airflow Web server (private network access). b. Public network. This allows the Apache Airflow UI to be accessed over the Internet by users granted access to the IAM policy for your environment. 3. Under Security group(s), choose the security group used to secure your Amazon VPC: a. By default, Amazon MWAA creates a security group in your Amazon VPC with specific inbound and outbound rules in Create new security group. b. Optional. Deselect the check box in Create new security group to select up to 5 security groups. Note An existing Amazon VPC security group must be configured with specific inbound and outbound rules to allow network traffic. To learn more, see Security in your VPC on Amazon MWAA. 4. Under Environment class, choose an environment class. We recommend choosing the smallest size necessary to support your workload. You can change the environment class at any time. Create an environment 41 Amazon Managed Workflows for Apache Airflow User Guide 5. For Maximum worker count, specify the maximum number of Apache Airflow workers to run in the environment. For more information, see Example high performance use case. 6. Specify the Maximum web server count and Minimum web server count to configure how Amazon MWAA scales the Apache Airflow web servers in your environment. For more information about web server automatic scaling, see the section called “Configuring web server auto scaling”. 7. Under Encryption, choose a data encryption option: a. By default, Amazon MWAA uses an AWS owned key to encrypt your data. b. Optional. Choose Customize encryption settings (advanced) to choose a different AWS KMS key. If you choose to specify a Customer managed key in this step, you must specify an AWS KMS key ID or ARN. AWS KMS aliases and multi-region keys are not supported by Amazon MWAA. If you specified an Amazon S3 key for server-side encryption on your Amazon S3 bucket, you must specify the same key for your Amazon MWAA environment. Note You must have permissions to the key to select it on the Amazon MWAA console. You must also grant permissions for Amazon MWAA to use the key by attaching the policy described in Attach key policy. 8. 
Recommended. Under Monitoring, choose one or more log categories for Airflow logging configuration to send Apache Airflow logs to CloudWatch Logs: a. Airflow task logs. Choose the type of Apache Airflow task logs to send to CloudWatch Logs in Log level. b. Airflow web server logs. Choose the type of Apache Airflow web server logs to send to CloudWatch Logs in Log level. c. Airflow scheduler logs. Choose the type of Apache Airflow scheduler logs to send to CloudWatch Logs in Log level. d. Airflow worker logs. Choose the type of Apache Airflow worker logs to send to CloudWatch Logs in Log level. e. Airflow DAG
processing logs. Choose the type of Apache Airflow DAG processing logs to send to CloudWatch Logs in Log level. Create an environment 42 Amazon Managed Workflows for Apache Airflow User Guide 9. Optional. For Airflow configuration options, choose Add custom configuration option. You can choose from the suggested dropdown list of Apache Airflow configuration options for your Apache Airflow version, or specify custom configuration options. For example, core.default_task_retries : 3. 10. Optional. Under Tags, choose Add new tag to associate tags to your environment. For example, Environment: Staging. 11. Under Permissions, choose an execution role: a. By default, Amazon MWAA creates an execution role in Create a new role. You must have permission to create IAM roles to use this option. b. Optional. Choose Enter role ARN to enter the Amazon Resource Name (ARN) of an existing execution role. 12. Choose Next. Step three: Review and create To review an environment summary • Review the environment summary, choose Create environment. Note It takes about twenty to thirty minutes to create an environment. What's next? • Learn how to create an Amazon S3 bucket in Create an Amazon S3 bucket for Amazon MWAA. What's next? 43 Amazon Managed Workflows for Apache Airflow User Guide Managing access to an Amazon MWAA environment Amazon Managed Workflows for Apache Airflow needs to be permitted to use other AWS services and resources used by an environment. You also need to be granted permission to access an Amazon MWAA environment and your Apache Airflow UI in AWS Identity and Access Management (IAM). This section describes the execution role used to grant access to the AWS resources for your environment and how to add permissions, and the AWS account permissions you need to access your Amazon MWAA environment and Apache Airflow UI. Topics • Accessing an Amazon MWAA environment • Service-linked role for Amazon MWAA • Amazon MWAA execution role • Cross-service confused deputy prevention • Apache Airflow access modes Accessing an Amazon MWAA environment To use Amazon Managed Workflows for Apache Airflow, you must use an account, and IAM entities with the necessary permissions. This topic describes the access policies you can attach to your Apache Airflow development team and Apache Airflow users for your Amazon Managed Workflows for Apache Airflow environment. We recommend using temporary credentials and configuring federated identities with groups and roles, to access your Amazon MWAA resources. As a best practice, avoid attaching policies directly to your IAM users, and instead define groups or roles to provide temporary access to AWS resources. An IAM role is an IAM identity that you can create in your account that has specific permissions. An IAM role is similar to an IAM user in that it is an AWS identity with permissions policies that determine what the identity can and cannot do in AWS. However, instead of being uniquely associated with one person, a role is intended to be assumable by anyone who needs it. Also, a role does not have standard long-term credentials such as a password or access keys associated with it. Instead, when you assume a role, it provides you with temporary security credentials for your role session. Accessing an Amazon MWAA environment 44 Amazon Managed Workflows for Apache Airflow User Guide To assign permissions to a federated identity, you create a role and define permissions for the role. 
When a federated identity authenticates, the identity is associated with the role and is granted the permissions that are defined by the role. For information about roles for federation, see Create a role for a third-party identity provider (federation) in the IAM User Guide. If you use IAM Identity Center, you configure a permission set. To control what your identities can access after they authenticate, IAM Identity Center correlates the permission set to a role in IAM. For information about permission sets, see Permission sets in the AWS IAM Identity Center User Guide. You can use an IAM role in your account to grant another AWS account permissions to access your account's resources. For an example, see Tutorial: Delegate access across AWS accounts using IAM roles in the IAM User Guide. Sections • How it works • Full console access policy: AmazonMWAAFullConsoleAccess • Full
API and console access policy: AmazonMWAAFullApiAccess • Read-only console access policy: AmazonMWAAReadOnlyAccess • Apache Airflow UI access policy: AmazonMWAAWebServerAccess • Apache Airflow Rest API access policy: AmazonMWAARestAPIAccess • Apache Airflow CLI policy: AmazonMWAAAirflowCliAccess • Creating a JSON policy • Example use case to attach policies to a developer group • What's next? How it works The resources and services used in an Amazon MWAA environment are not accessible to all AWS Identity and Access Management (IAM) entities. You must create a policy that grants Apache Airflow users permission to access these resources. For example, you need to grant access to your Apache Airflow development team. Amazon MWAA uses these policies to validate whether a user has the permissions needed to perform an action on the AWS console or via the APIs used by an environment. You can use the JSON policies in this topic to create a policy for your Apache Airflow users in IAM, and then attach the policy to a user, group, or role in IAM. How it works 45 Amazon Managed Workflows for Apache Airflow User Guide • AmazonMWAAFullConsoleAccess – Use this policy to grant permission to configure an environment on the Amazon MWAA console. • AmazonMWAAFullApiAccess – Use this policy to grant access to all Amazon MWAA APIs used to manage an environment. • AmazonMWAAReadOnlyAccess – Use this policy to grant access to to view the resources used by an environment on the Amazon MWAA console. • AmazonMWAAWebServerAccess – Use this policy to grant access to the Apache Airflow web server. • AmazonMWAAAirflowCliAccess – Use this policy to grant access to run Apache Airflow CLI commands. To provide access, add permissions to your users, groups, or roles: • Users and groups in AWS IAM Identity Center: Create a permission set. Follow the instructions in Create a permission set in the AWS IAM Identity Center User Guide. • Users managed in IAM through an identity provider: Create a role for identity federation. Follow the instructions in Create a role for a third-party identity provider (federation) in the IAM User Guide. • IAM users: • Create a role that your user can assume. Follow the instructions in Create a role for an IAM user in the IAM User Guide. • (Not recommended) Attach a policy directly to a user or add a user to a user group. Follow the instructions in Adding permissions to a user (console) in the IAM User Guide. Full console access policy: AmazonMWAAFullConsoleAccess A user may need access to the AmazonMWAAFullConsoleAccess permissions policy if they need to configure an environment on the Amazon MWAA console. Note Your full console access policy must include permissions to perform iam:PassRole. This allows the user to pass service-linked roles, and execution roles, to Amazon MWAA. Amazon Full console access 46 Amazon Managed Workflows for Apache Airflow User Guide MWAA assumes each role in order to call other AWS services on your behalf. The following example uses the iam:PassedToService condition key to specify the Amazon MWAA service principal (airflow.amazonaws.com) as the service to which a role can be passed. For more information about iam:PassRole, see Granting a user permissions to pass a role to an AWS service in the IAM User Guide. Use the following policy if you want to create, and manage, your Amazon MWAA environments using an AWS owned key for encryption at-rest. 
Using an AWS owned key { "Version":"2012-10-17", "Statement":[ { "Effect":"Allow", "Action":"airflow:*", "Resource":"*" }, { "Effect":"Allow", "Action":[ "iam:PassRole" ], "Resource":"*", "Condition":{ "StringLike":{ "iam:PassedToService":"airflow.amazonaws.com" } } }, { "Effect":"Allow", "Action":[ "iam:ListRoles" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ Full console access 47 Amazon Managed Workflows for Apache Airflow User Guide "iam:CreatePolicy" ], "Resource":"arn:aws:iam::YOUR_ACCOUNT_ID:policy/service-role/MWAA-Execution- Policy*" }, { "Effect":"Allow", "Action":[ "iam:AttachRolePolicy", "iam:CreateRole" ], "Resource":"arn:aws:iam::YOUR_ACCOUNT_ID:role/service-role/AmazonMWAA*" }, { "Effect":"Allow", "Action":[ "iam:CreateServiceLinkedRole" ], "Resource":"arn:aws:iam::*:role/aws-service-role/airflow.amazonaws.com/ AWSServiceRoleForAmazonMWAA" }, { "Effect":"Allow", "Action":[ "s3:GetBucketLocation", "s3:ListAllMyBuckets", "s3:ListBucket", "s3:ListBucketVersions" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "s3:CreateBucket", "s3:PutObject", "s3:GetEncryptionConfiguration" ], "Resource":"arn:aws:s3:::*" }, { "Effect":"Allow", "Action":[ "ec2:DescribeSecurityGroups", Full console access 48 Amazon Managed Workflows for Apache Airflow User Guide "ec2:DescribeSubnets", "ec2:DescribeVpcs", "ec2:DescribeRouteTables" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "ec2:AuthorizeSecurityGroupIngress", "ec2:CreateSecurityGroup" ], "Resource":"arn:aws:ec2:*:*:security-group/airflow-security-group-*" }, { "Effect":"Allow", "Action":[ "kms:ListAliases" ], "Resource":"*" }, { "Effect":"Allow", "Action":"ec2:CreateVpcEndpoint", "Resource":[ "arn:aws:ec2:*:*:vpc-endpoint/*", "arn:aws:ec2:*:*:vpc/*", "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:security-group/*" ] }, { "Effect":"Allow", "Action":[ "ec2:CreateNetworkInterface" ], "Resource":[ "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:network-interface/*" ] } ] } Full console access 49 Amazon Managed Workflows for Apache Airflow User Guide Use the following policy if you want to create, and manage, your Amazon MWAA environments using a customer managed key for encryption at-rest. To use a customer managed key, the IAM principal must have permission to access AWS KMS resources using the key stored in your account. Using a customer managed key { "Version":"2012-10-17", "Statement":[ { "Effect":"Allow", "Action":"airflow:*", "Resource":"*" }, { "Effect":"Allow", "Action":[ "iam:PassRole" ], "Resource":"*", "Condition":{ "StringLike":{ "iam:PassedToService":"airflow.amazonaws.com" } } }, { "Effect":"Allow", "Action":[ "iam:ListRoles" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "iam:CreatePolicy" ], "Resource":"arn:aws:iam::YOUR_ACCOUNT_ID:policy/service-role/MWAA-Execution- Policy*"
] } Full console access 49 Amazon Managed Workflows for Apache Airflow User Guide Use the following policy if you want to create, and manage, your Amazon MWAA environments using a customer managed key for encryption at-rest. To use a customer managed key, the IAM principal must have permission to access AWS KMS resources using the key stored in your account. Using a customer managed key { "Version":"2012-10-17", "Statement":[ { "Effect":"Allow", "Action":"airflow:*", "Resource":"*" }, { "Effect":"Allow", "Action":[ "iam:PassRole" ], "Resource":"*", "Condition":{ "StringLike":{ "iam:PassedToService":"airflow.amazonaws.com" } } }, { "Effect":"Allow", "Action":[ "iam:ListRoles" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "iam:CreatePolicy" ], "Resource":"arn:aws:iam::YOUR_ACCOUNT_ID:policy/service-role/MWAA-Execution- Policy*" }, { "Effect":"Allow", Full console access 50 Amazon Managed Workflows for Apache Airflow User Guide "Action":[ "iam:AttachRolePolicy", "iam:CreateRole" ], "Resource":"arn:aws:iam::YOUR_ACCOUNT_ID:role/service-role/AmazonMWAA*" }, { "Effect":"Allow", "Action":[ "iam:CreateServiceLinkedRole" ], "Resource":"arn:aws:iam::*:role/aws-service-role/airflow.amazonaws.com/ AWSServiceRoleForAmazonMWAA" }, { "Effect":"Allow", "Action":[ "s3:GetBucketLocation", "s3:ListAllMyBuckets", "s3:ListBucket", "s3:ListBucketVersions" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "s3:CreateBucket", "s3:PutObject", "s3:GetEncryptionConfiguration" ], "Resource":"arn:aws:s3:::*" }, { "Effect":"Allow", "Action":[ "ec2:DescribeSecurityGroups", "ec2:DescribeSubnets", "ec2:DescribeVpcs", "ec2:DescribeRouteTables" ], "Resource":"*" }, { Full console access 51 Amazon Managed Workflows for Apache Airflow "Effect":"Allow", "Action":[ "ec2:AuthorizeSecurityGroupIngress", "ec2:CreateSecurityGroup" ], "Resource":"arn:aws:ec2:*:*:security-group/airflow-security-group-*" User Guide }, { "Effect":"Allow", "Action":[ "kms:ListAliases" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "kms:DescribeKey", "kms:ListGrants", "kms:CreateGrant", "kms:RevokeGrant", "kms:Decrypt", "kms:Encrypt", "kms:GenerateDataKey*", "kms:ReEncrypt*" ], "Resource":"arn:aws:kms:*:YOUR_ACCOUNT_ID:key/YOUR_KMS_ID" }, { "Effect":"Allow", "Action":"ec2:CreateVpcEndpoint", "Resource":[ "arn:aws:ec2:*:*:vpc-endpoint/*", "arn:aws:ec2:*:*:vpc/*", "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:security-group/*" ] }, { "Effect":"Allow", "Action":[ "ec2:CreateNetworkInterface" ], "Resource":[ Full console access 52 Amazon Managed Workflows for Apache Airflow User Guide "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:network-interface/*" ] } ] } Full API and console access policy: AmazonMWAAFullApiAccess A user may need access to the AmazonMWAAFullApiAccess permissions policy if they need access to all Amazon MWAA APIs used to manage an environment. It does not grant permissions to access the Apache Airflow UI. Note A full API access policy must include permissions to perform iam:PassRole. This allows the user to pass service-linked roles, and execution roles, to Amazon MWAA. Amazon MWAA assumes each role in order to call other AWS services on your behalf. The following example uses the iam:PassedToService condition key to specify the Amazon MWAA service principal (airflow.amazonaws.com) as the service to which a role can be passed. For more information about iam:PassRole, see Granting a user permissions to pass a role to an AWS service in the IAM User Guide. 
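For context, the kind of Amazon MWAA API operations this policy allows can be illustrated with the AWS CLI as in the following sketch; the environment name and worker count are placeholder values for illustration only.

# List environments, describe one, and change a setting -- all require Amazon MWAA API permissions.
aws mwaa list-environments
aws mwaa get-environment --name MyAirflowEnvironment
aws mwaa update-environment --name MyAirflowEnvironment --max-workers 10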
Use the following policy if you want to create, and manage, your Amazon MWAA environments using an AWS owned key for encryption at-rest. Using an AWS owned key { "Version":"2012-10-17", "Statement":[ { "Effect":"Allow", "Action":"airflow:*", "Resource":"*" }, { "Effect":"Allow", "Action":[ "iam:PassRole" Full API access 53 Amazon Managed Workflows for Apache Airflow User Guide ], "Resource":"*", "Condition":{ "StringLike":{ "iam:PassedToService":"airflow.amazonaws.com" } } }, { "Effect":"Allow", "Action":[ "iam:CreateServiceLinkedRole" ], "Resource":"arn:aws:iam::*:role/aws-service-role/airflow.amazonaws.com/ AWSServiceRoleForAmazonMWAA" }, { "Effect":"Allow", "Action":[ "ec2:DescribeSecurityGroups", "ec2:DescribeSubnets", "ec2:DescribeVpcs", "ec2:DescribeRouteTables" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "s3:GetEncryptionConfiguration" ], "Resource":"arn:aws:s3:::*" }, { "Effect":"Allow", "Action":"ec2:CreateVpcEndpoint", "Resource":[ "arn:aws:ec2:*:*:vpc-endpoint/*", "arn:aws:ec2:*:*:vpc/*", "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:security-group/*" ] }, { Full API access 54 Amazon Managed Workflows for Apache Airflow User Guide "Effect":"Allow", "Action":[ "ec2:CreateNetworkInterface" ], "Resource":[ "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:network-interface/*" ] } ] } Use the following policy if you want to create, and manage, your Amazon MWAA environments using a customer managed key for encryption at-rest. To use a customer managed key, the IAM principal must have permission to access AWS KMS resources using the key stored in your account. Using a customer managed key { "Version":"2012-10-17", "Statement":[ { "Effect":"Allow", "Action":"airflow:*", "Resource":"*" }, { "Effect":"Allow", "Action":[ "iam:PassRole" ], "Resource":"*", "Condition":{ "StringLike":{ "iam:PassedToService":"airflow.amazonaws.com" } } }, { "Effect":"Allow", "Action":[ "iam:CreateServiceLinkedRole" ], Full API access 55 Amazon Managed Workflows for Apache Airflow User Guide "Resource":"arn:aws:iam::*:role/aws-service-role/airflow.amazonaws.com/ AWSServiceRoleForAmazonMWAA" }, { "Effect":"Allow", "Action":[ "ec2:DescribeSecurityGroups", "ec2:DescribeSubnets", "ec2:DescribeVpcs", "ec2:DescribeRouteTables" ], "Resource":"*" }, { "Effect":"Allow", "Action":[ "kms:DescribeKey", "kms:ListGrants", "kms:CreateGrant", "kms:RevokeGrant", "kms:Decrypt", "kms:Encrypt", "kms:GenerateDataKey*", "kms:ReEncrypt*" ], "Resource":"arn:aws:kms:*:YOUR_ACCOUNT_ID:key/YOUR_KMS_ID" }, { "Effect":"Allow", "Action":[ "s3:GetEncryptionConfiguration" ], "Resource":"arn:aws:s3:::*" }, { "Effect":"Allow", "Action":"ec2:CreateVpcEndpoint", "Resource":[ "arn:aws:ec2:*:*:vpc-endpoint/*", "arn:aws:ec2:*:*:vpc/*", "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:security-group/*" ] }, Full API access 56 Amazon Managed Workflows for Apache Airflow User Guide { "Effect":"Allow", "Action":[ "ec2:CreateNetworkInterface" ], "Resource":[ "arn:aws:ec2:*:*:subnet/*", "arn:aws:ec2:*:*:network-interface/*" ] } ] } Read-only console access policy: AmazonMWAAReadOnlyAccess A user may need access to the AmazonMWAAReadOnlyAccess permissions policy if they need to view the resources used by an environment on the Amazon MWAA console environment details page. It doesn't allow a user to create new environments, edit existing environments, or allow a user to view the Apache Airflow UI. 
{ "Version": "2012-10-17",
"Statement": [ { "Effect": "Allow", "Action": [ "airflow:ListEnvironments", "airflow:GetEnvironment", "airflow:ListTagsForResource" ], "Resource": "*" } ] } Apache Airflow UI access policy: AmazonMWAAWebServerAccess A user may need access to the AmazonMWAAWebServerAccess permissions policy if they need to access the Apache Airflow UI. It does not allow the user to view environments on the Amazon MWAA console or use the Amazon MWAA APIs to perform any actions. Specify the Admin, Op, User, Viewer or the Public role in {airflow-role} to customize the level of access for the Read-only console access 57 Amazon Managed Workflows for Apache Airflow User Guide user of the web token. For more information, see Default Roles in the Apache Airflow reference guide. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "airflow:CreateWebLoginToken", "Resource": [ "arn:aws:airflow:{your-region}:YOUR_ACCOUNT_ID:role/{your-environment- name}/{airflow-role}" ] } ] } Note • Amazon MWAA provides IAM integration with the five default Apache Airflow role- based access control (RBAC) roles. For more information on working with custom Apache Airflow roles, see the section called “Tutorial: Restricting users to a subset of DAGs”. • The Resource field in this policy could be used to specify the Apache Airflow role-based access control roles for the Amazon MWAA environment. However, it does not support the Amazon MWAA environment ARN (Amazon Resource Name) in the Resource field of the policy. Apache Airflow Rest API access policy: AmazonMWAARestAPIAccess To access the Apache Airflow REST API, you must grant the airflow:InvokeRestApi permission in your IAM policy. In the following policy sample, specify the Admin, Op, User, Viewer or the Public role in {airflow-role} to customize the level of user access. For more information, see Default Roles in the Apache Airflow reference guide. { "Version": "2012-10-17", Apache Airflow Rest API access 58 Amazon Managed Workflows for Apache Airflow User Guide "Statement": [ { "Sid": "AllowMwaaRestApiAccess", "Effect": "Allow", "Action": "airflow:InvokeRestApi", "Resource": [ "arn:aws:airflow:{your-region}:YOUR_ACCOUNT_ID:role/{your-environment-name}/ {airflow-role}" ] } ] } Note • While configuring a private web server, the InvokeRestApi action cannot be invoked from outside of a Virtual Private Cloud (VPC). You can use the aws:SourceVpc key to apply more granular access control for this operation. For more information, see aws:SourceVpc • The Resource field in this policy could be used to specify the Apache Airflow role-based access control roles for the Amazon MWAA environment. However, it does not support the Amazon MWAA environment ARN (Amazon Resource Name) in the Resource field of the policy. Apache Airflow CLI policy: AmazonMWAAAirflowCliAccess A user may need access to the AmazonMWAAAirflowCliAccess permissions policy if they need to run Apache Airflow CLI commands (such as trigger_dag). It does not allow the user to view environments on the Amazon MWAA console or use the Amazon MWAA APIs to perform any actions. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "airflow:CreateCliToken" Apache Airflow CLI access 59 Amazon Managed Workflows for Apache Airflow User Guide ], "Resource": "arn:aws:airflow:${Region}:${Account}:environment/ ${EnvironmentName}" } ] } Creating a JSON policy You can create the JSON policy, and attach the policy to your user, role, or group on the IAM console. 
The following steps describe how to create a JSON policy in IAM. To create the JSON policy 1. Open the Policies page on the IAM console. 2. Choose Create policy. 3. Choose the JSON tab. 4. Add your JSON policy. 5. Choose Review policy. 6. Enter a value in the text field for Name and Description (optional). For example, you could name the policy AmazonMWAAReadOnlyAccess. 7. Choose Create policy. Example use case to attach policies to a developer group Let's say you're using a group in IAM named AirflowDevelopmentGroup to apply permissions to all of the developers on your Apache Airflow development team. These users need access to the AmazonMWAAFullConsoleAccess, AmazonMWAAAirflowCliAccess, and AmazonMWAAWebServerAccess permission policies. This section describes how to create a group in IAM, create and attach these policies, and associate the group to an IAM user. The steps assume you're using an AWS owned key. To create the AmazonMWAAFullConsoleAccess policy 1. Download the AmazonMWAAFullConsoleAccess access policy. 2. Open the Policies page on the IAM console. Creating a JSON policy 60 Amazon Managed Workflows for Apache Airflow User Guide 3. Choose Create policy. 4. Choose the JSON tab. 5. Paste the JSON policy for AmazonMWAAFullConsoleAccess. 6. Substitute the following values: a. {your-account-id} – Your AWS account ID (such as 0123456789) b. {your-kms-id} – The unique identifer for a customer managed key, applicable only if you use a customer managed key for encryption at-rest. 7. Choose the Review policy. 8. Type AmazonMWAAFullConsoleAccess in Name. 9. Choose Create policy. To create the AmazonMWAAWebServerAccess policy 1. Download the AmazonMWAAWebServerAccess access policy. 2. Open the Policies page on the IAM console. 3. Choose Create policy. 4. Choose the JSON tab. 5. Paste the JSON policy for AmazonMWAAWebServerAccess. 6. Substitute the following values: a. {your-region} – the region of your Amazon
Substitute the following values: a. {your-account-id} – Your AWS account ID (such as 0123456789) b. {your-kms-id} – The unique identifer for a customer managed key, applicable only if you use a customer managed key for encryption at-rest. 7. Choose the Review policy. 8. Type AmazonMWAAFullConsoleAccess in Name. 9. Choose Create policy. To create the AmazonMWAAWebServerAccess policy 1. Download the AmazonMWAAWebServerAccess access policy. 2. Open the Policies page on the IAM console. 3. Choose Create policy. 4. Choose the JSON tab. 5. Paste the JSON policy for AmazonMWAAWebServerAccess. 6. Substitute the following values: a. {your-region} – the region of your Amazon MWAA environment (such as us-east-1) b. {your-account-id} – your AWS account ID (such as 0123456789) c. {your-environment-name} – your Amazon MWAA environment name (such as MyAirflowEnvironment) d. {airflow-role} – the Admin Apache Airflow Default Role 7. Choose Review policy. 8. Type AmazonMWAAWebServerAccess in Name. 9. Choose Create policy. To create the AmazonMWAAAirflowCliAccess policy 1. Download the AmazonMWAAAirflowCliAccess access policy. Example use case 61 Amazon Managed Workflows for Apache Airflow User Guide 2. Open the Policies page on the IAM console. 3. Choose Create policy. 4. Choose the JSON tab. 5. Paste the JSON policy for AmazonMWAAAirflowCliAccess. 6. Choose the Review policy. 7. Type AmazonMWAAAirflowCliAccess in Name. 8. Choose Create policy. To create the group 1. Open the Groups page on the IAM console. 2. Type a name of AirflowDevelopmentGroup. 3. Choose Next Step. 4. 5. Type AmazonMWAA to filter results in Filter. Select the three policies you created. 6. Choose Next Step. 7. Choose Create Group. To associate to a user 1. Open the Users page on the IAM console. 2. Choose a user. 3. Choose Groups. 4. Choose Add user to groups. 5. Select the AirflowDevelopmentGroup. 6. Choose Add to Groups. What's next? • Learn how to generate a token to access the Apache Airflow UI in Accessing Apache Airflow. • Learn more about creating IAM policies in Creating IAM policies. What's next? 62 Amazon Managed Workflows for Apache Airflow User Guide Service-linked role for Amazon MWAA Amazon Managed Workflows for Apache Airflow uses AWS Identity and Access Management (IAM) service-linked roles. A service-linked role is a unique type of IAM role that is linked directly to Amazon MWAA. Service-linked roles are predefined by Amazon MWAA and include all the permissions that the service requires to call other AWS services on your behalf. A service-linked role makes setting up Amazon MWAA easier because you don’t have to manually add the necessary permissions. Amazon MWAA defines the permissions of its service-linked roles, and unless defined otherwise, only Amazon MWAA can assume its roles. The defined permissions include the trust policy and the permissions policy, and that permissions policy cannot be attached to any other IAM entity. You can delete a service-linked role only after first deleting their related resources. This protects your Amazon MWAA resources because you can't inadvertently remove permission to access the resources. For information about other services that support service-linked roles, see AWS Services That Work with IAM and look for the services that have Yes in the Service-linked roles column. Choose a Yes with a link to view the service-linked role documentation for that service. 
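The console steps in the example use case above can also be scripted. The following Python (boto3) sketch creates the AirflowDevelopmentGroup, attaches the three policies, and adds a user to the group; the account ID and user name are placeholders.

import boto3

iam = boto3.client("iam")
account_id = "111122223333"  # placeholder AWS account ID

iam.create_group(GroupName="AirflowDevelopmentGroup")

# Attach the three customer managed policies created in the example above.
for policy_name in (
    "AmazonMWAAFullConsoleAccess",
    "AmazonMWAAWebServerAccess",
    "AmazonMWAAAirflowCliAccess",
):
    iam.attach_group_policy(
        GroupName="AirflowDevelopmentGroup",
        PolicyArn=f"arn:aws:iam::{account_id}:policy/{policy_name}",
    )

# Add an existing IAM user (placeholder name) to the group.
iam.add_user_to_group(
    GroupName="AirflowDevelopmentGroup",
    UserName="airflow-developer",
)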
Service-linked role permissions for Amazon MWAA Amazon MWAA uses the service-linked role named AWSServiceRoleForAmazonMWAA – The service-linked role created in your account grants Amazon MWAA access to the following AWS services: • Amazon CloudWatch Logs (CloudWatch Logs) – To create log groups for Apache Airflow logs. • Amazon CloudWatch (CloudWatch) – To publish metrics related to your environment and its underlying components to your account. • Amazon Elastic Compute Cloud (Amazon EC2) – To create the following resources: • An Amazon VPC endpoint in your VPC for an AWS-managed Amazon Aurora PostgreSQL database cluster to be used by the Apache Airflow Scheduler and Worker. • An additional Amazon VPC endpoint to enable network access to the Web server if you choose the private network option for your Apache Airflow Web server. • Elastic Network Interfaces (ENIs) in your Amazon VPC to enable network access to AWS resources hosted in your Amazon VPC. Service-linked role 63 Amazon Managed Workflows for Apache Airflow User Guide The following trust policy allows the service principal to assume the service-linked role. The service principal for Amazon MWAA is airflow.amazonaws.com as demonstrated by the policy. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "airflow.amazonaws.com" }, "Action": "sts:AssumeRole" } ] } The role permissions policy named AmazonMWAAServiceRolePolicy allows Amazon MWAA to complete the following actions on the specified resources: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:CreateLogGroup", "logs:DescribeLogGroups" ], "Resource": "arn:aws:logs:*:*:log-group:airflow-*:*" }, { "Effect": "Allow", "Action": [ "ec2:AttachNetworkInterface", "ec2:CreateNetworkInterface", "ec2:CreateNetworkInterfacePermission", "ec2:DeleteNetworkInterface", "ec2:DeleteNetworkInterfacePermission", "ec2:DescribeDhcpOptions", "ec2:DescribeNetworkInterfaces", "ec2:DescribeSecurityGroups", "ec2:DescribeSubnets", Service-linked role permissions for Amazon MWAA 64 Amazon Managed Workflows for Apache Airflow User Guide
to assume the service-linked role. The service principal for Amazon MWAA is airflow.amazonaws.com as demonstrated by the policy. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "airflow.amazonaws.com" }, "Action": "sts:AssumeRole" } ] } The role permissions policy named AmazonMWAAServiceRolePolicy allows Amazon MWAA to complete the following actions on the specified resources: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:CreateLogGroup", "logs:DescribeLogGroups" ], "Resource": "arn:aws:logs:*:*:log-group:airflow-*:*" }, { "Effect": "Allow", "Action": [ "ec2:AttachNetworkInterface", "ec2:CreateNetworkInterface", "ec2:CreateNetworkInterfacePermission", "ec2:DeleteNetworkInterface", "ec2:DeleteNetworkInterfacePermission", "ec2:DescribeDhcpOptions", "ec2:DescribeNetworkInterfaces", "ec2:DescribeSecurityGroups", "ec2:DescribeSubnets", Service-linked role permissions for Amazon MWAA 64 Amazon Managed Workflows for Apache Airflow User Guide "ec2:DescribeVpcEndpoints", "ec2:DescribeVpcs", "ec2:DetachNetworkInterface" ], "Resource": "*" }, { "Effect": "Allow", "Action": "ec2:CreateVpcEndpoint", "Resource": "arn:aws:ec2:*:*:vpc-endpoint/*", "Condition": { "ForAnyValue:StringEquals": { "aws:TagKeys": "AmazonMWAAManaged" } } }, { "Effect": "Allow", "Action": [ "ec2:ModifyVpcEndpoint", "ec2:DeleteVpcEndpoints" ], "Resource": "arn:aws:ec2:*:*:vpc-endpoint/*", "Condition": { "Null": { "aws:ResourceTag/AmazonMWAAManaged": false } } }, { "Effect": "Allow", "Action": [ "ec2:CreateVpcEndpoint", "ec2:ModifyVpcEndpoint" ], "Resource": [ "arn:aws:ec2:*:*:vpc/*", "arn:aws:ec2:*:*:security-group/*", "arn:aws:ec2:*:*:subnet/*" ] }, { "Effect": "Allow", "Action": "ec2:CreateTags", Service-linked role permissions for Amazon MWAA 65 Amazon Managed Workflows for Apache Airflow User Guide "Resource": "arn:aws:ec2:*:*:vpc-endpoint/*", "Condition": { "StringEquals": { "ec2:CreateAction": "CreateVpcEndpoint" }, "ForAnyValue:StringEquals": { "aws:TagKeys": "AmazonMWAAManaged" } } }, { "Effect": "Allow", "Action": "cloudwatch:PutMetricData", "Resource": "*", "Condition": { "StringEquals": { "cloudwatch:namespace": [ "AWS/MWAA" ] } } } ] } You must configure permissions to allow an IAM entity (such as a user, group, or role) to create, edit, or delete a service-linked role. For more information, see Service-linked role permissions in the IAM User Guide. Creating a service-linked role for Amazon MWAA You don't need to manually create a service-linked role. When you create a new Amazon MWAA environment using the AWS Management Console, the AWS CLI, or the AWS API, Amazon MWAA creates the service-linked role for you. If you delete this service-linked role, and then need to create it again, you can use the same process to recreate the role in your account. When you create another environment, Amazon MWAA creates the service-linked role for you again. Creating a service-linked role for Amazon MWAA 66 Amazon Managed Workflows for Apache Airflow User Guide Editing a service-linked role for Amazon MWAA Amazon MWAA does not allow you to edit the AWSServiceRoleForAmazonMWAA service-linked role. After you create a service-linked role, you cannot change the name of the role because various entities might reference the role. However, you can edit the description of the role using IAM. For more information, see Editing a service-linked role in the IAM User Guide. 
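If you prefer to inspect the service-linked role or update its description outside of the console, the following Python (boto3) sketch shows both calls; the description text is only an example.

import boto3

iam = boto3.client("iam")

# View the service-linked role that Amazon MWAA created in your account.
role = iam.get_role(RoleName="AWSServiceRoleForAmazonMWAA")
print(role["Role"]["Arn"])

# Only the description of a service-linked role can be changed.
iam.update_role_description(
    RoleName="AWSServiceRoleForAmazonMWAA",
    Description="Service-linked role used by Amazon MWAA environments",
)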
Deleting a service-linked role for Amazon MWAA If you no longer need to use a feature or service that requires a service-linked role, we recommend that you delete that role. That way you don’t have an unused entity that is not actively monitored or maintained. When you delete an Amazon MWAA environment, Amazon MWAA deletes all the associated resources it uses as a part of the service. However, you must wait before Amazon MWAA completes deleting your environment, before attempting to delete the service-linked role. If you delete the service-linked role before Amazon MWAA deletes the environment, Amazon MWAA might be unable to delete all of the environment's associated resources. To manually delete the service-linked role using IAM Use the IAM console, the AWS CLI, or the AWS API to delete the AWSServiceRoleForAmazonMWAA service-linked role. For more information, see Deleting a service-linked role in the IAM User Guide. Supported regions for Amazon MWAA service-linked roles Amazon MWAA supports using service-linked roles in all of the regions where the service is available. For more information, see Amazon Managed Workflows for Apache Airflow endpoints and quotas. Policy updates Change Description Date Amazon MWAA update its service-linked role permission policy AmazonMWAAServiceR olePolicy – Amazon MWAA updates the permissio n policy for its service-l inked role to grant Amazon November 18, 2022 Editing a service-linked role for Amazon MWAA 67 Amazon Managed Workflows for Apache Airflow User Guide Change Description Date MWAA permission to publish additional metrics related to the service's underlyin g resources to customer accounts. These new metrics are published under the AWS/ MWAA Amazon MWAA started tracking changes for its AWS managed service-linked role permission policy. November 18, 2022 Amazon MWAA started tracking changes Amazon MWAA execution role An execution role is an AWS Identity and Access Management (IAM) role with a permissions policy that grants Amazon Managed Workflows for Apache Airflow permission to invoke the resources of other AWS services on your behalf. This can include resources such as your Amazon S3 bucket, AWS owned key, and CloudWatch Logs. Amazon MWAA environments need one execution role per environment. This topic describes how to use and configure the execution role for your environment to allow Amazon MWAA to access other AWS resources used by your environment. Contents • Execution role overview • Permissions attached by default • How
An execution role is an AWS Identity and Access Management (IAM) role with a permissions policy that grants Amazon Managed Workflows for Apache Airflow permission to invoke the resources of other AWS services on your behalf. This can include resources such as your Amazon S3 bucket, AWS owned key, and CloudWatch Logs. Amazon MWAA environments need one execution role per environment. This topic describes how to use and configure the execution role for your environment to allow Amazon MWAA to access other AWS resources used by your environment. Contents • Execution role overview • Permissions attached by default • How to add permission to use other AWS services • How to associate a new execution role • Create a new role • View and update an execution role policy • Attach a JSON policy to use other AWS services • Grant access to Amazon S3 bucket with account-level public access block • Use Apache Airflow connections • Sample JSON policies for an execution role Execution role 68 Amazon Managed Workflows for Apache Airflow User Guide • Sample policy for a customer managed key • Sample policy for an AWS owned key • What's next? Execution role overview Permission for Amazon MWAA to use other AWS services used by your environment are obtained from the execution role. An Amazon MWAA execution role needs permission to the following AWS services used by an environment: • Amazon CloudWatch (CloudWatch) – to send Apache Airflow metrics and logs. • Amazon Simple Storage Service (Amazon S3) – to parse your environment's DAG code and supporting files (such as a requirements.txt). • Amazon Simple Queue Service (Amazon SQS) – to queue your environment's Apache Airflow tasks in an Amazon SQS queue owned by Amazon MWAA. • AWS Key Management Service (AWS KMS) – for your environment's data encryption (using either an AWS owned key or your Customer managed key). Note If you have elected for Amazon MWAA to use an AWS owned KMS key to encrypt your data, then you must define permissions in a policy attached to your Amazon MWAA execution role that grant access to arbitrary KMS keys stored outside of your account via Amazon SQS. The following two conditions are required in order for your environment's execution role to access arbitrary KMS keys: • A KMS key in a third-party account needs to allow this cross account access via its resource policy. • Your DAG code needs to access an Amazon SQS queue that starts with airflow- celery- in the third-party account and uses the same KMS key for encryption. In order to mitigate the risks associated with cross-account access to resources, we recommend reviewing the code placed in your DAGs to ensure that your workflows are not accessing arbitrary Amazon SQS queues outside your account. Furthermore, you can use a customer managed KMS key stored in your own account to manage encryption on Amazon MWAA. This limits your environment's execution role to access only the KMS key in your account. Execution role overview 69 Amazon Managed Workflows for Apache Airflow User Guide Keep in mind that after you choose an encryption option, you cannot change your selection for an existing environment. An execution role also needs permission to the following IAM actions: • airflow:PublishMetrics – to allow Amazon MWAA to monitor the health of an environment. Permissions attached by default You can use the default options on the Amazon MWAA console to create an execution role and an AWS owned key, then use the steps on this page to add permission policies to your execution role. 
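To confirm which execution role an existing environment uses, and which permission policies are currently attached to it, you can use a short Python (boto3) sketch such as the following; the environment name is a placeholder.

import boto3

mwaa = boto3.client("mwaa")
iam = boto3.client("iam")

# Look up the execution role attached to an existing environment.
env = mwaa.get_environment(Name="MyAirflowEnvironment")["Environment"]
role_name = env["ExecutionRoleArn"].split("/")[-1]
print(role_name)

# List the managed and inline policies currently attached to the role.
print(iam.list_attached_role_policies(RoleName=role_name)["AttachedPolicies"])
print(iam.list_role_policies(RoleName=role_name)["PolicyNames"])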
• When you choose the Create new role option on the console, Amazon MWAA attaches the minimal permissions needed by an environment to your execution role.
• In some cases, Amazon MWAA attaches the maximum permissions. For example, we recommend choosing the option on the Amazon MWAA console to create an execution role when you create an environment. Amazon MWAA then automatically adds the permissions policies for all CloudWatch Logs groups by using the regex pattern "arn:aws:logs:your-region:your-account-id:log-group:airflow-your-environment-name-*" in the execution role.

How to add permission to use other AWS services

Amazon MWAA can't add or edit permission policies on an existing execution role after an environment is created. You must update your execution role with the additional permission policies that your environment needs. For example, if your DAG requires access to AWS Glue, Amazon MWAA can't automatically detect that your environment requires these permissions, or add them to your execution role. You can add permissions to an execution role in two ways:

• By modifying the JSON policy for your execution role inline. You can use the sample JSON policy documents on this page to either add to or replace the JSON policy of your execution role on the IAM console.
• By creating a JSON policy for an AWS service and attaching it to your execution role. You can use the steps on this page to associate a new JSON policy document for an AWS service with your execution role on the IAM console.

Assuming the execution role is already associated with your environment, Amazon MWAA can start using the added permission policies immediately. This also means that if you remove any required permissions from an execution role, your DAGs may fail.

How to associate a new execution role

You can change the execution role for your environment at any time. If a new execution role is not already associated with your environment, use the steps on this page to create a new execution role policy, and associate the role with your environment.

Create a new role

By default, Amazon MWAA creates an AWS owned key for data encryption and an execution role on your behalf. You can choose the default options on the Amazon MWAA console when you create an environment. The following image shows the default option to create an execution role for an environment.

View and update an execution role policy

You can view the execution role for your environment on the Amazon MWAA console, and update the JSON policy for the role on the IAM console.

To update an execution role policy

1. Open the Environments page on the Amazon MWAA console.
2. Choose an environment.
3. Choose the execution role on the Permissions pane to open the permissions page in IAM.
4. Choose the execution role name to open the permissions policy.
5. Choose Edit policy.
6. Choose the JSON tab.
7. Update your JSON policy.
8. Choose Review policy.
9. Choose Save changes.

Attach a JSON policy to use other AWS services

You can create a JSON policy for an AWS service and attach it to your execution role. For example, you can attach the following JSON policy to grant read-only access to all resources in AWS Secrets Manager.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetResourcePolicy",
        "secretsmanager:GetSecretValue",
        "secretsmanager:DescribeSecret",
        "secretsmanager:ListSecretVersionIds"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}

To attach a policy to your execution role

1. Open the Environments page on the Amazon MWAA console.
2. Choose an environment.
3. Choose your execution role on the Permissions pane.
4. Choose Attach policies.
5. Choose Create policy.
6. Choose JSON.
7. Paste the JSON policy.
8. Choose Next: Tags, Next: Review.
9. Enter a descriptive name (such as SecretsManagerReadPolicy) and a description for the policy.
10. Choose Create policy.
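As an alternative to the console steps above, the following Python (boto3) sketch adds the same read-only Secrets Manager permissions as an inline policy on the execution role instead of a separate managed policy; the role name is a placeholder.

import json
import boto3

iam = boto3.client("iam")

# Read-only AWS Secrets Manager statement shown above.
secrets_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "secretsmanager:GetResourcePolicy",
                "secretsmanager:GetSecretValue",
                "secretsmanager:DescribeSecret",
                "secretsmanager:ListSecretVersionIds",
            ],
            "Resource": ["*"],
        }
    ],
}

# Attach the statement as an inline policy on the execution role.
iam.put_role_policy(
    RoleName="my-mwaa-execution-role",  # placeholder role name
    PolicyName="SecretsManagerReadPolicy",
    PolicyDocument=json.dumps(secrets_policy),
)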
Grant access to Amazon S3 bucket with account-level public access block You might want to block access to all buckets in your account by using the PutPublicAccessBlock Amazon S3 operation. When you block access to all buckets in your account, your environment execution role must include the s3:GetAccountPublicAccessBlock action in a permission policy. The following example demonstrates the policy you must attach to your execution role when blocking access to all Amazon S3 buckets in your account. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "s3:GetAccountPublicAccessBlock", "Resource": "*" } ] } Grant access to Amazon S3 bucket with account-level public access block 73 Amazon Managed Workflows for Apache Airflow User Guide For more information about restricting access to your Amazon S3 buckets, see Blocking public access to your Amazon S3 storage in the Amazon Simple Storage Service User Guide. Use Apache Airflow connections You can also create an Apache Airflow connection and specify your execution role and its ARN in your Apache Airflow connection object. To learn more, see Managing connections to Apache Airflow. Sample JSON policies for an execution role The sample permission policies in this section show two policies you can use to replace the permissions policy used for your existing execution role, or to create a new execution role and use for your environment. These policies contain Resource ARN placeholders for Apache Airflow log groups, an Amazon S3 bucket, and an Amazon MWAA environment. We recommend
Use Apache Airflow connections You can also create an Apache Airflow connection and specify your execution role and its ARN in your Apache Airflow connection object. To learn more, see Managing connections to Apache Airflow. Sample JSON policies for an execution role The sample permission policies in this section show two policies you can use to replace the permissions policy used for your existing execution role, or to create a new execution role and use for your environment. These policies contain Resource ARN placeholders for Apache Airflow log groups, an Amazon S3 bucket, and an Amazon MWAA environment. We recommend copying the example policy, replacing the sample ARNs or placeholders, then using the JSON policy to create or update an execution role. For example, replacing {your-region} with us-east-1. Sample policy for a customer managed key The following example shows an execution role policy you can use for an Customer managed key. { "Version": "2012-10-17", "Statement": [ { "Effect": "Deny", "Action": "s3:ListAllMyBuckets", "Resource": [ "arn:aws:s3:::{your-s3-bucket-name}", "arn:aws:s3:::{your-s3-bucket-name}/*" ] }, { "Effect": "Allow", "Action": [ "s3:GetObject*", "s3:GetBucket*", "s3:List*" ], Use Apache Airflow connections 74 Amazon Managed Workflows for Apache Airflow "Resource": [ "arn:aws:s3:::{your-s3-bucket-name}", "arn:aws:s3:::{your-s3-bucket-name}/*" User Guide ] }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:CreateLogGroup", "logs:PutLogEvents", "logs:GetLogEvents", "logs:GetLogRecord", "logs:GetLogGroupFields", "logs:GetQueryResults" ], "Resource": [ "arn:aws:logs:{your-region}:{your-account-id}:log-group:airflow-{your- environment-name}-*" ] }, { "Effect": "Allow", "Action": [ "logs:DescribeLogGroups" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "s3:GetAccountPublicAccessBlock" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": "cloudwatch:PutMetricData", "Resource": "*" }, Sample policies 75 Amazon Managed Workflows for Apache Airflow User Guide { "Effect": "Allow", "Action": [ "sqs:ChangeMessageVisibility", "sqs:DeleteMessage", "sqs:GetQueueAttributes", "sqs:GetQueueUrl", "sqs:ReceiveMessage", "sqs:SendMessage" ], "Resource": "arn:aws:sqs:{your-region}:*:airflow-celery-*" }, { "Effect": "Allow", "Action": [ "kms:Decrypt", "kms:DescribeKey", "kms:GenerateDataKey*", "kms:Encrypt" ], "Resource": "arn:aws:kms:{your-region}:{your-account-id}:key/{your-kms-cmk- id}", "Condition": { "StringLike": { "kms:ViaService": [ "sqs.{your-region}.amazonaws.com", "s3.{your-region}.amazonaws.com" ] } } } ] } Next, you need to allow Amazon MWAA to assume this role in order to perform actions on your behalf. This can be done by adding "airflow.amazonaws.com" and "airflow- env.amazonaws.com" service principals to the list of trusted entities for this execution role using the IAM console, or by placing these service principals in the assume role policy document for this execution role via the IAM create-role command using the AWS CLI. A sample assume role policy document can be found below: { Sample policies 76 Amazon Managed Workflows for Apache Airflow User Guide "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": ["airflow.amazonaws.com","airflow-env.amazonaws.com"] }, "Action": "sts:AssumeRole" } ] } Then attach the following JSON policy to your Customer managed key. This policy uses the kms:EncryptionContext condition key prefix to permit access to your Apache Airflow logs group in CloudWatch Logs. 
{ "Sid": "Allow logs access", "Effect": "Allow", "Principal": { "Service": "logs.{your-region}.amazonaws.com" }, "Action": [ "kms:Encrypt*", "kms:Decrypt*", "kms:ReEncrypt*", "kms:GenerateDataKey*", "kms:Describe*" ], "Resource": "*", "Condition": { "ArnLike": { "kms:EncryptionContext:aws:logs:arn": "arn:aws:logs:{your-region}:{your- account-id}:*" } } } Sample policy for an AWS owned key The following example shows an execution role policy you can use for an AWS owned key. Sample policies 77 Amazon Managed Workflows for Apache Airflow User Guide { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "airflow:PublishMetrics", "Resource": "arn:aws:airflow:{your-region}:{your-account-id}:environment/ {your-environment-name}" }, { "Effect": "Deny", "Action": "s3:ListAllMyBuckets", "Resource": [ "arn:aws:s3:::{your-s3-bucket-name}", "arn:aws:s3:::{your-s3-bucket-name}/*" ] }, { "Effect": "Allow", "Action": [ "s3:GetObject*", "s3:GetBucket*", "s3:List*" ], "Resource": [ "arn:aws:s3:::{your-s3-bucket-name}", "arn:aws:s3:::{your-s3-bucket-name}/*" ] }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:CreateLogGroup", "logs:PutLogEvents", "logs:GetLogEvents", "logs:GetLogRecord", "logs:GetLogGroupFields", "logs:GetQueryResults" ], "Resource": [ "arn:aws:logs:{your-region}:{your-account-id}:log-group:airflow-{your- environment-name}-*" Sample policies 78 Amazon Managed Workflows for Apache Airflow User Guide ] }, { "Effect": "Allow", "Action": [ "logs:DescribeLogGroups" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "s3:GetAccountPublicAccessBlock" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": "cloudwatch:PutMetricData", "Resource": "*" }, { "Effect": "Allow", "Action": [ "sqs:ChangeMessageVisibility", "sqs:DeleteMessage", "sqs:GetQueueAttributes", "sqs:GetQueueUrl", "sqs:ReceiveMessage", "sqs:SendMessage" ], "Resource": "arn:aws:sqs:{your-region}:*:airflow-celery-*" }, { "Effect": "Allow", "Action": [ "kms:Decrypt", "kms:DescribeKey", "kms:GenerateDataKey*", "kms:Encrypt" Sample policies 79 Amazon Managed Workflows for Apache Airflow User Guide ], "NotResource": "arn:aws:kms:*:{your-account-id}:key/*", "Condition": { "StringLike": { "kms:ViaService": [ "sqs.{your-region}.amazonaws.com" ] } } } ] } What's next? • Learn about the required permissions you and your Apache Airflow users need to access your environment in Accessing an Amazon MWAA environment. • Learn about Using customer managed keys for encryption. • Explore more Customer managed policy examples. Cross-service confused deputy prevention The confused deputy problem is a security issue where an entity that doesn't have permission to perform an action can coerce a more-privileged entity to perform the action. In AWS, cross-service impersonation can result in the confused deputy problem. Cross-service impersonation can occur when one service (the calling service) calls another service (the called service). The calling service can be manipulated to use its permissions to act on another customer's resources in a way it should not otherwise have permission to access. To prevent this, AWS provides tools that help you protect your data for all services
confused deputy prevention The confused deputy problem is a security issue where an entity that doesn't have permission to perform an action can coerce a more-privileged entity to perform the action. In AWS, cross-service impersonation can result in the confused deputy problem. Cross-service impersonation can occur when one service (the calling service) calls another service (the called service). The calling service can be manipulated to use its permissions to act on another customer's resources in a way it should not otherwise have permission to access. To prevent this, AWS provides tools that help you protect your data for all services with service principals that have been given access to resources in your account. We recommend using the aws:SourceArn and aws:SourceAccount global condition context keys in your environment' execution role to limit the permissions that Amazon MWAA gives another service to access the resource. Use aws:SourceArn if you want only one resource to be associated with the cross-service access. Use aws:SourceAccount if you want to allow any resource in that account to be associated with the cross-service use. The most effective way to protect against the confused deputy problem is to use the aws:SourceArn global condition context key with the full ARN of the resource. If you don't know What's next? 80 Amazon Managed Workflows for Apache Airflow User Guide the full ARN of the resource or if you are specifying multiple resources, use the aws:SourceArn global context condition key with wildcard characters (*) for the unknown portions of the ARN. For example, arn:aws:airflow:*:123456789012:environment/*. The value of aws:SourceArn must be your Amazon MWAA environment ARN, for which you are creating an execution role. The following example shows how you can use the aws:SourceArn and aws:SourceAccount global condition context keys in your environment's execution role trust policy to prevent the confused deputy problem. You can use the following trust policy when you create a new execution role. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": ["airflow.amazonaws.com","airflow-env.amazonaws.com"] }, "Action": "sts:AssumeRole", "Condition":{ "ArnLike":{ "aws:SourceArn":"arn:aws:airflow:your- region:123456789012:environment/your-environment-name" }, "StringEquals":{ "aws:SourceAccount":"123456789012" } } } ] } Apache Airflow access modes The Amazon Managed Workflows for Apache Airflow console contains built-in options to configure private or public routing to the Apache Airflow web server on your environment. This guide describes the access modes available for the Apache Airflow Web server on your Amazon Managed Workflows for Apache Airflow environment, and the additional resources you'll need to configure in your Amazon VPC if you choose the private network option. Apache Airflow access modes 81 Amazon Managed Workflows for Apache Airflow User Guide Contents • Apache Airflow access modes • Public network • Private network • Access modes overview • Public network access mode • Private network access mode • Setup for private and public access modes • Setup for public network • Setup for private network • Accessing the VPC endpoint for your Apache Airflow Web server (private network access) Apache Airflow access modes You can choose private or public routing for your Apache Airflow Web server. To enable private routing, choose Private network. This limits user access to an Apache Airflow Web server to within an Amazon VPC. 
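The trust policy in the confused deputy section above can also be applied to an existing execution role programmatically. The following Python (boto3) sketch uses placeholder values for the role name, Region, account ID, and environment name.

import json
import boto3

iam = boto3.client("iam")

# Trust policy from the confused deputy example, with placeholder values.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": ["airflow.amazonaws.com", "airflow-env.amazonaws.com"]
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:airflow:us-east-1:123456789012:environment/MyAirflowEnvironment"
                },
                "StringEquals": {"aws:SourceAccount": "123456789012"},
            },
        }
    ],
}

# Replace the assume role (trust) policy on the execution role.
iam.update_assume_role_policy(
    RoleName="my-mwaa-execution-role",  # placeholder role name
    PolicyDocument=json.dumps(trust_policy),
)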
To enable public routing, choose Public network. This allows users to access the Apache Airflow Web server over the Internet. Public network The following architectural diagram shows an Amazon MWAA environment with a public web server. Apache Airflow access modes 82 Amazon Managed Workflows for Apache Airflow User Guide The public network access mode allows the Apache Airflow UI to be accessed over the internet by users granted access to the IAM policy for your environment. The following image shows where to find the Public network option on the Amazon MWAA console. Private network The following architectural diagram shows an Amazon MWAA environment with a private web server. Apache Airflow access modes 83 Amazon Managed Workflows for Apache Airflow User Guide The private network access mode limits access to the Apache Airflow UI to users within your Amazon VPC that have been granted access to the IAM policy for your environment. When you create an environment with private web server access, you must package all of your dependencies in a Python wheel archive (.whl), then reference the .whl in your requirements.txt. For instructions on packaging and installing your dependencies using wheel, see Managing dependencies using Python wheel. The following image shows where to find the Private network option on the Amazon MWAA console. Access modes overview This section describes the VPC endpoints (AWS PrivateLink) created in your Amazon VPC when you choose the Public network or Private network access mode. Access modes overview 84 Amazon Managed Workflows for Apache Airflow User Guide Public network access mode If you chose the Public network access mode for your Apache Airflow Web server, network traffic is publicly routed over the
.whl in your requirements.txt. For instructions on packaging and installing your dependencies using wheel, see Managing dependencies using Python wheel. The following image shows where to find the Private network option on the Amazon MWAA console. Access modes overview This section describes the VPC endpoints (AWS PrivateLink) created in your Amazon VPC when you choose the Public network or Private network access mode. Access modes overview 84 Amazon Managed Workflows for Apache Airflow User Guide Public network access mode If you chose the Public network access mode for your Apache Airflow Web server, network traffic is publicly routed over the Internet. • Amazon MWAA creates a VPC interface endpoint for your Amazon Aurora PostgreSQL metadata database. The endpoint is created in the Availability Zones mapped to your private subnets and is independent from other AWS accounts. • Amazon MWAA then binds an IP address from your private subnets to the interface endpoints. This is designed to support the best practice of binding a single IP from each Availability Zone of the Amazon VPC. Private network access mode If you chose the Private network access mode for your Apache Airflow Web server, network traffic is privately routed within your Amazon VPC. • Amazon MWAA creates a VPC interface endpoint for your Apache Airflow Web server, and an interface endpoint for your Amazon Aurora PostgreSQL metadata database. The endpoints are created in the Availability Zones mapped to your private subnets and is independent from other AWS accounts. • Amazon MWAA then binds an IP address from your private subnets to the interface endpoints. This is designed to support the best practice of binding a single IP from each Availability Zone of the Amazon VPC. To learn more, see the section called “Example use cases for an Amazon VPC and Apache Airflow access mode”. Setup for private and public access modes The following section describes the additional setup and configurations you'll need based on the Apache Airflow access mode you've chosen for your environment. Setup for public network If you choose the Public network option for your Apache Airflow Web server, you can begin using the Apache Airflow UI after you create your environment. Setup for private and public access modes 85 Amazon Managed Workflows for Apache Airflow User Guide You'll need to take the following steps to configure access for your users, and permission for your environment to use other AWS services. 1. Add permissions. Amazon MWAA needs permission to use other AWS services. When you create an environment, Amazon MWAA creates a service-linked role that allows it to use certain IAM actions for Amazon Elastic Container Registry (Amazon ECR), CloudWatch Logs, and Amazon EC2. You can add permission to use additional actions for these services, or to use other AWS services by adding permissions to your execution role. To learn more, see Amazon MWAA execution role. 2. Create user policies. You may need to create multiple IAM policies for your users to configure access to your environment and Apache Airflow UI. To learn more, see Accessing an Amazon MWAA environment. Setup for private network If you choose the Private network option for your Apache Airflow Web server, you'll need to configure access for your users, permission for your environment to use other AWS services, and create a mechanism to access the resources in your Amazon VPC from your computer. 1. Add permissions. Amazon MWAA needs permission to use other AWS services. 
When you create an environment, Amazon MWAA creates a service-linked role that allows it to use certain IAM actions for Amazon Elastic Container Registry (Amazon ECR), CloudWatch Logs, and Amazon EC2. You can add permission to use additional actions for these services, or to use other AWS services by adding permissions to your execution role. To learn more, see Amazon MWAA execution role. 2. Create user policies. You may need to create multiple IAM policies for your users to configure access to your environment and Apache Airflow UI. To learn more, see Accessing an Amazon MWAA environment. 3. Enable network access. You'll need to create a mechanism in your Amazon VPC to connect to the VPC endpoint (AWS PrivateLink) for your Apache Airflow Web server. For example, by creating a VPN tunnel from your computer using an AWS Client VPN. Setup for private and public access modes 86 Amazon Managed Workflows for Apache Airflow User Guide Accessing the VPC endpoint for your Apache Airflow Web server (private network access) If you've chosen the Private network option, you'll need to create a mechanism in your Amazon VPC to access the VPC endpoint (AWS PrivateLink) for your Apache Airflow Web server. We recommend using the same Amazon VPC, VPC security group, and private subnets as your Amazon MWAA environment for these resources. To learn more, see Managing access for VPC endpoints. Accessing
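When you build that access mechanism, you need to identify the web server's VPC endpoint service. The following Python (boto3) sketch looks it up from the environment details and lists the matching interface endpoints; the environment name is a placeholder, and the WebserverVpcEndpointService field appears in the get-environment output shown later in this guide.

import boto3

mwaa = boto3.client("mwaa")
ec2 = boto3.client("ec2")

# Find the VPC endpoint service used by the private Apache Airflow web server.
env = mwaa.get_environment(Name="MyAirflowEnvironment")["Environment"]
service_name = env["WebserverVpcEndpointService"]

# List the matching interface endpoints in your Amazon VPC.
endpoints = ec2.describe_vpc_endpoints(
    Filters=[{"Name": "service-name", "Values": [service_name]}]
)["VpcEndpoints"]
for endpoint in endpoints:
    print(endpoint["VpcEndpointId"], endpoint["NetworkInterfaceIds"])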
VPN tunnel from your computer using an AWS Client VPN. Setup for private and public access modes 86 Amazon Managed Workflows for Apache Airflow User Guide Accessing the VPC endpoint for your Apache Airflow Web server (private network access) If you've chosen the Private network option, you'll need to create a mechanism in your Amazon VPC to access the VPC endpoint (AWS PrivateLink) for your Apache Airflow Web server. We recommend using the same Amazon VPC, VPC security group, and private subnets as your Amazon MWAA environment for these resources. To learn more, see Managing access for VPC endpoints. Accessing the VPC endpoint for your Apache Airflow Web server (private network access) 87 Amazon Managed Workflows for Apache Airflow User Guide Accessing Apache Airflow Amazon MWAA let's you access your Apache Airflow environment using multiple methods: the Apache Airflow user interface (UI) console, the Apache Airflow CLI, and the Apache Airflow REST API. You can use the Amazon MWAA console to view and invoke a DAG in your Apache Airflow UI, or use Amazon MWAA APIs to get a token and invoke a DAG. This section describes the permissions needed to access the Apache Airflow UI, how to generate a token to make Amazon MWAA API calls directly in your command shell, and the supported commands in the Apache Airflow CLI. Topics • Prerequisites • Open the Apache Airflow UI • Logging into Apache Airflow • Create a Apache Airflow web server access token • Setting up a custom domain for the Apache Airflow web server • Creating an Apache Airflow CLI token • Using the Apache Airflow REST API • Apache Airflow CLI command reference Prerequisites The following section describes the preliminary steps required to use the commands and scripts in this section. Access • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy in Apache Airflow UI access policy: AmazonMWAAWebServerAccess. • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy Full API and console access policy: AmazonMWAAFullApiAccess. Prerequisites 88 Amazon Managed Workflows for Apache Airflow User Guide AWS CLI The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. Open the Apache Airflow UI The following image shows the link to your Apache Airflow UI on the Amazon MWAA console. Logging into Apache Airflow You need Apache Airflow UI access policy: AmazonMWAAWebServerAccess permissions for your AWS account in AWS Identity and Access Management (IAM) to view your Apache Airflow UI. To access your Apache Airflow UI 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Open Airflow UI. Create a Apache Airflow web server access token You can use the commands on this page to create a web server access token. An access token allows you access to your Amazon MWAA environment. For example, you can get a token, then deploy DAGs programmatically using Amazon MWAA APIs. The following section includes the steps to create an Apache Airflow web login token using the AWS CLI, a bash script, a POST API request, or a Python script. The token returned in the response is valid for 60 seconds. 
AWS CLI 89 Amazon Managed Workflows for Apache Airflow User Guide Contents • Prerequisites • Access • AWS CLI • Using the AWS CLI • Using a bash script • Using a Python script • What's next? Prerequisites The following section describes the preliminary steps required to use the commands and scripts on this page. Access • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy in Apache Airflow UI access policy: AmazonMWAAWebServerAccess. • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy Full API and console access policy: AmazonMWAAFullApiAccess. AWS CLI The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. Using the AWS CLI The following example uses the create-web-login-token command in the AWS CLI to create an Apache Airflow web login token. Prerequisites 90 Amazon Managed Workflows for Apache Airflow User Guide aws mwaa create-web-login-token --name YOUR_ENVIRONMENT_NAME Using a bash script The following example uses a bash script to call the create-web-login-token command in the AWS CLI to create an Apache Airflow web login token. 1. Copy
commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. Using the AWS CLI The following example uses the create-web-login-token command in the AWS CLI to create an Apache Airflow web login token. Prerequisites 90 Amazon Managed Workflows for Apache Airflow User Guide aws mwaa create-web-login-token --name YOUR_ENVIRONMENT_NAME Using a bash script The following example uses a bash script to call the create-web-login-token command in the AWS CLI to create an Apache Airflow web login token. 1. Copy the contents of the following code sample and save locally as get-web-token.sh. #!/bin/bash HOST=YOUR_HOST_NAME YOUR_URL=https://$HOST/aws_mwaa/aws-console-sso?login=true# WEB_TOKEN=$(aws mwaa create-web-login-token --name YOUR_ENVIRONMENT_NAME --query WebToken --output text) echo $YOUR_URL$WEB_TOKEN 2. Substitute the placeholders in red for YOUR_HOST_NAME and YOUR_ENVIRONMENT_NAME. For example, a host name for a public network may look like this (without the https://): 123456a0-0101-2020-9e11-1b159eec9000.c2.us-east-1.airflow.amazonaws.com 3. (optional) macOS and Linux users may need to run the following command to ensure the script is executable. chmod +x get-web-token.sh 4. Run the following script to get a web login token. ./get-web-token.sh 5. You should see the following in your command prompt: https://123456a0-0101-2020-9e11-1b159eec9000.c2.us-east-1.airflow.amazonaws.com/ aws_mwaa/aws-console-sso?login=true#{your-web-login-token} Using a Python script The following example uses the boto3 create_web_login_token method in a Python script to create an Apache Airflow web login token. You can run this script outside of Amazon MWAA. The only Using a bash script 91 Amazon Managed Workflows for Apache Airflow User Guide thing you need to do is install the boto3 library. You may want to create a virtual environment to install the library. It assumes you have configured AWS authentication credentials for your account. 1. Copy the contents of the following code sample and save locally as create-web-login- token.py. import boto3 mwaa = boto3.client('mwaa') response = mwaa.create_web_login_token( Name="YOUR_ENVIRONMENT_NAME" ) webServerHostName = response["WebServerHostname"] webToken = response["WebToken"] airflowUIUrl = 'https://{0}/aws_mwaa/aws-console-sso? login=true#{1}'.format(webServerHostName, webToken) print("Here is your Airflow UI URL: ") print(airflowUIUrl) 2. Substitute the placeholder in red for YOUR_ENVIRONMENT_NAME. 3. Run the following script to get a web login token. python3 create-web-login-token.py What's next? • Explore the Amazon MWAA API operation used to create a web login token at CreateWebLoginToken. Setting up a custom domain for the Apache Airflow web server Amazon Managed Workflows for Apache Airflow (Amazon MWAA) lets you to set up a custom domain for the managed Apache Airflow web server. Using a custom domain, you can access your environment's Amazon MWAA managed Apache Airflow web server using the Apache Airflow UI, the Apache Airflow CLI, or the Apache Airflow web server. Note You can only use custom domain with a private web server without internet access. What's next? 92 Amazon Managed Workflows for Apache Airflow User Guide Use cases for a custom domain on Amazon MWAA 1. 
Share the web server domain across your cloud application on AWS — Using a custom domain lets you define a user-friendly URL for accessing the web server, instead of the generated service domain name. You can store this custom domain and share it as an environment variable in your applications. 2. Access a private web server — If you want to configure access for a web server in a VPC with no internet access, using a custom domain simplifies the URL redirection work flow. Topics • Configure the custom domain • Set up the networking infrastructure Configure the custom domain To configure the custom domain feature, you need to provide the custom domain value via the webserver.base_url Apache Airflow configuration when creating or updating your Amazon MWAA environment. The following constraints apply to your custom domain name: • The value should be a fully qualified domain name (FQDN) without any protocol or path. For example, your-custom-domain.com. • Amazon MWAA does not allow a path in the URL. For example, your-custom-domain.com/ dags/ is not a valid custom domain name. • The URL length is limited to 255 ASCII characters. • If you provide an empty string, by default, the environment will be created with a web server URL generated by Amazon MWAA. The following example shows using the AWS CLI to create an environment with a custom web server domain name. $ aws mwaa create-environment \ --name my-mwaa-env \ --source-bucket-arn arn:aws:s3:::my-bucket \ --airflow-configuration-options '{"webserver.base_url":"my-custom-domain.com"}' \ --network-configuration '{"SubnetIds":["subnet-0123456789abcdef","subnet- fedcba9876543210"]}' \ Configure the custom domain 93 Amazon Managed Workflows for Apache Airflow User Guide --execution-role-arn arn:aws:iam::123456789012:role/my-execution-role After the environment is created or updated, you need to set up the networking infrastructure in your AWS account to access the private web server via the custom domain. To revert back to the default service-generated URL, update your private environment and remove the webserver.base_url configuration option. Set up the networking infrastructure Use the following steps to set up the required networking infrastructure to use with your custom domain in your AWS account.
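You can also set or change webserver.base_url on an existing private environment with Python (boto3), as the following sketch shows; the environment and domain names are placeholders. If you rely on other Apache Airflow configuration options, include them in the same call.

import boto3

mwaa = boto3.client("mwaa")

# Set the custom domain on an existing environment with a private web server.
mwaa.update_environment(
    Name="my-mwaa-env",
    AirflowConfigurationOptions={
        "webserver.base_url": "my-custom-domain.com",
    },
)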
my-mwaa-env \ --source-bucket-arn arn:aws:s3:::my-bucket \ --airflow-configuration-options '{"webserver.base_url":"my-custom-domain.com"}' \ --network-configuration '{"SubnetIds":["subnet-0123456789abcdef","subnet- fedcba9876543210"]}' \ Configure the custom domain 93 Amazon Managed Workflows for Apache Airflow User Guide --execution-role-arn arn:aws:iam::123456789012:role/my-execution-role After the environment is created or updated, you need to set up the networking infrastructure in your AWS account to access the private web server via the custom domain. To revert back to the default service-generated URL, update your private environment and remove the webserver.base_url configuration option. Set up the networking infrastructure Use the following steps to set up the required networking infrastructure to use with your custom domain in your AWS account. 1. Get the IP addresses for the Amazon VPC Endpoint Network Interfaces (ENI). To do this, first, use get-environment to find the WebserverVpcEndpointService for your environment. $ aws mwaa get-environment --name your-environment-name If successful, you'll see output similar to the following. { "Environment": { "AirflowConfigurationOptions": {}, "AirflowVersion": "latest-version", "Arn": "environment-arn", "CreatedAt": "2024-06-01T01:00:00-00:00", "DagS3Path": "dags", . . . "WebserverVpcEndpointService": "web-server-vpc-endpoint-service", "WeeklyMaintenanceWindowStart": "TUE:21:30" } } Note the WebserverVpcEndpointService value and use it for web-server-vpc- endpoint-service in the following Amazon EC2 describe-vpc-endpoints command. -- filters Name=service-name,Values=web-server-vpc-endpoint-service-id in the following command. Set up the networking infrastructure 94 Amazon Managed Workflows for Apache Airflow User Guide 2. Retrieve the Amazon VPC endpoint details. This command fetches details about Amazon VPC endpoints that match a specific service name, returning the endpoint ID and associated network interface IDs in a text format. $ aws ec2 describe-vpc-endpoints \ --filters Name=service-name,Values=web-server-vpc-endpoint-service \ --query 'VpcEndpoints[*]. {EndpointId:VpcEndpointId,NetworkInterfaceIds:NetworkInterfaceIds}' \ --output text 3. Get the network interface details. This command retrieves private IP addresses for each network interface associated with the Amazon VPC endpoints identified in the previous step. $ for eni_id in $( aws ec2 describe-vpc-endpoints \ --filters Name=service-name,Values=service-id \ --query 'VpcEndpoints[*].NetworkInterfaceIds' \ --output text ); do aws ec2 describe-network-interfaces \ --network-interface-ids $eni_id \ --query 'NetworkInterfaces[*].PrivateIpAddresses[*].PrivateIpAddress' \ --output text done 4. Use create-target-group to create a new target group. You will use this target group to register the IP addresses for your web server Amazon VPC endpoints. $ aws elbv2 create-target-group \ --name new-target-group-namne \ --protocol HTTPS \ --port 443 \ --vpc-id web-server-vpc-id \ --target-type ip \ --health-check-protocol HTTPS \ --health-check-port 443 \ --health-check-path / \ --health-check-enabled \ --matcher 'HttpCode="200,302"' Register the IP addresses using the register-targets command. Set up the networking infrastructure 95 Amazon Managed Workflows for Apache Airflow User Guide $ aws elbv2 register-targets \ --target-group-arn target-group-arn \ --targets Id=ip-address-1 Id=ip-address-2 5. Request an ACM certificate. Skip this step if you are using an existing certificate. 
$ aws acm request-certificate \ --domain-name my-custom-domain.com \ --validation-method DNS 6. Configure an Application Load Balancer. First, create the load balancer, then create a listener for the load balancer. Specify the ACM certificate you created in the previous step. $ aws elbv2 create-load-balancer \ --name my-mwaa-lb \ --type application \ --subnets subnet-id-1 subnet-id-2 $ aws elbv2 create-listener \ --load-balancer-arn load-balancer-arn \ --protocol HTTPS \ --port 443 \ --ssl-policy ELBSecurityPolicy-2016-08 \ --certificates CertificateArn=acm-certificate-arn \ --default-actions Type=forward,TargetGroupArn=target-group-arn If you use a Network Load Balancer in a private subnet, set up a bastion host or AWS VPN tunnel to access the web server. 7. Create a hosted zone using Route 53 for the domain. $ aws route53 create-hosted-zone --name my-custom-domain.com \ --caller-reference 1 Create an A record for the domain. To do this using the AWS CLI, get the hosted zone ID using list-hosted-zones-by-name then apply the record with change-resource-record- sets. $ HOSTED_ZONE_ID=$(aws route53 list-hosted-zones-by-name \ --dns-name my-custom-domain.com \ Set up the networking infrastructure 96 Amazon Managed Workflows for Apache Airflow User Guide --query 'HostedZones[0].Id' --output text) $ aws route53 change-resource-record-sets \ --hosted-zone-id $HOSTED_ZONE_ID \ --change-batch '{ "Changes": [ { "Action": "CREATE", "ResourceRecordSet": { "Name": "my-custom-domain.com", "Type": "A", "AliasTarget": { "HostedZoneId": "load-balancer-hosted-zone-id>", "DNSName": "load-balancer-dns-name", "EvaluateTargetHealth": true } } } ] }' 8. Update the security group rules for the web server Amazon VPC endpoint to follow the principle of least privilege by allowing HTTPS traffic only from the public subnets where the Application Load Balancer is located. Save the following JSON locally. For example, as sg-ingress-ip- permissions.json. [ { "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443, "UserIdGroupPairs": [ { "GroupId": "load-balancer-security-group-id" } ], "IpRanges": [ { "CidrIp": "public-subnet-1-cidr" }, { "CidrIp": "public-subnet-2-cidr" Set up the networking infrastructure 97 Amazon Managed Workflows for Apache Airflow User Guide } ] } ] Run the following Amazon EC2 command to update your ingress security group rules. Specify the JSON file for --ip-permissions. $ aws ec2 authorize-security-group-ingress \ --group-id <security-group-id> \ --ip-permissions file://sg-ingress-ip-permissions.json Run the following Amazon EC2 command to update your egress rules. $ aws ec2 authorize-security-group-egress \ --group-id webserver-vpc-endpoint-security-group-id \ --protocol tcp \ --port 443 \ --source-group load-balancer-security-group-id Open the Amazon MWAA console and navigate to the Apache Airflow UI. If you are setting up an Network Load Balancer in a private subnet instead of the Application Load Balancer used
Amazon Managed Workflows for Apache Airflow User Guide } ] } ] Run the following Amazon EC2 command to update your ingress security group rules. Specify the JSON file for --ip-permissions. $ aws ec2 authorize-security-group-ingress \ --group-id <security-group-id> \ --ip-permissions file://sg-ingress-ip-permissions.json Run the following Amazon EC2 command to update your egress rules. $ aws ec2 authorize-security-group-egress \ --group-id webserver-vpc-endpoint-security-group-id \ --protocol tcp \ --port 443 \ --source-group load-balancer-security-group-id Open the Amazon MWAA console and navigate to the Apache Airflow UI. If you are setting up an Network Load Balancer in a private subnet instead of the Application Load Balancer used here, you must access the web server with one of the following options. • the section called “Tutorial: Linux Bastion Host” • the section called “Tutorial: AWS Client VPN” Creating an Apache Airflow CLI token You can use the commands on this page to generate a CLI token, and then make Amazon Managed Workflows for Apache Airflow API calls directly in your command shell. For example, you can get a token, then deploy DAGs programmatically using Amazon MWAA APIs. The following section includes the steps to create an Apache Airflow CLI token using the AWS CLI, a curl script, a Python script, or a bash script. The token returned in the response is valid for 60 seconds. Apache Airflow CLI token 98 Amazon Managed Workflows for Apache Airflow User Guide Note The AWS CLI token is intended as a replacement for synchronous shell actions, not asynchronous API commands. As such, available concurrency is limited. To ensure that the web server remains responsive for users, it is recommended not to open a new AWS CLI request until the previous one completes successfully. Contents • Prerequisites • Access • AWS CLI • Using the AWS CLI • Using a curl script • Using a bash script • Using a Python script • What's next? Prerequisites The following section describes the preliminary steps required to use the commands and scripts on this page. Access • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy in Apache Airflow UI access policy: AmazonMWAAWebServerAccess. • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy Full API and console access policy: AmazonMWAAFullApiAccess. AWS CLI The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: Prerequisites 99 Amazon Managed Workflows for Apache Airflow • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. Using the AWS CLI User Guide The following example uses the create-cli-token command in the AWS CLI to create an Apache Airflow CLI token. aws mwaa create-cli-token --name YOUR_ENVIRONMENT_NAME Using a curl script The following example uses a curl script to call the create-web-login-token command in the AWS CLI to invoke the Apache Airflow CLI via an endpoint on the Apache Airflow web server. Apache Airflow v2 1. Copy the curl statement from your text file and paste it in your command shell. Note After copying it to your clipboard, you may need to use Edit > Paste from your shell menu. 
CLI_JSON=$(aws mwaa --region YOUR_REGION create-cli-token -- name YOUR_ENVIRONMENT_NAME) \ && CLI_TOKEN=$(echo $CLI_JSON | jq -r '.CliToken') \ && WEB_SERVER_HOSTNAME=$(echo $CLI_JSON | jq -r '.WebServerHostname') \ && CLI_RESULTS=$(curl --request POST "https://$WEB_SERVER_HOSTNAME/aws_mwaa/ cli" \ --header "Authorization: Bearer $CLI_TOKEN" \ --header "Content-Type: text/plain" \ --data-raw "dags trigger YOUR_DAG_NAME") \ && echo "Output:" \ && echo $CLI_RESULTS | jq -r '.stdout' | base64 --decode \ && echo "Errors:" \ && echo $CLI_RESULTS | jq -r '.stderr' | base64 --decode Using the AWS CLI 100 Amazon Managed Workflows for Apache Airflow User Guide 2. Substitute the placeholders for YOUR_REGION with the AWS region for your environment, YOUR_DAG_NAME, and YOUR_ENVIRONMENT_NAME. For example, a host name for a public network may look like this (without the https://): 123456a0-0101-2020-9e11-1b159eec9000.c2.us-east-1.airflow.amazonaws.com 3. You should see the following in your command prompt: { "stderr":"<STDERR of the CLI execution (if any), base64 encoded>", "stdout":"<STDOUT of the CLI execution, base64 encoded>" } Apache Airflow v1 1. Copy the cURL statement from your text file and paste it in your command shell. Note After copying it to your clipboard, you may need to use Edit > Paste from your shell menu. CLI_JSON=$(aws mwaa --region YOUR_REGION create-cli-token -- name YOUR_ENVIRONMENT_NAME) \ && CLI_TOKEN=$(echo $CLI_JSON | jq -r '.CliToken') \ && WEB_SERVER_HOSTNAME=$(echo $CLI_JSON | jq -r '.WebServerHostname') \ && CLI_RESULTS=$(curl --request POST "https://$WEB_SERVER_HOSTNAME/aws_mwaa/ cli" \ --header "Authorization: Bearer $CLI_TOKEN" \ --header "Content-Type: text/plain" \ --data-raw "trigger_dag YOUR_DAG_NAME") \ && echo "Output:" \ && echo $CLI_RESULTS | jq -r '.stdout' | base64 --decode \ && echo "Errors:" \ && echo $CLI_RESULTS | jq
statement from your text file and paste it in your command shell. Note After copying it to your clipboard, you may need to use Edit > Paste from your shell menu. CLI_JSON=$(aws mwaa --region YOUR_REGION create-cli-token -- name YOUR_ENVIRONMENT_NAME) \ && CLI_TOKEN=$(echo $CLI_JSON | jq -r '.CliToken') \ && WEB_SERVER_HOSTNAME=$(echo $CLI_JSON | jq -r '.WebServerHostname') \ && CLI_RESULTS=$(curl --request POST "https://$WEB_SERVER_HOSTNAME/aws_mwaa/ cli" \ --header "Authorization: Bearer $CLI_TOKEN" \ --header "Content-Type: text/plain" \ --data-raw "trigger_dag YOUR_DAG_NAME") \ && echo "Output:" \ && echo $CLI_RESULTS | jq -r '.stdout' | base64 --decode \ && echo "Errors:" \ && echo $CLI_RESULTS | jq -r '.stderr' | base64 --decode Using a curl script 101 Amazon Managed Workflows for Apache Airflow User Guide 2. Substitute the placeholders for YOUR_REGION with the AWS region for your environment, YOUR_DAG_NAME, and YOUR_HOST_NAME. For example, a host name for a public network may look like this (without the https://): 123456a0-0101-2020-9e11-1b159eec9000.c2.us-east-1.airflow.amazonaws.com 3. You should see the following in your command prompt: { "stderr":"<STDERR of the CLI execution (if any), base64 encoded>", "stdout":"<STDOUT of the CLI execution, base64 encoded>" } 4. Substitute the placeholders for YOUR_ENVIRONMENT_NAME and YOUR_DAG_NAME. Using a bash script The following example uses a bash script to call the create-cli-token command in the AWS CLI to create an Apache Airflow CLI token. Apache Airflow v2 1. Copy the contents of the following code sample and save locally as get-cli-token.sh. # brew install jq aws mwaa create-cli-token --name YOUR_ENVIRONMENT_NAME | export CLI_TOKEN=$(jq -r .CliToken) && curl --request POST "https://YOUR_HOST_NAME/aws_mwaa/cli" \ --header "Authorization: Bearer $CLI_TOKEN" \ --header "Content-Type: text/plain" \ --data-raw "dags trigger YOUR_DAG_NAME" 2. Substitute the placeholders in red for YOUR_ENVIRONMENT_NAME, YOUR_HOST_NAME, and YOUR_DAG_NAME. For example, a host name for a public network may look like this (without the https://): 123456a0-0101-2020-9e11-1b159eec9000.c2.us-east-1.airflow.amazonaws.com 3. (optional) macOS and Linux users may need to run the following command to ensure the script is executable. Using a bash script 102 Amazon Managed Workflows for Apache Airflow User Guide chmod +x get-cli-token.sh 4. Run the following script to create an Apache Airflow CLI token. ./get-cli-token.sh Apache Airflow v1 1. Copy the contents of the following code sample and save locally as get-cli-token.sh. # brew install jq aws mwaa create-cli-token --name YOUR_ENVIRONMENT_NAME | export CLI_TOKEN=$(jq -r .CliToken) && curl --request POST "https://YOUR_HOST_NAME/aws_mwaa/cli" \ --header "Authorization: Bearer $CLI_TOKEN" \ --header "Content-Type: text/plain" \ --data-raw "trigger_dag YOUR_DAG_NAME" 2. Substitute the placeholders in red for YOUR_ENVIRONMENT_NAME, YOUR_HOST_NAME, and YOUR_DAG_NAME. For example, a host name for a public network may look like this (without the https://): 123456a0-0101-2020-9e11-1b159eec9000.c2.us-east-1.airflow.amazonaws.com 3. (optional) macOS and Linux users may need to run the following command to ensure the script is executable. chmod +x get-cli-token.sh 4. Run the following script to create an Apache Airflow CLI token. ./get-cli-token.sh Using a Python script The following example uses the boto3 create_cli_token method in a Python script to create an Apache Airflow CLI token and trigger a DAG. 
You can run this script outside of Amazon MWAA. The only thing you need to do is install the boto3 library. You may want to create a virtual environment Using a Python script 103 Amazon Managed Workflows for Apache Airflow User Guide to install the library. It assumes you have configured AWS authentication credentials for your account. Apache Airflow v2 1. Copy the contents of the following code sample and save locally as create-cli- token.py. """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ import boto3 import json import requests import base64 mwaa_env_name = 'YOUR_ENVIRONMENT_NAME' dag_name = 'YOUR_DAG_NAME' mwaa_cli_command = 'dags trigger' client = boto3.client('mwaa') mwaa_cli_token = client.create_cli_token( Name=mwaa_env_name ) mwaa_auth_token = 'Bearer ' + mwaa_cli_token['CliToken'] mwaa_webserver_hostname = 'https://{0}/aws_mwaa/ cli'.format(mwaa_cli_token['WebServerHostname']) raw_data = '{0} {1}'.format(mwaa_cli_command, dag_name) Using a Python script 104 Amazon Managed Workflows for Apache Airflow User Guide mwaa_response = requests.post( mwaa_webserver_hostname, headers={ 'Authorization': mwaa_auth_token, 'Content-Type': 'text/plain' }, data=raw_data ) mwaa_std_err_message = base64.b64decode(mwaa_response.json() ['stderr']).decode('utf8') mwaa_std_out_message = base64.b64decode(mwaa_response.json() ['stdout']).decode('utf8') print(mwaa_response.status_code) print(mwaa_std_err_message) print(mwaa_std_out_message) 2. Substitute the placeholders for YOUR_ENVIRONMENT_NAME and YOUR_DAG_NAME. 3. Run the following script to create an Apache Airflow CLI token. python3
SOFTWARE. """ import boto3 import json import requests import base64 mwaa_env_name = 'YOUR_ENVIRONMENT_NAME' dag_name = 'YOUR_DAG_NAME' mwaa_cli_command = 'dags trigger' client = boto3.client('mwaa') mwaa_cli_token = client.create_cli_token( Name=mwaa_env_name ) mwaa_auth_token = 'Bearer ' + mwaa_cli_token['CliToken'] mwaa_webserver_hostname = 'https://{0}/aws_mwaa/ cli'.format(mwaa_cli_token['WebServerHostname']) raw_data = '{0} {1}'.format(mwaa_cli_command, dag_name) Using a Python script 104 Amazon Managed Workflows for Apache Airflow User Guide mwaa_response = requests.post( mwaa_webserver_hostname, headers={ 'Authorization': mwaa_auth_token, 'Content-Type': 'text/plain' }, data=raw_data ) mwaa_std_err_message = base64.b64decode(mwaa_response.json() ['stderr']).decode('utf8') mwaa_std_out_message = base64.b64decode(mwaa_response.json() ['stdout']).decode('utf8') print(mwaa_response.status_code) print(mwaa_std_err_message) print(mwaa_std_out_message) 2. Substitute the placeholders for YOUR_ENVIRONMENT_NAME and YOUR_DAG_NAME. 3. Run the following script to create an Apache Airflow CLI token. python3 create-cli-token.py Apache Airflow v1 1. Copy the contents of the following code sample and save locally as create-cli- token.py. import boto3 import json import requests import base64 mwaa_env_name = 'YOUR_ENVIRONMENT_NAME' dag_name = 'YOUR_DAG_NAME' mwaa_cli_command = 'trigger_dag' client = boto3.client('mwaa') mwaa_cli_token = client.create_cli_token( Using a Python script 105 Amazon Managed Workflows for Apache Airflow User Guide Name=mwaa_env_name ) mwaa_auth_token = 'Bearer ' + mwaa_cli_token['CliToken'] mwaa_webserver_hostname = 'https://{0}/aws_mwaa/ cli'.format(mwaa_cli_token['WebServerHostname']) raw_data = '{0} {1}'.format(mwaa_cli_command, dag_name) mwaa_response = requests.post( mwaa_webserver_hostname, headers={ 'Authorization': mwaa_auth_token, 'Content-Type': 'text/plain' }, data=raw_data ) mwaa_std_err_message = base64.b64decode(mwaa_response.json() ['stderr']).decode('utf8') mwaa_std_out_message = base64.b64decode(mwaa_response.json() ['stdout']).decode('utf8') print(mwaa_response.status_code) print(mwaa_std_err_message) print(mwaa_std_out_message) 2. Substitute the placeholders for YOUR_ENVIRONMENT_NAME and YOUR_DAG_NAME. 3. Run the following script to create an Apache Airflow CLI token. python3 create-cli-token.py What's next? • Explore the Amazon MWAA API operation used to create a CLI token at CreateCliToken. Using the Apache Airflow REST API Amazon Managed Workflows for Apache Airflow (Amazon MWAA) supports interacting with your Apache Airflow environments directly using the Apache Airflow REST API for environments running Apache Airflow v2.4.3 and above. This lets you access and manage your Amazon MWAA What's next? 106 Amazon Managed Workflows for Apache Airflow User Guide environments programmatically, providing a standardized way to invoke data orchestration workflows, manage your DAGs, and monitor the status of various Apache Airflow components such as the metadata database, triggerer, and scheduler. In order to support scalability while using the Apache Airflow REST API, Amazon MWAA provides you with the option to horizontally scale web server capacity to handle increased demand, whether from REST API requests, command line interface (CLI) usage, or more concurrent Apache Airflow user interface (UI) users. For more information on how Amazon MWAA scales web servers, see the section called “Configuring web server auto scaling”. 
You can use the Apache Airflow REST API to implement the following use-cases for your environments: • Programmatic access – You can now start Apache Airflow DAG runs, manage datasets, and retrieve the status of various components such as the metadata database, triggerers, and schedulers without relying on the Apache Airflow UI or CLI. • Integrate with external applications and microservices – REST API support allows you to build custom solutions that integrate your Amazon MWAA environments with other systems. For example, you can start workflows in response to events from external systems, such as completed database jobs or new user sign-ups. • Centralized monitoring – You can build monitoring dashboards that aggregate the status of your DAGs across multiple Amazon MWAA environments, enabling centralized monitoring and management. For more information about the Apache Airflow REST API, see The Apache Airflow REST API Reference. By using InvokeRestApi, you can access the Apache Airflow REST API using AWS credentials. Alternatively, you can also access it by obtaining a web server access token and then using the token to call it. Note • If you encounter an error with the message "Update your environment to use InvokeRestApi" while using the InvokeRestApi operation, it indicates that you need to update your Amazon MWAA environment. This error occurs when your Amazon MWAA environment is not compatible with the latest changes related to the InvokeRestApi Using the Apache Airflow REST API 107 Amazon Managed Workflows for Apache Airflow User Guide feature. To resolve this issue, update your Amazon MWAA environment to incorporate the necessary changes for the InvokeRestApi feature. • The InvokeRestApi operation has a default timeout duration of 10 seconds. If the operation does not complete within this 10-second timeframe, it will be automatically terminated, and an error will be raised. Ensure that your REST API calls are designed to complete within this timeout period to avoid encountering errors. The following examples show how you to make API calls to the Apache Airflow REST API and start a new DAG run: Topics • Granting access to the Apache Airflow REST API: airflow:InvokeRestApi • Calling the Apache Airflow REST API • Creating a web server session token and calling the Apache Airflow REST API Granting access to the Apache Airflow REST API: airflow:InvokeRestApi To access the Apache Airflow REST API using AWS credential, you must grant the airflow:InvokeRestApi permission in your IAM policy. In the following policy sample, specify the Admin, Op, User, Viewer or the Public
avoid encountering errors. The following examples show how you to make API calls to the Apache Airflow REST API and start a new DAG run: Topics • Granting access to the Apache Airflow REST API: airflow:InvokeRestApi • Calling the Apache Airflow REST API • Creating a web server session token and calling the Apache Airflow REST API Granting access to the Apache Airflow REST API: airflow:InvokeRestApi To access the Apache Airflow REST API using AWS credential, you must grant the airflow:InvokeRestApi permission in your IAM policy. In the following policy sample, specify the Admin, Op, User, Viewer or the Public role in {airflow-role} to customize the level of user access. For more information, see Default Roles in the Apache Airflow reference guide. { "Version": "2012-10-17", "Statement": [ { "Sid": "AllowMwaaRestApiAccess", "Effect": "Allow", "Action": "airflow:InvokeRestApi", "Resource": [ "arn:aws:airflow:{your-region}:YOUR_ACCOUNT_ID:role/{your-environment-name}/ {airflow-role}" ] } ] Granting access to the Apache Airflow REST API: airflow:InvokeRestApi 108 Amazon Managed Workflows for Apache Airflow User Guide } Note While configuring a private web server, the InvokeRestApi action cannot be invoked from outside of a Virtual Private Cloud (VPC). You can use the aws:SourceVpc key to apply more granular access control for this operation. For more information, see aws:SourceVpc. Calling the Apache Airflow REST API This following sample script covers how to use the Apache Airflow REST API to list the available DAGs in your environment and how to create an Apache Airflow variable: import boto3 env_name = "MyAirflowEnvironment" def list_dags(client): request_params = { "Name": env_name, "Path": "/dags", "Method": "GET", "QueryParameters": { "paused": False } } response = client.invoke_rest_api( **request_params ) print("Airflow REST API response: ", response['RestApiResponse']) def create_variable(client): request_params = { "Name": env_name, "Path": "/variables", "Method": "POST", "Body": { Calling the Apache Airflow REST API 109 Amazon Managed Workflows for Apache Airflow User Guide "key": "test-restapi-key", "value": "test-restapi-value", "description": "Test variable created by MWAA InvokeRestApi API", } } response = client.invoke_rest_api( **request_params ) print("Airflow REST API response: ", response['RestApiResponse']) if __name__ == "__main__": client = boto3.client("mwaa") list_dags(client) create_variable(client) Creating a web server session token and calling the Apache Airflow REST API To create a web server access token, use the following Python function. This function first calls the Amazon MWAA API to obtain a web login token. The web login token, which expires after 60 seconds, is then exchanged for a web session token, which lets you access the web server and use the Apache Airflow REST API. If you require more than 10 transactions per second (TPS) of throttling capacity, you can use this method to access the Apache Airflow REST API. Note The session token expires after 12 hours. 
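The get_session_info function below, and the trigger_dag example that follows it, are shown without their import statements. The following is a reasonable set of imports for running them together as a standalone script; it is an assumption based on the modules the snippets reference, not part of the original samples.

# Assumed imports for the session-token examples in this section.
import logging
import sys

import boto3
import requests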
def get_session_info(region, env_name): logging.basicConfig(level=logging.INFO) try: # Initialize MWAA client and request a web login token mwaa = boto3.client('mwaa', region_name=region) response = mwaa.create_web_login_token(Name=env_name) # Extract the web server hostname and login token web_server_host_name = response["WebServerHostname"] web_token = response["WebToken"] Creating a web server session token and calling the Apache Airflow REST API 110 Amazon Managed Workflows for Apache Airflow User Guide # Construct the URL needed for authentication login_url = f"https://{web_server_host_name}/aws_mwaa/login" login_payload = {"token": web_token} # Make a POST request to the MWAA login url using the login payload response = requests.post( login_url, data=login_payload, timeout=10 ) # Check if login was succesfull if response.status_code == 200: # Return the hostname and the session cookie return ( web_server_host_name, response.cookies["session"] ) else: # Log an error logging.error("Failed to log in: HTTP %d", response.status_code) return None except requests.RequestException as e: # Log any exceptions raised during the request to the MWAA login endpoint logging.error("Request failed: %s", str(e)) return None except Exception as e: # Log any other unexpected exceptions logging.error("An unexpected error occurred: %s", str(e)) return None Once authentication is complete, you have the credentials to start sending requests to the API endpoints. In the example below, use the endpoint dags/{dag_id}/dagRuns. def trigger_dag(region, env_name, dag_name): """ Triggers a DAG in a specified MWAA environment using the Airflow REST API. Args: region (str): AWS region where the MWAA environment is hosted. Creating a web server session token and calling the Apache Airflow REST API 111 Amazon Managed Workflows for Apache Airflow User Guide env_name (str): Name of the MWAA environment. dag_name (str): Name of the DAG to trigger. """ logging.info(f"Attempting to trigger DAG {dag_name} in environment {env_name} at region {region}") # Retrieve the web server hostname and session cookie for authentication try: web_server_host_name, session_cookie = get_session_info(region, env_name) if not session_cookie: logging.error("Authentication failed, no session cookie retrieved.") return except Exception as e: logging.error(f"Error retrieving session info: {str(e)}") return # Prepare headers and payload for the request cookies = {"session": session_cookie} json_body = {"conf": {}} # Construct the URL for triggering the DAG url = f"https://{web_server_host_name}/api/v1/dags/{dag_id}/dagRuns" # Send the POST request to trigger the DAG try: response = requests.post(url, cookies=cookies, json=json_body) # Check the response status code to determine if the DAG was triggered
DAG {dag_name} in environment {env_name} at region {region}") # Retrieve the web server hostname and session cookie for authentication try: web_server_host_name, session_cookie = get_session_info(region, env_name) if not session_cookie: logging.error("Authentication failed, no session cookie retrieved.") return except Exception as e: logging.error(f"Error retrieving session info: {str(e)}") return # Prepare headers and payload for the request cookies = {"session": session_cookie} json_body = {"conf": {}} # Construct the URL for triggering the DAG url = f"https://{web_server_host_name}/api/v1/dags/{dag_id}/dagRuns" # Send the POST request to trigger the DAG try: response = requests.post(url, cookies=cookies, json=json_body) # Check the response status code to determine if the DAG was triggered successfully if response.status_code == 200: logging.info("DAG triggered successfully.") else: logging.error(f"Failed to trigger DAG: HTTP {response.status_code} - {response.text}") except requests.RequestException as e: logging.error(f"Request to trigger DAG failed: {str(e)}") if __name__ == "__main__": logging.basicConfig(level=logging.INFO) # Check if the correct number of arguments is provided if len(sys.argv) != 4: logging.error("Incorrect usage. Proper format: python script_name.py {region} {env_name} {dag_name}") Creating a web server session token and calling the Apache Airflow REST API 112 Amazon Managed Workflows for Apache Airflow User Guide sys.exit(1) region = sys.argv[1] env_name = sys.argv[2] dag_name = sys.argv[3] # Trigger the DAG with the provided arguments trigger_dag(region, env_name, dag_name) Apache Airflow CLI command reference This topic describes the supported and unsupported Apache Airflow CLI commands on Amazon Managed Workflows for Apache Airflow. Contents • Prerequisites • Access • AWS CLI • What's changed in v2 • Supported CLI commands • Supported commands • Using commands that parse DAGs • Sample code • Set, get or delete an Apache Airflow v2 variable • Add a configuration when triggering a DAG • Run CLI commands on an SSH tunnel to a bastion host • Samples in GitHub and AWS tutorials Prerequisites The following section describes the preliminary steps required to use the commands and scripts on this page. Apache Airflow CLI command reference 113 Amazon Managed Workflows for Apache Airflow User Guide Access • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy in Apache Airflow UI access policy: AmazonMWAAWebServerAccess. • AWS account access in AWS Identity and Access Management (IAM) to the Amazon MWAA permissions policy Full API and console access policy: AmazonMWAAFullApiAccess. AWS CLI The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. What's changed in v2 • New: Airflow CLI command structure. The Apache Airflow v2 CLI is organized so that related commands are grouped together as subcommands, which means you need to update Apache Airflow v1 scripts if you want to upgrade to Apache Airflow v2. For example, unpause in Apache Airflow v1 is now dags unpause in Apache Airflow v2. To learn more, see Airflow CLI changes in 2 in the Apache Airflow reference guide. Supported CLI commands The following section lists the Apache Airflow CLI commands available on Amazon MWAA. 
Supported commands Apache Airflow v2 Minor versions Command v2.0+ v2.0+ cheat-sheet connections add What's changed in v2 114 Amazon Managed Workflows for Apache Airflow User Guide Minor versions Command v2.0+ connections delete v2.2+ (note) dags backfill v2.0+ dags delete v2.2+ (note) dags list v2.0+ v2.6+ v2.2+ (note) v2.2+ (note) v2.0+ v2.0+ v2.4+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ v2.4+ v2.0+ v2.0+ v2.0+ dags list-jobs dags list-import-errors dags list-runs dags next-execution dags pause dags report dags reserialize dags show dags state dags test dags trigger dags unpause db clean providers behaviours providers get providers hooks Supported CLI commands 115 Amazon Managed Workflows for Apache Airflow User Guide Minor versions Command v2.0+ v2.0+ v2.8+ v2.6+ v2.7+ v2.0+ v2.6+ v2.6+ v2.6+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ v2.0+ providers links providers list providers notifications providers secrets providers triggerer providers widgets roles add-perms roles del-perms roles create roles list tasks clear tasks failed-deps tasks list tasks render tasks state tasks states-for-dag-run tasks test variables delete variables get variables set Supported CLI commands 116 Amazon Managed Workflows for Apache Airflow User Guide Minor versions Command v2.0+ v2.0+ variables list version Using commands that parse DAGs If your environment is running Apache Airflow v1.10.12 or v2.0.2, CLI commands that parse DAGs will fail if the DAG uses plugins that depend on packages installed through a requirements.txt: Apache Airflow v2.0.2 • dags backfill • dags list • dags list-runs • dags next-execution You can use these CLI commands if your DAGs do not use plugins that depend on packages installed through a requirements.txt. Sample code The following section contains examples of different ways to use the Apache Airflow CLI. Set, get or delete an Apache Airflow v2 variable You can
list version Using commands that parse DAGs If your environment is running Apache Airflow v1.10.12 or v2.0.2, CLI commands that parse DAGs will fail if the DAG uses plugins that depend on packages installed through a requirements.txt: Apache Airflow v2.0.2 • dags backfill • dags list • dags list-runs • dags next-execution You can use these CLI commands if your DAGs do not use plugins that depend on packages installed through a requirements.txt. Sample code The following section contains examples of different ways to use the Apache Airflow CLI. Set, get or delete an Apache Airflow v2 variable You can use the following sample code to set, get or delete a variable in the format of <script> <mwaa env name> get | set | delete <variable> <variable value> </variable> </variable>. [ $# -eq 0 ] && echo "Usage: $0 MWAA environment name " && exit if [[ $2 == "" ]]; then dag="variables list" elif [ $2 == "get" ] || [ $2 == "delete" ] || [ $2 == "set" ]; then dag="variables $2 $3 $4 $5" Sample code 117 Amazon Managed Workflows for Apache Airflow User Guide else echo "Not a valid command" exit 1 fi CLI_JSON=$(aws mwaa --region $AWS_REGION create-cli-token --name $1) \ && CLI_TOKEN=$(echo $CLI_JSON | jq -r '.CliToken') \ && WEB_SERVER_HOSTNAME=$(echo $CLI_JSON | jq -r '.WebServerHostname') \ && CLI_RESULTS=$(curl --request POST "https://$WEB_SERVER_HOSTNAME/aws_mwaa/cli" \ --header "Authorization: Bearer $CLI_TOKEN" \ --header "Content-Type: text/plain" \ --data-raw "$dag" ) \ && echo "Output:" \ && echo $CLI_RESULTS | jq -r '.stdout' | base64 --decode \ && echo "Errors:" \ && echo $CLI_RESULTS | jq -r '.stderr' | base64 --decode Add a configuration when triggering a DAG You can use the following sample code with Apache Airflow v1 and Apache Airflow v2 to add a configuration when triggering a DAG, such as airflow trigger_dag 'dag_name' —conf '{"key":"value"}'. import boto3 import json import requests import base64 mwaa_env_name = 'YOUR_ENVIRONMENT_NAME' dag_name = 'YOUR_DAG_NAME' key = "YOUR_KEY" value = "YOUR_VALUE" conf = "{\"" + key + "\":\"" + value + "\"}" client = boto3.client('mwaa') mwaa_cli_token = client.create_cli_token( Name=mwaa_env_name ) mwaa_auth_token = 'Bearer ' + mwaa_cli_token['CliToken'] mwaa_webserver_hostname = 'https://{0}/aws_mwaa/ cli'.format(mwaa_cli_token['WebServerHostname']) Sample code 118 Amazon Managed Workflows for Apache Airflow User Guide raw_data = "trigger_dag {0} -c '{1}'".format(dag_name, conf) mwaa_response = requests.post( mwaa_webserver_hostname, headers={ 'Authorization': mwaa_auth_token, 'Content-Type': 'text/plain' }, data=raw_data ) mwaa_std_err_message = base64.b64decode(mwaa_response.json()['stderr']).decode('utf8') mwaa_std_out_message = base64.b64decode(mwaa_response.json()['stdout']).decode('utf8') print(mwaa_response.status_code) print(mwaa_std_err_message) print(mwaa_std_out_message) Run CLI commands on an SSH tunnel to a bastion host The following example shows how to run Airflow CLI commands using an SSH tunnel proxy to a Linux Bastion Host. Using curl 1. 2. 
ssh -D 8080 -f -C -q -N YOUR_USER@YOUR_BASTION_HOST curl -x socks5h://0:8080 --request POST https://YOUR_HOST_NAME/aws_mwaa/cli -- header YOUR_HEADERS --data-raw YOUR_CLI_COMMAND Samples in GitHub and AWS tutorials • Working with Apache Airflow v2.0.2 parameters and variables in Amazon Managed Workflows for Apache Airflow • Interacting with Apache Airflow v1.10.12 on Amazon MWAA via the command line • Interactive Commands with Apache Airflow v1.10.12 on Amazon MWAA and Bash Operator on GitHub Sample code 119 Amazon Managed Workflows for Apache Airflow User Guide Managing connections to Apache Airflow This chapter describes how to configure an Apache Airflow connection for an Amazon Managed Workflows for Apache Airflow environment. Topics • Overview of Apache Airflow variables and connections • Apache Airflow provider packages installed on Amazon MWAA environments • Overview of connection types • Configuring an Apache Airflow connection using a AWS Secrets Manager secret Overview of Apache Airflow variables and connections In some cases, you may want to specify additional connections or variables for an environment, such as an AWS profile, or to add your execution role in a connection object in the Apache Airflow metastore, then refer to the connection from within a DAG. • Self-managed Apache Airflow. On a self-managed Apache Airflow installation, you set Apache Airflow configuration options in airflow.cfg. [secrets] backend = airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend backend_kwargs = {"connections_prefix" : "airflow/connections", "variables_prefix" : "airflow/variables"} • Apache Airflow on Amazon MWAA. On Amazon MWAA, you need to add these configuration settings as Apache Airflow configuration options on the Amazon MWAA console. Apache Airflow configuration options are written as environment variables to your environment and override all other existing configurations for the same setting. Apache Airflow provider packages installed on Amazon MWAA environments Amazon MWAA installs provider extras for Apache Airflow v2 and above connection types when you create a new environment. Installing provider packages allows you to view a connection type Overview 120 Amazon Managed Workflows for Apache Airflow User Guide in the Apache Airflow UI. It also means you don't need to specify these packages as a Python dependency in your requirements.txt file. This page lists the Apache Airflow provider packages installed by Amazon MWAA for all Apache Airflow v2 environments. Note For Apache Airflow v2 and above, Amazon MWAA installs Watchtower version
Airflow provider packages installed on Amazon MWAA environments Amazon MWAA installs provider extras for Apache Airflow v2 and above connection types when you create a new environment. Installing provider packages allows you to view a connection type Overview 120 Amazon Managed Workflows for Apache Airflow User Guide in the Apache Airflow UI. It also means you don't need to specify these packages as a Python dependency in your requirements.txt file. This page lists the Apache Airflow provider packages installed by Amazon MWAA for all Apache Airflow v2 environments. Note For Apache Airflow v2 and above, Amazon MWAA installs Watchtower version 2.0.1 after perfming pip3 install -r requirements.txt, to ensure compatibility with CloudWatch logging is not overridden by other Python library installations. Contents • Provider packages for Apache Airflow v2.10.1 connections • Provider packages for Apache Airflow v2.9.2 connections • Provider packages for Apache Airflow v2.8.1 connections • Provider packages for Apache Airflow v2.7.2 connections • Provider packages for Apache Airflow v2.6.3 connections • Provider packages for Apache Airflow v2.5.1 connections • Provider packages for Apache Airflow v2.4.3 connections • Provider packages for Apache Airflow v2.2.2 connections • Provider packages for Apache Airflow v2.0.2 connections • Specifying newer provider packages Provider packages for Apache Airflow v2.10.1 connections When you create an Amazon MWAA environment in Apache Airflow v2.10.1, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Note You can specify the latest supported version of apache-airflow-providers-amazon to upgrade this provider. For more information on specifying newer versions, see the section called “Specifying newer provider packages”. Provider packages for Apache Airflow v2.10.1 connections 121 Amazon Managed Workflows for Apache Airflow User Guide Connection type AWS Connection Package apache-airflow-providers-amazon[aiob otocore]==8.28.0 Postgres Connection apache-airflow-providers-postgres==5.12.0 FTP Connection Fab Connection apache-airflow-providers-ftp==3.11.0 apache-airflow-providers-fab==1.3.0 Celery Connection apache-airflow-providers-celery==3.8.1 HTTP Connection IMAP Connection Common SQL SQLite Connection SMTP Connection apache-airflow-providers-http==4.13.0 apache-airflow-providers-imap==3.7.0 apache-airflow-providers-common-sql= =1.16.0 apache-airflow-providers-sqlite==3.9.0 apache-airflow-providers-smtp==1.8.0 Provider packages for Apache Airflow v2.9.2 connections When you create an Amazon MWAA environment in Apache Airflow v2.9.2, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Note You can specify the latest supported version of apache-airflow-providers-amazon to upgrade this provider. For more information on specifying newer versions, see the section called “Specifying newer provider packages”. 
Provider packages for Apache Airflow v2.9.2 connections 122 Amazon Managed Workflows for Apache Airflow User Guide Connection type AWS Connection Package apache-airflow-providers-amazon[aiob otocore]==8.24.0 Postgres Connection apache-airflow-providers-postgres==5.11.1 FTP Connection Fab Connection apache-airflow-providers-ftp==3.9.1 apache-airflow-providers-fab==1.1.1 Celery Connection apache-airflow-providers-celery==3.7.2 HTTP Connection IMAP Connection Common SQL SQLite Connection SMTP Connection apache-airflow-providers-http==4.11.1 apache-airflow-providers-imap==3.6.1 apache-airflow-providers-common-sql= =1.14.0 apache-airflow-providers-sqlite==3.8.1 apache-airflow-providers-smtp==1.7.1 Provider packages for Apache Airflow v2.8.1 connections When you create an Amazon MWAA environment in Apache Airflow v2.8.1, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Note You can specify the latest supported version of apache-airflow-providers-amazon to upgrade this provider. For more information on specifying newer versions, see the section called “Specifying newer provider packages”. Provider packages for Apache Airflow v2.8.1 connections 123 Amazon Managed Workflows for Apache Airflow User Guide Connection type AWS Connection Package apache-airflow-providers-amazon[aiob otocore]==8.16.0 Postgres Connection apache-airflow-providers-postgres==5.10.0 FTP Connection Celery Connection HTTP Connection IMAP Connection Common SQL apache-airflow-providers-ftp==3.7.0 apache-airflow-providers-celery==3.5.1 apache-airflow-providers-http==4.8.0 apache-airflow-providers-imap==3.5.0 apache-airflow-providers-common-sql= =1.10.0 SQLite Connection apache-airflow-providers-sqlite==3.7.0 Provider packages for Apache Airflow v2.7.2 connections When you create an Amazon MWAA environment in Apache Airflow v2.7.2, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Note You can specify the latest supported version of apache-airflow-providers-amazon to upgrade this provider. For more information on specifying newer versions, see the section called “Specifying newer provider packages”. Connection type AWS Connection Package apache-airflow-providers-amazon[aiob otocore]==8.7.1 Postgres Connection apache-airflow-providers-postgres==5.6.1 Provider packages for Apache Airflow v2.7.2 connections 124 Amazon Managed Workflows for Apache Airflow User Guide Connection type FTP Connection Package apache-airflow-providers-ftp==3.5.2 Celery Connection apache-airflow-providers-celery==3.3.4 HTTP Connection IMAP Connection Common SQL apache-airflow-providers-http==4.5.2 apache-airflow-providers-imap==3.3.2 apache-airflow-providers-common-sql==1.7.2 SQLite Connection apache-airflow-providers-sqlite==3.4.3 Provider packages for Apache Airflow v2.6.3 connections When you create an Amazon MWAA environment in Apache Airflow v2.6.3, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Note You can specify the latest supported version of apache-airflow-providers-amazon to upgrade this provider. For more information on specifying newer versions, see the section called “Specifying newer provider packages”. 
Connection type AWS Connection Package apache-airflow-providers-amazon[aiob otocore]==8.2.0 Postgres Connection apache-airflow-providers-postgres==5.5.1 FTP Connection Celery Connection HTTP Connection apache-airflow-providers-ftp==3.4.2 apache-airflow-providers-celery==3.2.1 apache-airflow-providers-http==4.4.2 Provider packages for Apache Airflow v2.6.3 connections 125 Amazon Managed Workflows for Apache Airflow User Guide Connection type IMAP Connection Common SQL Package apache-airflow-providers-imap==3.2.2 apache-airflow-providers-common-sql==1.5.2 SQLite Connection apache-airflow-providers-sqlite==3.4.2 Provider packages for Apache Airflow v2.5.1 connections When you create an Amazon MWAA environment in Apache Airflow v2.5.1, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Note You can specify the latest supported version of apache-airflow-providers-amazon to upgrade this provider. For more information on specifying newer versions, see the section called “Specifying newer provider packages”. Connection type AWS Connection Package apache-airflow-providers-amazon==7.1.0
Celery Connection HTTP Connection apache-airflow-providers-ftp==3.4.2 apache-airflow-providers-celery==3.2.1 apache-airflow-providers-http==4.4.2 Provider packages for Apache Airflow v2.6.3 connections 125 Amazon Managed Workflows for Apache Airflow User Guide Connection type IMAP Connection Common SQL Package apache-airflow-providers-imap==3.2.2 apache-airflow-providers-common-sql==1.5.2 SQLite Connection apache-airflow-providers-sqlite==3.4.2 Provider packages for Apache Airflow v2.5.1 connections When you create an Amazon MWAA environment in Apache Airflow v2.5.1, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Note You can specify the latest supported version of apache-airflow-providers-amazon to upgrade this provider. For more information on specifying newer versions, see the section called “Specifying newer provider packages”. Connection type AWS Connection Package apache-airflow-providers-amazon==7.1.0 Postgres Connection apache-airflow-providers-postgres==5.4.0 FTP Connection Celery Connection HTTP Connection IMAP Connection Common SQL apache-airflow-providers-ftp==3.3.0 apache-airflow-providers-celery==3.1.0 apache-airflow-providers-http==4.1.1 apache-airflow-providers-imap==3.1.1 apache-airflow-providers-common-sql==1.3.3 SQLite Connection apache-airflow-providers-sqlite==3.3.1 Provider packages for Apache Airflow v2.5.1 connections 126 Amazon Managed Workflows for Apache Airflow User Guide Provider packages for Apache Airflow v2.4.3 connections When you create an Amazon MWAA environment in Apache Airflow v2.4.3, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Connection type AWS Connection Package apache-airflow-providers-amazon==6.0.0 Postgres Connection apache-airflow-providers-postgres==5.2.2 FTP Connection Celery Connection HTTP Connection IMAP Connection Common SQL apache-airflow-providers-ftp==3.1.0 apache-airflow-providers-celery==3.0.0 apache-airflow-providers-http==4.0.0 apache-airflow-providers-imap==3.0.0 apache-airflow-providers-common-sql==1.2.0 SQLite Connection apache-airflow-providers-sqlite==3.2.1 Provider packages for Apache Airflow v2.2.2 connections When you create an Amazon MWAA environment in Apache Airflow v2.2.2, Amazon MWAA installs the following provider packages used for Apache Airflow connections. Connection type AWS Connection Package apache-airflow-providers-amazon==2.4.0 Postgres Connection apache-airflow-providers-postgres==2.3.0 FTP Connection Celery Connection HTTP Connection apache-airflow-providers-ftp==2.0.1 apache-airflow-providers-celery==2.1.0 apache-airflow-providers-http==2.0.1 Provider packages for Apache Airflow v2.4.3 connections 127 Amazon Managed Workflows for Apache Airflow User Guide Connection type IMAP Connection Package apache-airflow-providers-imap==2.0.1 SQLite Connection apache-airflow-providers-sqlite==2.0.1 Provider packages for Apache Airflow v2.0.2 connections When you create an Amazon MWAA environment in Apache Airflow v2.0.2, Amazon MWAA installs the following provider packages used for Apache Airflow connections. 
Connection type Package Tableau Connection apache-airflow-providers-tableau==1.0.0 Databricks Connection apache-airflow-providers-databricks==1.0.1 SSH Connection apache-airflow-providers-ssh==1.3.0 Postgres Connection apache-airflow-providers-postgres==1.0.2 Docker Connection apache-airflow-providers-docker==1.2.0 Oracle Connection Presto Connection SFTP Connection apache-airflow-providers-oracle==1.1.0 apache-airflow-providers-presto==1.0.2 apache-airflow-providers-sftp==1.2.0 Specifying newer provider packages Beginning with Apache Airflow v2.7.2, your requirements file must include a --constraint statement. If you do not provide a constraint, Amazon MWAA will specify one for you to ensure the packages listed in your requirements are compatible with the version of Apache Airflow you are using. Apache Airflow constraints files specify the provider versions available at the time of a Apache Airflow release. In many cases, however, newer providers are compatible with that version of Provider packages for Apache Airflow v2.0.2 connections 128 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow. Because you must use constraints, to specify a newer version of a provider package, you can modify the constraints file for a specific provider version: 1. Download the version-specific constraints file from https://raw.githubusercontent.com/ apache/airflow/constraints-2.7.2/constraints-3.11.txt" 2. Modify the apache-airflow-providers-amazon version in the constraints file to the version you want to use. 3. Save the modified constraints file to the Amazon S3 dags folder of your Amazon MWAA environment, for example, as constraints-3.11-updated.txt 4. Specify your requirements as shown in the following. --constraint "/usr/local/airflow/dags/constraints-3.11-updated.txt" apache-airflow-providers-amazon==version-number Note If you are using a private web server, we recommend you package the required libraries as WHL files by using the Amazon MWAA local-runner. Overview of connection types Apache Airflow stores connections as a connection URI string. It provides a connections template in the Apache Airflow UI to generate the connection URI string, regardless of the connection type. If a connection template is not available in the Apache Airflow UI, an alternate connection template can be used to generate this connection URI string, such as using the HTTP connection template. The primary difference is the URI prefix, such as my-conn-type://, which Apache Airflow providers typically ignore for a connection. This page describes how to use connection templates in the Apache Airflow UI interchangeably for different connection types. Warning Do not overwrite the aws_default connection in Amazon MWAA. Amazon MWAA uses this connection to perform a variety of critical tasks, such as collecting task logs. Connection types 129 Amazon Managed Workflows for Apache Airflow User Guide Overwriting this connection might result in data loss and disruptions to your environment availability. Topics • Example connection URI string • Example connection template • Example using an HTTP connection template for a Jdbc connection Example connection URI string The following example shows a connection URI string for the MySQL connection type. 
'mysql://288888a0-50a0-888-9a88-1a111aaa0000.a1.us-east-1.airflow.amazonaws.com %2Fhome?role_arn=arn%3Aaws%3Aiam%3A%3A001122332255%3Arole%2Fservice-role%2FAmazonMWAA- MyAirflowEnvironment-iAaaaA&region_name=us-east-1' Example connection template The following example shows the HTTP connection template in the Apache Airflow UI. Apache Airflow v2 The following example shows the HTTP connection template for Apache Airflow v2 in the Apache Airflow UI. Example connection URI string 130 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow v1 The following example shows the HTTP connection template for Apache Airflow v1 in the Apache Airflow UI. Example connection template 131 Amazon Managed
template for a Jdbc connection Example connection URI string The following example shows a connection URI string for the MySQL connection type. 'mysql://288888a0-50a0-888-9a88-1a111aaa0000.a1.us-east-1.airflow.amazonaws.com %2Fhome?role_arn=arn%3Aaws%3Aiam%3A%3A001122332255%3Arole%2Fservice-role%2FAmazonMWAA- MyAirflowEnvironment-iAaaaA&region_name=us-east-1' Example connection template The following example shows the HTTP connection template in the Apache Airflow UI. Apache Airflow v2 The following example shows the HTTP connection template for Apache Airflow v2 in the Apache Airflow UI. Example connection URI string 130 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow v1 The following example shows the HTTP connection template for Apache Airflow v1 in the Apache Airflow UI. Example connection template 131 Amazon Managed Workflows for Apache Airflow User Guide Example using an HTTP connection template for a Jdbc connection The following example shows how to use the HTTP connection template for a Jdbc connection type in Apache Airflow v2.0.2, and the same values in the Jdbc connection template for Apache Airflow v1.10.12 in the Apache Airflow UI. Apache Airflow v2 The following example shows the connection URI string generated by Apache Airflow for the example in this section. http://myconnectionurl/some/path&login=mylogin&extra__jdbc__dry__path=usr/local/ airflow/dags/classpath/redshif- jdbc42-2.0.0.1.jar&extra__jdbc__dry__clsname=redshift-jdbc42-2.0.0.1 The following example shows how to use the HTTP connection template for a Jdbc connection for Apache Airflow v2 in the Apache Airflow UI. Example using an HTTP connection template for a Jdbc connection 132 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow v1 The following example shows the connection URI string generated by Apache Airflow for the example in this section. jdbc://myconnectionurl/some/path&login=mylogin&extra__jdbc__dry__path=usr/local/ airflow/dags/classpath/redshif- jdbc42-2.0.0.1.jar&extra__jdbc__dry__clsname=redshift-jdbc42-2.0.0.1 The following example shows the Jdbc connection template for Apache Airflow v1.10.12 in the Apache Airflow UI. Example using an HTTP connection template for a Jdbc connection 133 Amazon Managed Workflows for Apache Airflow User Guide Configuring an Apache Airflow connection using a AWS Secrets Manager secret AWS Secrets Manager is a supported alternative Apache Airflow backend on an Amazon Managed Workflows for Apache Airflow environment. This topic shows how to use AWS Secrets Manager to securely store secrets for Apache Airflow variables and an Apache Airflow connection on Amazon Managed Workflows for Apache Airflow. Note • You will be charged for the secrets you create. For more information on Secrets Manager pricing, see AWS Pricing. • AWS Systems Manager Parameter Store is also supported as a secrets backend in Amazon MWAA. For more information, see Amazon Provider Package documentation. Contents Configuring Secrets Manager 134 Amazon Managed Workflows for Apache Airflow User Guide • Step one: Provide Amazon MWAA with permission to access Secrets Manager secret keys • Step two: Create the Secrets Manager backend as an Apache Airflow configuration option • Step three: Generate an Apache Airflow AWS connection URI string • Step four: Add the variables in Secrets Manager • Step five: Add the connection in Secrets Manager • Sample code • Resources • What's next? 
Step one: Provide Amazon MWAA with permission to access Secrets Manager secret keys The execution role for your Amazon MWAA environment needs read access to the secret key in AWS Secrets Manager. The following IAM policy allows read-write access using the AWS managed SecretsManagerReadWrite policy. To attach the policy to your execution role 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose your execution role on the Permissions pane. 4. Choose Attach policies. 5. Type SecretsManagerReadWrite in the Filter policies text field. 6. Choose Attach policy. If you do not want to use an AWS managed permission policy, you can directly update your environment's execution role to allow any level of access to your Secrets Manager resources. For example, the following policy statement grants read access to all secrets you create in a specific AWS Region in Secrets Manager. { "Version": "2012-10-17", "Statement": [ Step one: Provide Amazon MWAA with permission to access Secrets Manager secret keys 135 Amazon Managed Workflows for Apache Airflow User Guide { "Effect": "Allow", "Action": [ "secretsmanager:GetResourcePolicy", "secretsmanager:GetSecretValue", "secretsmanager:DescribeSecret", "secretsmanager:ListSecretVersionIds" ], "Resource": "arn:aws:secretsmanager:us-west-2:012345678910:secret:*" }, { "Effect": "Allow", "Action": "secretsmanager:ListSecrets", "Resource": "*" } ] } Step two: Create the Secrets Manager backend as an Apache Airflow configuration option The following section describes how to create an Apache Airflow configuration option on the Amazon MWAA console for the AWS Secrets Manager backend. If you're using a configuration setting of the same name in airflow.cfg, the configuration you create in the following steps will take precedence and override the configuration settings. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit. 4. Choose Next. 5. Choose Add custom configuration in the Airflow configuration options pane. Add the following key-value pairs: a. secrets.backend: airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend b. secrets.backend_kwargs: {"connections_prefix" : "airflow/ connections", "variables_prefix" : "airflow/variables"} This configures Apache Airflow to look for connection strings and variables at airflow/connections/* and airflow/variables/* paths. Step two: Create the Secrets Manager backend as an Apache Airflow configuration option
setting of the same name in airflow.cfg, the configuration you create in the following steps will take precedence and override the configuration settings. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit. 4. Choose Next. 5. Choose Add custom configuration in the Airflow configuration options pane. Add the following key-value pairs: a. secrets.backend: airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend b. secrets.backend_kwargs: {"connections_prefix" : "airflow/ connections", "variables_prefix" : "airflow/variables"} This configures Apache Airflow to look for connection strings and variables at airflow/connections/* and airflow/variables/* paths. Step two: Create the Secrets Manager backend as an Apache Airflow configuration option 136 Amazon Managed Workflows for Apache Airflow User Guide You can use a lookup pattern to reduces the number of API calls Amazon MWAA makes to Secrets Manager on your behalf. If you do not specify a lookup pattern, Apache Airflow searches for all connections and variables in the configured backend. By specifying a pattern, you narrow the possible paths that Apache Airflow looks. This lowers your costs when using Secrets Manager with Amazon MWAA. To specify a lookup pattern, specify the connections_lookup_pattern and variables_lookup_pattern parameters. These parameters accept a RegEx string as input. For example, to look for secrets that start with test, enter the following for secrets.backend_kwargs: { "connections_prefix": "airflow/connections", "connections_lookup_pattern": "^test", "variables_prefix" : "airflow/variables", "variables_lookup_pattern": "^test" } Note To use connections_lookup_pattern and variables_lookup_pattern, you must install apache-airflow-providers-amazon version 7.3.0 or higher. For more information on updating provder pacakges for to newer versions, see the section called “Specifying newer provider packages”. 6. Choose Save. Step three: Generate an Apache Airflow AWS connection URI string To create a connection string, use the "tab" key on your keyboard to indent the key-value pairs in the Connection object. We also recommend creating a variable for the extra object in your shell session. The following section walks you through the steps to generate an Apache Airflow connection URI string for an Amazon MWAA environment using Apache Airflow or a Python script. Apache Airflow CLI The following shell session uses your local Airflow CLI to generate a connection string. If you don't have the CLI installed, we recommend using the Python script. Step three: Generate an Apache Airflow AWS connection URI string 137 Amazon Managed Workflows for Apache Airflow User Guide 1. Open a Python shell session: python3 2. Enter the following command: >>> import json 3. Enter the following command: >>> from airflow.models.connection import Connection 4. Create a variable in your shell session for the extra object. Substitute the sample values in YOUR_EXECUTION_ROLE_ARN with the execution role ARN, and the region in YOUR_REGION (such as us-east-1). >>> extra=json.dumps({'role_arn': 'YOUR_EXECUTION_ROLE_ARN', 'region_name': 'YOUR_REGION'}) 5. Create the connection object. Substitute the sample value in myconn with the name of the Apache Airflow connection. >>> myconn = Connection( 6. Use the "tab" key on your keyboard to indent each of the following key-value pairs in your connection object. Substitute the sample values in red. a. Specify the AWS connection type: ... conn_id='aws', b. 
Specify the Apache Airflow database option: ... conn_type='mysql', c. Specify the Apache Airflow UI URL on Amazon MWAA: ... host='288888a0-50a0-888-9a88-1a111aaa0000.a1.us- east-1.airflow.amazonaws.com/home', Step three: Generate an Apache Airflow AWS connection URI string 138 Amazon Managed Workflows for Apache Airflow User Guide d. Specify the AWS access key ID (username) to login to Amazon MWAA: ... login='YOUR_AWS_ACCESS_KEY_ID', e. Specify the AWS secret access key (password) to login to Amazon MWAA: ... password='YOUR_AWS_SECRET_ACCESS_KEY', f. Specify the extra shell session variable: ... extra=extra g. Close the connection object. ... ) 7. Print the connection URI string: >>> myconn.get_uri() You should see the connection URI string in the response: 'mysql://288888a0-50a0-888-9a88-1a111aaa0000.a1.us-east-1.airflow.amazonaws.com %2Fhome?role_arn=arn%3Aaws%3Aiam%3A%3A001122332255%3Arole%2Fservice-role %2FAmazonMWAA-MyAirflowEnvironment-iAaaaA&region_name=us-east-1' Python script The following Python script does not require the Apache Airflow CLI. 1. Copy the contents of the following code sample and save locally as mwaa_connection.py. import urllib.parse conn_type = 'YOUR_DB_OPTION' host = 'YOUR_MWAA_AIRFLOW_UI_URL' port = 'YOUR_PORT' login = 'YOUR_AWS_ACCESS_KEY_ID' password = 'YOUR_AWS_SECRET_ACCESS_KEY' Step three: Generate an Apache Airflow AWS connection URI string 139 Amazon Managed Workflows for Apache Airflow User Guide role_arn = urllib.parse.quote_plus('YOUR_EXECUTION_ROLE_ARN') region_name = 'YOUR_REGION' conn_string = '{0}://{1}:{2}@{3}:{4}? role_arn={5}&region_name={6}'.format(conn_type, login, password, host, port, role_arn, region_name) print(conn_string) 2. Substitute the placeholders in red. 3. Run the following script to generate a connection string. python3 mwaa_connection.py Step four: Add the variables in Secrets Manager The following section describes how to create the secret for a variable in Secrets Manager. To create the secret 1. Open the AWS Secrets Manager console. 2. Choose Store a new secret. 3. Choose Other type of secret. 4. On the Specify the key/value pairs to be stored in this secret pane, choose Plaintext. 5. Add the variable value as Plaintext in the following format. "YOUR_VARIABLE_VALUE" For example, to specify an integer: 14 For example, to specify a string: "mystring" 6. 7. For Encryption key, choose an AWS KMS key
option from the dropdown list. Enter a name in the text field for Secret name in the following format.

airflow/variables/YOUR_VARIABLE_NAME

For example: airflow/variables/test-variable

8. Choose Next.
9. On the Configure secret page, on the Secret name and description pane, do the following.
   a. For Secret name, provide a name for your secret.
   b. (Optional) For Description, provide a description for your secret.
   Choose Next.
10. On the Configure rotation - optional page, leave the default options and choose Next.
11. Repeat these steps in Secrets Manager for any additional variables you want to add.
12. On the Review page, review your secret, then choose Store.

Step five: Add the connection in Secrets Manager

The following section describes how to create the secret for your connection string URI in Secrets Manager.

To create the secret

1. Open the AWS Secrets Manager console.
2. Choose Store a new secret.
3. Choose Other type of secret.
4. On the Specify the key/value pairs to be stored in this secret pane, choose Plaintext.
5. Add the connection URI string as Plaintext in the following format.

YOUR_CONNECTION_URI_STRING

For example:

mysql://288888a0-50a0-888-9a88-1a111aaa0000.a1.us-east-1.airflow.amazonaws.com%2Fhome?role_arn=arn%3Aaws%3Aiam%3A%3A001122332255%3Arole%2Fservice-role%2FAmazonMWAA-MyAirflowEnvironment-iAaaaA&region_name=us-east-1

Warning
Apache Airflow parses each of the values in the connection string. You must not use single or double quotes, or Apache Airflow will parse the connection as a single string.

6. For Encryption key, choose an AWS KMS key option from the dropdown list.
7. Enter a name in the text field for Secret name in the following format.

airflow/connections/YOUR_CONNECTION_NAME

For example: airflow/connections/myconn

8. Choose Next.
9. On the Configure secret page, on the Secret name and description pane, do the following.
   a. For Secret name, provide a name for your secret.
   b. (Optional) For Description, provide a description for your secret.
   Choose Next.
10. On the Configure rotation - optional page, leave the default options and choose Next.
11. Repeat these steps in Secrets Manager for any additional connections you want to add.
12. On the Review page, review your secret, then choose Store.

Sample code

• Learn how to use the secret key for the Apache Airflow connection (myconn) on this page using the sample code at Using a secret key in AWS Secrets Manager for an Apache Airflow connection.
• Learn how to use the secret key for the Apache Airflow variable (test-variable) on this page using the sample code at Using a secret key in AWS Secrets Manager for an Apache Airflow variable.
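With the backend configured and the secrets stored, Apache Airflow resolves them by name. The following is a minimal sketch (not the linked sample code) that assumes the airflow/variables/test-variable and airflow/connections/myconn secrets created above; it uses standard Apache Airflow lookups, so the task code does not need any AWS-specific imports.

from airflow.hooks.base import BaseHook
from airflow.models import Variable

# Resolved from Secrets Manager at airflow/variables/test-variable
test_variable = Variable.get("test-variable")

# Resolved from Secrets Manager at airflow/connections/myconn
myconn = BaseHook.get_connection("myconn")
print(test_variable, myconn.host)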
Resources

• For more information about configuring Secrets Manager secrets using the console and the AWS CLI, see Create a secret in the AWS Secrets Manager User Guide.
• Use a Python script to migrate a large volume of Apache Airflow variables and connections to Secrets Manager in Move your Apache Airflow connections and variables to AWS Secrets Manager.

What's next?

• Learn how to generate a token to access the Apache Airflow UI in Accessing Apache Airflow.

Managing Amazon MWAA environments

The Amazon Managed Workflows for Apache Airflow console contains built-in options to configure private or public access to the Apache Airflow UI. It also contains built-in options to configure the environment size, when to scale workers, and Apache Airflow configuration options that allow you to override Apache Airflow configurations that are normally only accessible in airflow.cfg. This chapter describes how to use these configurations on the Amazon MWAA console.

Topics

• Configuring the Amazon MWAA environment class
• Configuring Amazon MWAA worker automatic scaling
• Configuring Amazon MWAA web server automatic scaling
• Using Apache Airflow configuration options on Amazon MWAA
• Upgrading the Apache Airflow version
• Using a startup script with Amazon MWAA

Configuring the Amazon MWAA environment class

The environment class you choose for your Amazon MWAA environment determines the size of the AWS-managed AWS Fargate containers where the Celery Executor runs, and the AWS-managed Amazon Aurora PostgreSQL metadata database where the Apache Airflow scheduler creates task instances.
This topic describes each Amazon MWAA environment class, and how to update the environment class on the Amazon MWAA console.

Sections
• Environment capabilities
• Apache Airflow Schedulers

Environment capabilities

The following section contains the default concurrent Apache Airflow tasks, Random Access Memory (RAM), and the virtual central processing units (vCPUs) for each environment class. The concurrent tasks listed assume that task concurrency does not exceed the Apache Airflow Worker capacity in the environment.

In the following list, DAG capacity refers to DAG definitions, not executions, and assumes that your DAGs are dynamic in a single Python file and written with Apache Airflow best practices. Task execution depends on how many tasks are scheduled simultaneously, and assumes that the number of DAG runs set to start at the same time does not exceed the default max_dagruns_per_loop_to_schedule, as well as the size and number of workers as detailed in this topic.

mw1.micro
• Up to 25 DAG capacity
• 3 concurrent tasks (by default)
• Components:
  • Web server: 1 vCPU, 3GB RAM
  • Worker and scheduler: 1 vCPU, 3GB RAM
  • Database: 2 vCPU, 4GB RAM

Note: mw1.micro does not support auto-scaling.

mw1.small
• Up to 50 DAG capacity
• 5 concurrent tasks (by default)
• Components:
  • Web servers: 1 vCPU, 2GB RAM each
  • Workers: 1 vCPU, 2GB RAM each
  • Schedulers: 1 vCPU, 2GB RAM each
  • Database: 2 vCPU, 4GB RAM

mw1.medium
• Up to 250 DAG capacity
• 10 concurrent tasks (by default)
• Components:
  • Web servers: 1 vCPU, 2GB RAM each
  • Workers: 2 vCPU, 4GB RAM each
  • Schedulers: 2 vCPU, 4GB RAM each
  • Database: 2 vCPU, 8GB RAM

mw1.large
• Up to 1000 DAG capacity
• 20 concurrent tasks (by default)
• Components:
  • Web servers: 2 vCPU, 4GB RAM each
  • Workers: 4 vCPU, 8GB RAM each
  • Schedulers: 4 vCPU, 8GB RAM each
  • Database: 2 vCPU, 8GB RAM

mw1.xlarge
• Up to 2000 DAG capacity
• 40 concurrent tasks (by default)
• Components:
  • Web servers: 4 vCPU, 12GB RAM each
  • Workers: 8 vCPU, 24GB RAM each
  • Schedulers: 8 vCPU, 24GB RAM each
  • Database: 4 vCPU, 32GB RAM

mw1.2xlarge
• Up to 4000 DAG capacity
• 80 concurrent tasks (by default)
• Components:
  • Web servers: 8 vCPU, 24GB RAM each
  • Workers: 16 vCPU, 48GB RAM each
  • Schedulers: 16 vCPU, 48GB RAM each
  • Database: 8 vCPU, 64GB RAM

You can use celery.worker_autoscale to increase tasks per worker. For more information, see the section called “Example high performance use case”.
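You can also change the environment class outside of the console. The following AWS CLI sketch assumes an environment named MyAirflowEnvironment and a target class that is available in your Region; substitute both values for your account.

aws mwaa update-environment \
    --name MyAirflowEnvironment \
    --environment-class mw1.medium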
Apache Airflow Schedulers The following section contains the Apache Airflow scheduler options available on the Amazon MWAA, and how the number of schedulers affects the number of triggerers. In Apache Airflow, a triggerer manages tasks which it defers until certain conditions specified using a trigger have been met. In Amazon MWAA the triggerer runs alongside the scheduler on the same Fargate task. Increasing the scheduler count correspondingly increases the number of available triggerers, optimizing how the environment manages deferred tasks. This ensures efficient handling of tasks, promptly scheduling them to run when conditions are satisfied. Apache Airflow v2 • v2 - For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1. Configuring Amazon MWAA worker automatic scaling The auto scaling mechanism automatically increases the number of Apache Airflow workers in response to running and queued tasks on your Amazon Managed Workflows for Apache Airflow environment and disposes of extra workers when there are no more tasks queued or executing. This topic describes how you can configure auto scaling by specifying the maximum number of Apache Airflow workers that run on your environment using the Amazon MWAA console. Note Amazon MWAA uses Apache Airflow metrics to determine when additional Celery Executor workers are needed, and as required increases the number of Fargate workers up to the value specified by max-workers. As the additional workers
complete work and workload decreases, Amazon MWAA removes them, thus downscaling back to the value set by min-workers. If workers pick up new tasks while downscaling, Amazon MWAA keeps the Fargate resource and does not remove the worker. For more information, see How Amazon MWAA auto scaling works.

Sections
• How worker scaling works
• Using the Amazon MWAA console
• Example high performance use case
• Troubleshooting tasks stuck in the running state
• What's next?

How worker scaling works

Amazon MWAA uses the RunningTasks and QueuedTasks metrics, where (tasks running + tasks queued) / (tasks per worker) = (required workers). If the required number of workers is greater than the current number of workers, Amazon MWAA adds Fargate worker containers to reach that value, up to the maximum value specified by max-workers.

As the workload decreases and the RunningTasks and QueuedTasks metric sum reduces, Amazon MWAA requests Fargate to scale down the workers for the environment. Any workers that are still completing work remain protected during downscaling until they complete their work. Depending on the workload, tasks may be queued while workers downscale.

Using the Amazon MWAA console

You can choose the maximum number of workers that can run on your environment concurrently on the Amazon MWAA console. By default, you can specify a maximum value up to 25.

To configure the number of workers

1. Open the Environments page on the Amazon MWAA console.
2. Choose an environment.
3. Choose Edit.
4. Choose Next.
5. On the Environment class pane, enter a value in Maximum worker count.
6. Choose Save.

Note: It can take a few minutes before changes take effect on your environment.

Example high performance use case

The following section describes the type of configurations you can use to enable high performance and parallelism on an environment.

On-premise Apache Airflow

Typically, in an on-premise Apache Airflow platform, you would configure task parallelism, auto scaling, and concurrency settings in your airflow.cfg file:

• core.parallelism – The maximum number of task instances that can run simultaneously per scheduler.
• core.dag_concurrency – The maximum concurrency for DAGs (not workers).
• celery.worker_autoscale – The maximum and minimum number of tasks that can run concurrently on any worker.

For example, if core.parallelism was set to 100 and core.dag_concurrency was set to 7, you would still only be able to run a total of 14 tasks concurrently if you had 2 DAGs, because each DAG is limited to running seven tasks concurrently (in core.dag_concurrency), even though overall parallelism is set to 100 (in core.parallelism).
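For reference, a minimal airflow.cfg sketch of the on-premise settings described above might look like the following; the values are illustrative only and mirror the examples in this section.

[core]
# Maximum number of task instances that can run simultaneously per scheduler
parallelism = 100
# Maximum number of concurrent tasks for each DAG
dag_concurrency = 7

[celery]
# Comma-separated as max_concurrency,min_concurrency
worker_autoscale = 16,12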
On an Amazon MWAA environment On an Amazon MWAA environment, you can configure these settings directly on the Amazon MWAA console using Using Apache Airflow configuration options on Amazon MWAA, Configuring the Amazon MWAA environment class, and the Maximum worker count auto scaling mechanism. While core.dag_concurrency is not available in the drop down list as an Apache Airflow Example high performance use case 149 Amazon Managed Workflows for Apache Airflow User Guide configuration option on the Amazon MWAA console, you can add it as a custom Apache Airflow configuration option. Let's say, when you created your environment, you chose the following settings: 1. The mw1.small environment class which controls the maximum number of concurrent tasks each worker can run by default and the vCPU of containers. 2. The default setting of 10 Workers in Maximum worker count. 3. An Apache Airflow configuration option for celery.worker_autoscale of 5,5 tasks per worker. This means you can run 50 concurrent tasks in your environment. Any tasks beyond 50 will be queued, and wait for the running tasks to complete. Run more concurrent tasks. You can modify your environment to run more tasks concurrently using the following configurations: 1. Increase the maximum number of concurrent tasks each worker can run by default and the vCPU of containers by choosing the mw1.medium (10 concurrent tasks by default) environment class. 2. Add celery.worker_autoscale as an Apache Airflow configuration option. 3. Increase the Maximum worker count. In this example, increasing maximum workers from 10 to 20 would double
means you can run 50 concurrent tasks in your environment. Any tasks beyond 50 will be queued, and wait for the running tasks to complete. Run more concurrent tasks. You can modify your environment to run more tasks concurrently using the following configurations: 1. Increase the maximum number of concurrent tasks each worker can run by default and the vCPU of containers by choosing the mw1.medium (10 concurrent tasks by default) environment class. 2. Add celery.worker_autoscale as an Apache Airflow configuration option. 3. Increase the Maximum worker count. In this example, increasing maximum workers from 10 to 20 would double the number of concurrent tasks the environment can run. Specify Minimum workers. You can also specify the minimum and maximum number of Apache Airflow Workers that run in your environment using the AWS Command Line Interface (AWS CLI). For example: aws mwaa update-environment --max-workers 10 --min-workers 10 -- name YOUR_ENVIRONMENT_NAME To learn more, see the update-environment command in the AWS CLI. Troubleshooting tasks stuck in the running state In rare cases, Apache Airflow may think there are tasks still running. To resolve this issue, you need to clear the stranded task in your Apache Airflow UI. For more information, see the I see my tasks stuck or not completing troubleshooting topic. Troubleshooting tasks stuck in the running state 150 Amazon Managed Workflows for Apache Airflow User Guide What's next? • Learn more about the best practices we recommend to tune the performance of your environment in Performance tuning for Apache Airflow on Amazon MWAA. Configuring Amazon MWAA web server automatic scaling For environments running Apache Airflow v2.2.2 and above, Amazon MWAA dynamically scales your web servers to handle fluctuating workloads, which in turn prevents performance issues during peak loads. By automatically scaling the number of web servers based on CPU utilization and active connection count, Amazon MWAA ensures that your Apache Airflow environment can seamlessly accommodate increased demand, whether from REST API requests, CLI usage, or more concurrent Apache Airflow user interface users. Sections • How web server scaling works • Using the Amazon MWAA console How web server scaling works Amazon MWAA uses the container metric, CPUUtilization, and the load balancer metric, ActiveConnectionCount, to determine if scaling the web servers is required based on the amount of traffic. If CPUUtilization is higher than 70 or ActiveConnectionCount is higher than 15, Amazon MWAA will add additional Fargate web server containers up to the maximum value specified by MaxWebservers. As traffic decreases and the CPUUtilization and ActiveConnectionCount values reduce, Amazon MWAA requests Fargate to scale down the web server containers for the environment to the minimum value set by MinimumWebservers. Using the Amazon MWAA console You can choose the number of web servers that can run on your environment concurrently on the Amazon MWAA console. By default, the minimum number of web servers is two, and the maximum number of web servers is five. What's next? 151 Amazon Managed Workflows for Apache Airflow User Guide To configure the number of web servers 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit. 4. Choose Next. 5. On the Environment class pane, enter a value in Maximum web server count. 6. Next, enter a value in Minimum web server count. 7. Choose Save. 
Note It can take a few minutes before changes take effect on your environment. Using Apache Airflow configuration options on Amazon MWAA Apache Airflow configuration options can be attached to your Amazon Managed Workflows for Apache Airflow environment as environment variables. You can choose from the suggested dropdown list, or specify custom configuration options for your Apache Airflow version on the Amazon MWAA console. This topic describes the Apache Airflow configuration options available, and how to use these options to override Apache Airflow configuration settings on your environment. Contents • Prerequisites • How it works • Using configuration options to load plugins in Apache Airflow v2 • Configuration options overview • Apache Airflow configuration options • Apache Airflow reference • Using the Amazon MWAA console • Configuration reference • Email configurations Using configuration options 152 Amazon Managed Workflows for Apache Airflow User Guide • Task configurations • Scheduler configurations • Worker configurations • Web server configurations • Triggerer configurations • Examples and sample code • Example DAG • Example email notification settings • What's next? Prerequisites You'll need the following before you can complete the steps on this page. • Permissions — Your AWS account must have been granted access by your administrator to the AmazonMWAAFullConsoleAccess access control policy for your environment. In addition, your Amazon MWAA environment must be permitted by your execution role to access the AWS resources used by your environment. • Access — If you require access to public repositories to install dependencies directly on the web server,
• Web server configurations • Triggerer configurations • Examples and sample code • Example DAG • Example email notification settings • What's next? Prerequisites You'll need the following before you can complete the steps on this page. • Permissions — Your AWS account must have been granted access by your administrator to the AmazonMWAAFullConsoleAccess access control policy for your environment. In addition, your Amazon MWAA environment must be permitted by your execution role to access the AWS resources used by your environment. • Access — If you require access to public repositories to install dependencies directly on the web server, your environment must be configured with public network web server access. For more information, see the section called “Apache Airflow access modes”. • Amazon S3 configuration — The Amazon S3 bucket used to store your DAGs, custom plugins in plugins.zip, and Python dependencies in requirements.txt must be configured with Public Access Blocked and Versioning Enabled. How it works When you create an environment, Amazon MWAA attaches the configuration settings you specify on the Amazon MWAA console in Airflow configuration options as environment variables to the AWS Fargate container for your environment. If you're using a setting of the same name in airflow.cfg, the options you specify on the Amazon MWAA console override the values in airflow.cfg. While we don't expose the airflow.cfg in the Apache Airflow UI of an Amazon MWAA environment by default, you can change the Apache Airflow configuration options directly on Prerequisites 153 Amazon Managed Workflows for Apache Airflow User Guide the Amazon MWAA console, including setting webserver.expose_config to expose the configurations. Using configuration options to load plugins in Apache Airflow v2 By default in Apache Airflow v2, plugins are configured to be "lazily" loaded using the core.lazy_load_plugins : True setting. If you're using custom plugins in Apache Airflow v2, you must add core.lazy_load_plugins : False as an Apache Airflow configuration option to load plugins at the start of each Airflow process to override the default setting. Configuration options overview When you add a configuration on the Amazon MWAA console, Amazon MWAA writes the configuration as an environment variable. • Listed options. You can choose from one of the configuration settings available for your Apache Airflow version in the dropdown list. For example, dag_concurrency : 16. The configuration setting is translated to your environment's Fargate container as AIRFLOW__CORE__DAG_CONCURRENCY : 16 • Custom options. You can also specify Airflow configuration options that are not listed for your Apache Airflow version in the dropdown list. For example, foo.user : YOUR_USER_NAME. The configuration setting is translated to your environment's Fargate container as AIRFLOW__FOO__USER : YOUR_USER_NAME Apache Airflow configuration options The following image shows where you can customize the Apache Airflow configuration options on the Amazon MWAA console. Using configuration options to load plugins in Apache Airflow v2 154 Amazon Managed Workflows for Apache Airflow Apache Airflow reference User Guide For a list of configuration options supported by Apache Airflow, see Configuration Reference in the Apache Airflow reference guide. To view the options for the version of Apache Airflow you are running on Amazon MWAA, select the version from the drop down list. 
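The translation from a configuration option to an environment variable follows a simple naming pattern. The following Python sketch is not part of Amazon MWAA itself; it only illustrates the convention described above, using a hypothetical helper function.

def to_airflow_env_var(option, value):
    """Translate 'section.key' into the AIRFLOW__SECTION__KEY form used as an environment variable."""
    section, key = option.split(".", 1)
    return "AIRFLOW__{}__{}".format(section.upper(), key.upper()), value

# For example:
print(to_airflow_env_var("core.dag_concurrency", "16"))
# ('AIRFLOW__CORE__DAG_CONCURRENCY', '16')
print(to_airflow_env_var("foo.user", "YOUR_USER_NAME"))
# ('AIRFLOW__FOO__USER', 'YOUR_USER_NAME')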
Using the Amazon MWAA console

The following procedure walks you through the steps of adding an Airflow configuration option to your environment.

1. Open the Environments page on the Amazon MWAA console.
2. Choose an environment.
3. Choose Edit.
4. Choose Next.
5. Choose Add custom configuration in the Airflow configuration options pane.
6. Choose a configuration from the dropdown list and enter a value, or type a custom configuration and enter a value.
7. Choose Add custom configuration for each configuration you want to add.
8. Choose Save.

Configuration reference

The following section contains the list of available Apache Airflow configurations in the dropdown list on the Amazon MWAA console.

Email configurations

The following list shows the Airflow email notification configuration options available on Amazon MWAA. We recommend using port 587 for SMTP traffic. By default, AWS blocks outbound SMTP traffic on port 25 of all Amazon EC2 instances. If you want to send outbound traffic on port 25, you can request that this restriction be removed.
Apache Airflow v2

• email.email_backend – The Apache Airflow utility used for email notifications in email_backend. Example value: airflow.utils.email.send_email_smtp
• smtp.smtp_host – The name of the outbound server used for the email address in smtp_host. Example value: localhost
• smtp.smtp_starttls – Transport Layer Security (TLS) is used to encrypt the email over the Internet in smtp_starttls. Example value: False
• smtp.smtp_ssl – Secure Sockets Layer (SSL) is used to connect the server and email client in smtp_ssl. Example value: True
• smtp.smtp_port – The Transmission Control Protocol (TCP) port designated to the server in smtp_port. Example value: 587
• smtp.smtp_mail_from – The outbound email address in smtp_mail_from. Example value: myemail@domain.com

Task configurations

The following list shows the configurations available in the dropdown list for Airflow tasks on Amazon MWAA.

Apache Airflow v2

• core.default_task_retries – The number of times to retry an Apache Airflow task in default_task_retries. Example value: 3
• core.parallelism – The maximum number of task instances that can run simultaneously across the entire environment in parallel (parallelism). Example value: 40

Scheduler configurations

The following list shows the Apache Airflow scheduler configurations available in the dropdown list on Amazon MWAA.

Apache Airflow v2

• scheduler.catchup_by_default – Tells the scheduler to create a DAG run to "catch up" to the specific time interval in catchup_by_default. Example value: False
• scheduler.scheduler_zombie_task_threshold – Tells the scheduler whether to mark the task instance as failed and reschedule the task in scheduler_zombie_task_threshold. Example value: 300

Worker configurations

The following list shows the Airflow worker configurations available in the dropdown list on Amazon MWAA.

Apache Airflow v2

• celery.worker_autoscale – The maximum and minimum number of tasks that can run concurrently on any worker using the Celery Executor in worker_autoscale. The value must be comma-separated in the following order: max_concurrency,min_concurrency. Example value: 16,12

Web server configurations

The following list shows the Airflow web server configurations available in the dropdown list on Amazon MWAA.

Apache Airflow v2

• webserver.default_ui_timezone – The default Apache Airflow UI datetime setting in default_ui_timezone. Example value: America/New_York
  Note: Setting the default_ui_timezone option does not change the time zone in which your DAGs are scheduled to run. To change the time zone for your DAGs, you can use a custom plugin. For more information, see the section called “Changing a DAG's timezone”.

Triggerer configurations

The following list shows the Apache Airflow triggerer configurations available on Amazon MWAA.
Apache Airflow v2

• mwaa.triggerer_enabled (v2.7) – Used for activating and deactivating the triggerer on Amazon MWAA. By default, this value is set to True. If set to False, Amazon MWAA will not start any triggerer processes on schedulers. Example value: True
• triggerer.default_capacity (v2.7) – Defines the number of triggers each triggerer can run in parallel. On Amazon MWAA, this capacity is set per each triggerer and per each scheduler as both components run alongside each other. The default per scheduler is set to 60, 125, 250, 500, and 1000 for small, medium, large, xlarge, and 2xlarge instances, respectively. Example value: 125

Examples and sample code

Example DAG

You can use the following DAG to print your email_backend Apache Airflow configuration option. To run in response to Amazon MWAA events, copy the code to your environment's DAGs folder on your Amazon S3 storage bucket.

from airflow.decorators import dag
from airflow.operators.python import PythonOperator
from datetime import datetime

def print_var(**kwargs):
    # 'conf' in the task context is the Apache Airflow configuration parser
    email_backend = kwargs['conf'].get(section='email', key='email_backend')
    print(email_backend)
    return email_backend

@dag(
    dag_id="print_env_variable_example",
schedule_interval=None, start_date=datetime(yyyy, m, d), catchup=False, ) def print_variable_dag(): email_backend_test = PythonOperator( task_id="email_backend_test", python_callable=print_var, provide_context=True ) print_variable_test = print_variable_dag() Example email notification settings The following Apache Airflow configuration options can be used for a Gmail.com email account using an app password. For more information, see Sign in using app passwords in the Gmail Help reference guide. Examples and sample code 162 Amazon Managed Workflows for Apache Airflow User Guide What's next? • Learn how to upload your DAG folder to your Amazon S3 bucket in Adding or updating DAGs. Upgrading the Apache Airflow version Amazon MWAA supports minor version upgrades. This means you can upgrade your environment from version x.4.z to x.5.z. To perform a major version upgrade, for example from version 1.y.z to 2.y.z, you must create a new environment and migrate your resources. For more information on upgrading to a new major version of Apache Airflow, see Migrating to a new Amazon MWAA environment in the Amazon MWAA Migration Guide. During the upgrade process, Amazon MWAA captures a snapshot of your environment metadata, upgrades the workers, schedulers, the web server to the new Apache Airflow version, and finally restores the metadata database using the snapshot. What's next? 163 Amazon Managed Workflows for Apache Airflow User Guide Note You cannot downgrade the Apache Airflow version for your environment. Before you upgrade, make sure that your DAGs and other workflow resources are compatible with the new Apache Airflow version you are upgrading to. If you use a requirements.txt to manage dependencies, you must also ensure the dependencies you specify in your requirements are compatible with the new version. Topics • Upgrade your workflow resources • Specify the new version Upgrade your workflow resources Whenever you're changing Apache Airflow versions, ensure that you reference the correct -- constraint URL in your requirements.txt. Warning Specifying requirements that are incompatible with your target Apache Airflow version during an upgrade might result in a lengthy rollback process to the previous version of Apache Airflow with the previous requirements version. To migrate your workflow resources 1. Create a fork of the aws-mwaa-local-runner repository, and clone a copy of the Amazon MWAA local runner. 2. Checkout to the branch of the aws-mwaa-local-runner repository that matches the version you are upgrading to. 3. Use the Amazon MWAA local runner CLI tool to build the Docker image and run Apache Airflow locally. For more information, see the local runner README in the GitHub repository. 4. To update your requirements.txt, follow the best practices we recommend in Managing Python dependencies, in the Amazon MWAA User Guide. Upgrade your workflow resources 164 Amazon Managed Workflows for Apache Airflow User Guide 5. (Optional) To speed up the upgrade process, clean up the environment's metadata database. Environments with a large amount of metadata can take significantly longer to upgrade. 6. After you have successfully tested your workflow resources, copy your DAGs, requirements.txt, and plugins to your environment's Amazon S3 bucket. You are now ready to edit the environment, specify a new Apache Airflow version, and start the update procedure. 
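As an illustration of the --constraint guidance above, a requirements.txt for an upgrade typically begins with a constraints line that matches the target Apache Airflow version and the Python version used by your environment. The version numbers below are placeholders; substitute the ones that correspond to your upgrade.

--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.1/constraints-3.11.txt"
apache-airflow-providers-amazon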
Specify the new version After you have completed updating your workflow resources to ensure compatibility with the new Apache Airflow version, do the following to edit environment details and specify the version of Apache Airflow that you want to upgrade to. Note When you perform an upgrade, all tasks currently running on the environment are terminated during the procedure. The update procedure can take up to two hours, during which time your environment will be unavailable. To specify a new version using the console 1. Open the Environments page on the Amazon MWAA console. 2. From the Environments list, choose the environment that you want to upgrade. 3. On the environment page, choose Edit to edit the environment. 4. In the Environment details section, for Airflow version, choose the new Apache Airflow version number that you want to upgrade the environment to from the dropdown list. 5. Choose Next until you are on the Review and save page. 6. On the Review and save page, review your changes, then choose Save. When you apply changes, your environment begins the upgrade procedure. During this period, the status of your environment indicates what actions Amazon MWAA is taking, and whether the procedure is successful. Specify the new version 165 Amazon Managed Workflows for Apache Airflow User Guide In a successful upgrade scenario, the status will show UPDATING, then CREATING_SNAPSHOT as Amazon MWAA captures a backup of your metadata. Finally, the status will return first to UPDATING, then to AVAILABLE when the procedure is done. If the environment fails to upgrade, your environment status will show ROLLING_BACK. If the rollback is successful, the status will first show UPDATE_FAILED, indicating that the update failed but the environment is available. If the rollback fails, the status will show UNAVAILABLE,
is taking, and whether the procedure is successful. Specify the new version 165 Amazon Managed Workflows for Apache Airflow User Guide In a successful upgrade scenario, the status will show UPDATING, then CREATING_SNAPSHOT as Amazon MWAA captures a backup of your metadata. Finally, the status will return first to UPDATING, then to AVAILABLE when the procedure is done. If the environment fails to upgrade, your environment status will show ROLLING_BACK. If the rollback is successful, the status will first show UPDATE_FAILED, indicating that the update failed but the environment is available. If the rollback fails, the status will show UNAVAILABLE, indicating that you cannot access the environment. Using a startup script with Amazon MWAA A startup script is a shell (.sh) script that you host in your environment's Amazon S3 bucket similar to your DAGs, requirements, and plugins. Amazon MWAA runs this script during startup on every individual Apache Airflow component (worker, scheduler, and web server) before installing requirements and initializing the Apache Airflow process. Use a startup script to do the following: • Install runtimes – Install Linux runtimes required by your workflows and connections. • Configure environment variables – Set environment variables for each Apache Airflow component. Overwrite common variables such as PATH, PYTHONPATH, and LD_LIBRARY_PATH. • Manage keys and tokens – Pass access tokens for custom repositories to requirements.txt and configure security keys. The following topics describe how to configure a startup script to install Linux runtimes, set environment variables, and troubleshoot related issues using CloudWatch Logs. Topics • Configure a startup script • Install Linux runtimes using a startup script • Set environment variables using a startup script Configure a startup script To use a startup script with your existing Amazon MWAA environment, upload a .sh file to your environment's Amazon S3 bucket. Then, to associate the script with the environment, specify the following in your environment details: Using a startup script 166 Amazon Managed Workflows for Apache Airflow User Guide • The Amazon S3 URL path to the script – The relative path to the script hosted in your bucket, for example, s3://mwaa-environment/startup.sh • The Amazon S3 version ID of the script – The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long, for example, 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY +MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. To complete the steps in this section, use the following sample script. The script outputs the value assigned to MWAA_AIRFLOW_COMPONENT. This environment variable identifies each Apache Airflow component that the script runs on. Copy the code and save it locally as startup.sh. #!/bin/sh echo "Printing Apache Airflow component" echo $MWAA_AIRFLOW_COMPONENT Next, upload the script to your Amazon S3 bucket. AWS Management Console To upload a shell script (console) 1. Sign in to the AWS Management Console and open the Amazon S3 console at https:// console.aws.amazon.com/s3/. 2. From the Buckets list, choose the name of the bucket associated with your environment. 3. On the Objects tab, choose Upload. 4. On the Upload page, drag and drop the shell script you created. 5. Choose Upload. The script appears in the list of Objects. 
Amazon S3 creates a new version ID for the file. If you update the script and upload it again using the same file name, a new version ID is assigned to the file. Configure a startup script 167 Amazon Managed Workflows for Apache Airflow User Guide AWS CLI To create and upload a shell script (CLI) 1. Open a new command prompt, and run the Amazon S3 ls command to list and identify the bucket associated with your environment. $ aws s3 ls 2. Navigate to the folder where you saved the shell script. Use cp in a new prompt window to upload the script to your bucket. Replace your-s3-bucket with your information. $ aws s3 cp startup.sh s3://your-s3-bucket/startup.sh If successful, Amazon S3 outputs the URL path to the object: upload: ./startup.sh to s3://your-s3-bucket/startup.sh 3. Use the following command to retrieve the latest version ID for the script. $ aws s3api list-object-versions --bucket your-s3-bucket --prefix startup -- query 'Versions[?IsLatest].[VersionId]' --output text BbdVMmBRjtestta1EsVnbybZp1Wqh1J4 You specify this version ID when you associate the script with an environment. Now, associate the script with your environment. AWS Management Console To associate the script with an environment (console) 1. Open the Environments page on the Amazon MWAA console. 2. Select the row for the environment you want to update, then choose Edit. 3. On the Specify details page, for Startup script file - optional, enter the Amazon S3 URL for the script, for example: s3://your-mwaa-bucket/startup-sh.. Configure a startup script 168 Amazon Managed Workflows for Apache Airflow User Guide 4. Choose the latest version from
--output text BbdVMmBRjtestta1EsVnbybZp1Wqh1J4 You specify this version ID when you associate the script with an environment. Now, associate the script with your environment. AWS Management Console To associate the script with an environment (console) 1. Open the Environments page on the Amazon MWAA console. 2. Select the row for the environment you want to update, then choose Edit. 3. On the Specify details page, for Startup script file - optional, enter the Amazon S3 URL for the script, for example: s3://your-mwaa-bucket/startup-sh.. Configure a startup script 168 Amazon Managed Workflows for Apache Airflow User Guide 4. Choose the latest version from the drop down list, or Browse S3 to find the script. 5. Choose Next, then proceed to the Review and save page. 6. Review changes, then choose Save. Environment updates can take between 10 to 30 minutes. Amazon MWAA runs the startup script as each component in your environment restarts. AWS CLI To associate the script with an environment (CLI) • Open a command prompt and use update-environment to specify the Amazon S3 URL and version ID for the script. $ aws mwaa update-environment \ --name your-mwaa-environment \ --startup-script-s3-path startup.sh \ --startup-script-s3-object-version BbdVMmBRjtestta1EsVnbybZp1Wqh1J4 If successful, Amazon MWAA returns the Amazon Resource Name (ARN) for the environment: arn:aws::airflow:us-west-2:123456789012:environment/your-mwaa-environment Environment update can take between 10 to 30 minutes. Amazon MWAA runs the startup script as each component in your environment restarts. Finally, retrieve log events to verify that the script is working as expected. When you activate logging for an each Apache Airflow component, Amazon MWAA creates a new log group and log stream. For more information, see Apache Airflow log types. AWS Management Console To check the Apache Airflow log stream (console) 1. Open the Environments page on the Amazon MWAA console. 2. Choose your environment. Configure a startup script 169 Amazon Managed Workflows for Apache Airflow User Guide 3. In the Monitoring pane, choose the log group for which you want to view logs, for example, Airflow scheduler log group . 4. In the CloudWatch console, from the Log streams list, choose a stream with the following prefix: startup_script_exection_ip. 5. On the Log events pane, you will see the output of the command printing the value for MWAA_AIRFLOW_COMPONENT. For example, for scheduler logs, you will the following: Printing Apache Airflow component scheduler Finished running startup script. Execution time: 0.004s. Running verification Verification completed You can repeat the previous steps to view worker and web server logs. Install Linux runtimes using a startup script Use a startup script to update the operating system of an Apache Airflow component, and install additional runtime libraries to use with your workflows. For example, the following script runs yum update to update the operating system. When running yum update in a startup script, you must exclude Python using -- exclude=python* as shown in the example. For your environment to run, Amazon MWAA installs a specific version of Python compatible with your environment. Therefore, you can't update the environment's Python version using a startup script. #!/bin/sh echo "Updating operating system" sudo yum update -y --exclude=python* To install runtimes on specific Apache Airflow component, use MWAA_AIRFLOW_COMPONENT and if and fi conditional statements. 
This example runs a single command to install the libaio library on the scheduler and worker, but not on the web server. Install Linux runtimes 170 Amazon Managed Workflows for Apache Airflow User Guide Important • If you have configured a private web server, you must either use the following condition or provide all installation files locally in order to avoid installation timeouts. • Use sudo to run operations that require administrative privileges. #!/bin/sh if [[ "${MWAA_AIRFLOW_COMPONENT}" != "webserver" ]] then sudo yum -y install libaio fi You can use a startup script to check the Python version. #!/bin/sh export PYTHON_VERSION_CHECK=`python -c 'import sys; version=sys.version_info[:3]; print("{0}.{1}.{2}".format(*version))'` echo "Python version is $PYTHON_VERSION_CHECK" Amazon MWAA does not support overriding the default Python version, as this may lead to incompatibilities with the installed Apache Airflow libraries. Set environment variables using a startup script Use startup scripts to set environment variables and modify Apache Airflow configurations. The following defines a new variable, ENVIRONMENT_STAGE. You can reference this variable in a DAG or in your custom modules. #!/bin/sh export ENVIRONMENT_STAGE="development" echo "$ENVIRONMENT_STAGE" Use startup scripts to overwrite common Apache Airflow or system variables. For example, you set LD_LIBRARY_PATH to instruct Python to look for binaries in the path you specify. This lets you provide custom binaries for your workflows using plugins: Set environment variables 171 Amazon Managed Workflows for Apache Airflow User Guide #!/bin/sh export LD_LIBRARY_PATH=/usr/local/airflow/plugins/your-custom-binary Reserved environment variables Amazon MWAA reserves a set of critical environment variables. If you overwrite a reserved variable, Amazon MWAA restores it to its default. The following lists the reserved variables: • MWAA__AIRFLOW__COMPONENT – Used to identify the Apache Airflow component with one of
echo "$ENVIRONMENT_STAGE" Use startup scripts to overwrite common Apache Airflow or system variables. For example, you set LD_LIBRARY_PATH to instruct Python to look for binaries in the path you specify. This lets you provide custom binaries for your workflows using plugins: Set environment variables 171 Amazon Managed Workflows for Apache Airflow User Guide #!/bin/sh export LD_LIBRARY_PATH=/usr/local/airflow/plugins/your-custom-binary Reserved environment variables Amazon MWAA reserves a set of critical environment variables. If you overwrite a reserved variable, Amazon MWAA restores it to its default. The following lists the reserved variables: • MWAA__AIRFLOW__COMPONENT – Used to identify the Apache Airflow component with one of the following values: scheduler, worker, or webserver. • AIRFLOW__WEBSERVER__SECRET_KEY – The secret key used for securely signing session cookies in the Apache Airflow web server. • AIRFLOW__CORE__FERNET_KEY – The key used for encryption and decryption of sensitive data stored in the metadata database, for example, connection passwords. • AIRFLOW_HOME – The path to the Apache Airflow home directory where configuration files and DAG files are stored locally. • AIRFLOW__CELERY__BROKER_URL – The URL of the message broker used for communication between the Apache Airflow scheduler and the Celery worker nodes. • AIRFLOW__CELERY__RESULT_BACKEND – The URL of the database used to store the results of Celery tasks. • AIRFLOW__CORE__EXECUTOR – The executor class that Apache Airflow should use. In Amazon MWAA this is a CeleryExecutor • AIRFLOW__CORE__LOAD_EXAMPLES – Used to activate, or deactivate, the loading of example DAGs. • AIRFLOW__METRICS__METRICS_BLOCK_LIST – Used to manage which Apache Airflow metrics are emitted and captured by Amazon MWAA in CloudWatch. • SQL_ALCHEMY_CONN – The connection string for the RDS for PostgreSQL database used to store Apache Airflow metadata in Amazon MWAA. • AIRFLOW__CORE__SQL_ALCHEMY_CONN – Used for the same purpose as SQL_ALCHEMY_CONN, but following the new Apache Airflow naming convention. • AIRFLOW__CELERY__DEFAULT_QUEUE – The default queue for Celery tasks in Apache Airflow. • AIRFLOW__OPERATORS__DEFAULT_QUEUE – The default queue for tasks using specific Apache Airflow operators. Set environment variables 172 Amazon Managed Workflows for Apache Airflow User Guide • AIRFLOW_VERSION – The Apache Airflow version installed in the Amazon MWAA environment. • AIRFLOW_CONN_AWS_DEFAULT – The default AWS credentials used to integrate with other AWS services in. • AWS_DEFAULT_REGION – Sets the default AWS Region used with default credentials to integrate with other AWS services. • AWS_REGION – If defined, this environment variable overrides the values in the environment variable AWS_DEFAULT_REGION and the profile setting region. • PYTHONUNBUFFERED – Used to send stdout and stderr streams to container logs. • AIRFLOW__METRICS__STATSD_ALLOW_LIST – Used to configure an allow list of comma- separated prefixes to send the metrics that start with the elements of the list. • AIRFLOW__METRICS__STATSD_ON – Activates sending metrics to StatsD. • AIRFLOW__METRICS__STATSD_HOST – Used to connect to the StatSD daemon. • AIRFLOW__METRICS__STATSD_PORT – Used to connect to the StatSD daemon. • AIRFLOW__METRICS__STATSD_PREFIX – Used to connect to the StatSD daemon. • AIRFLOW__CELERY__WORKER_AUTOSCALE – Sets the maximum and minimum concurrency. 
• AIRFLOW__CORE__DAG_CONCURRENCY – Sets the number of task instances that can run concurrently by the scheduler in one DAG. • AIRFLOW__CORE__MAX_ACTIVE_TASKS_PER_DAG – Sets the maximum number of active tasks per DAG. • AIRFLOW__CORE__PARALLELISM – Defines the maximum number of task instances that can simultaneously. • AIRFLOW__SCHEDULER__PARSING_PROCESSES – Sets the maximum number of processes parsed by the scheduler to schedule DAGs. • AIRFLOW__CELERY_BROKER_TRANSPORT_OPTIONS__VISIBILITY_TIMEOUT – Defines the number of seconds a worker waits to acknowledge the task before the message is redelivered to another worker. • AIRFLOW__CELERY_BROKER_TRANSPORT_OPTIONS__REGION – Sets the AWS Region for the underlying Celery transport. • AIRFLOW__CELERY_BROKER_TRANSPORT_OPTIONS__PREDEFINED_QUEUES – Sets the queue for the underlying Celery transport. • AIRFLOW_SCHEDULER_ALLOWED_RUN_ID_PATTERN – Used to verify the validity of your input for the run_id parameter when triggering a DAG. Set environment variables 173 Amazon Managed Workflows for Apache Airflow User Guide • AIRFLOW__WEBSERVER__BASE_URL – The URL of the web server used to host the Apache Airflow UI. Unreserved environment variables You can use a startup script to overwrite unreserved environment variables. The following lists some of these common variables: • PATH – Specifies a list of directories where the operating system searches for executable files and scripts. When a command runs in the command line, the system checks the directories in PATH in order to find and execute the command. When you create custom operators or tasks in Apache Airflow, you might need to rely on external scripts or executables. If the directories containing these files are not in the specified in the PATH variable, the tasks fail to run when the system is unable to locate them. By adding the appropriate directories to PATH, Apache Airflow tasks can find and run the required executables. • PYTHONPATH – Used by the Python interpreter to determine which directories to search for imported modules and packages. It is a list of directories that you can add to the default search path. This lets
you create custom operators or tasks in Apache Airflow, you might need to rely on external scripts or executables. If the directories containing these files are not in the specified in the PATH variable, the tasks fail to run when the system is unable to locate them. By adding the appropriate directories to PATH, Apache Airflow tasks can find and run the required executables. • PYTHONPATH – Used by the Python interpreter to determine which directories to search for imported modules and packages. It is a list of directories that you can add to the default search path. This lets the interpreter find and load Python libraries not included in the standard library, or installed in system directories. Use this variable to add your modules and custom Python packages and use them with your DAGs. • LD_LIBRARY_PATH – An environment variable used by the dynamic linker and loader in Linux to find and load shared libraries. It specifies a list of directories containing shared libraries, which are searched before the default system library directories. Use this variable to specify your custom binaries. • CLASSPATH – Used by the Java Runtime Environment (JRE) and Java Development Kit (JDK) to locate and load Java classes, libraries, and resources at runtime. It is a list of directories, JAR files, and ZIP archives that contain compiled Java code. Set environment variables 174 Amazon Managed Workflows for Apache Airflow User Guide Working with DAGs on Amazon MWAA To run Directed Acyclic Graphs (DAGs) on an Amazon Managed Workflows for Apache Airflow environment, you copy your files to the Amazon S3 storage bucket attached to your environment, then let Amazon MWAA know where your DAGs and supporting files are located on the Amazon MWAA console. Amazon MWAA takes care of synchronizing the DAGs among workers, schedulers, and the web server. This guide describes how to add or update your DAGs, and install custom plugins and Python dependencies on an Amazon MWAA environment. Topics • Amazon S3 bucket overview • Adding or updating DAGs • Installing custom plugins • Installing Python dependencies • Deleting files on Amazon S3 Amazon S3 bucket overview An Amazon S3 bucket for an Amazon MWAA environment must have Public Access Blocked. By default, all Amazon S3 resources—buckets, objects, and related sub-resources (for example, lifecycle configuration)—are private. • Only the resource owner, the AWS account that created the bucket, can access the resource. The resource owner (for example, your administrator) can grant access permissions to others by writing an access control policy. • The access policy you set up must have permission to add DAGs, custom plugins in plugins.zip, and Python dependencies in requirements.txt to your Amazon S3 bucket. For an example policy that contains the required permissions, see AmazonMWAAFullConsoleAccess. An Amazon S3 bucket for an Amazon MWAA environment must have Versioning Enabled. When Amazon S3 bucket versioning is enabled, anytime a new version is created, a new copy is created. • Versioning is enabled for the custom plugins in a plugins.zip, and Python dependencies in a requirements.txt on your Amazon S3 bucket. Amazon S3 bucket overview 175 Amazon Managed Workflows for Apache Airflow User Guide • You must specify the version of a plugins.zip, and requirements.txt on the Amazon MWAA console each time these files are updated on your Amazon S3 bucket. 
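If you are preparing the Amazon S3 bucket yourself, the following AWS CLI sketch shows one way to apply both requirements. The bucket name is a placeholder, and your administrator may manage these settings differently, for example through an infrastructure-as-code template.

aws s3api put-public-access-block \
    --bucket your-mwaa-bucket \
    --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

aws s3api put-bucket-versioning \
    --bucket your-mwaa-bucket \
    --versioning-configuration Status=Enabled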
Adding or updating DAGs Directed Acyclic Graphs (DAGs) are defined within a Python file that defines the DAG's structure as code. You can use the AWS CLI, or the Amazon S3 console to upload DAGs to your environment. This topic describes the steps to add or update Apache Airflow DAGs on your Amazon Managed Workflows for Apache Airflow environment using the dags folder in your Amazon S3 bucket. Sections • Prerequisites • How it works • What's changed in v2 • Testing DAGs using the Amazon MWAA CLI utility • Uploading DAG code to Amazon S3 • Specifying the path to your DAGs folder on the Amazon MWAA console (the first time) • Viewing changes on your Apache Airflow UI • What's next? Prerequisites You'll need the following before you can complete the steps on this page. • Permissions — Your AWS account must have been granted access by your administrator to the AmazonMWAAFullConsoleAccess access control policy for your environment. In addition, your Amazon MWAA environment must be permitted by your execution role to access the AWS resources used by your environment. • Access — If you require access to public repositories to install dependencies directly on the web server, your environment must be configured with public network web server access. For more information, see the section called “Apache Airflow access modes”. • Amazon S3 configuration — The Amazon S3 bucket used to store your DAGs, custom plugins in plugins.zip, and Python dependencies in requirements.txt must be configured with Public
Access Blocked and Versioning Enabled.
How it works A Directed Acyclic Graph (DAG) is defined within a single Python file that defines the DAG's structure as code. It consists of the following: • A DAG definition. • Operators that describe how to run the DAG and the tasks to run. • Operator relationships that describe the order in which to run the tasks. To run an Apache Airflow platform on an Amazon MWAA environment, you need to copy your DAG definition to the dags folder in your storage bucket. For example, the DAG folder in your storage bucket may look like this: Example DAG folder dags/ # dag_def.py Amazon MWAA automatically syncs new and changed objects from your Amazon S3 bucket to Amazon MWAA scheduler and worker containers’ /usr/local/airflow/dags folder every 30 seconds, preserving the Amazon S3 source’s file hierarchy, regardless of file type. The time that new DAGs take to appear in your Apache Airflow UI is controlled by scheduler.dag_dir_list_interval. Changes to existing DAGs will be picked up on the next DAG processing loop. Note You do not need to include the airflow.cfg configuration file in your DAG folder. You can override the default Apache Airflow configurations from the Amazon MWAA console. For more information, see Using Apache Airflow configuration options on Amazon MWAA.
What's changed in v2 • New: Operators, Hooks, and Executors. The import statements in your DAGs, and the custom plugins you specify in a plugins.zip on Amazon MWAA have changed between Apache Airflow v1 and Apache Airflow v2. For example, from airflow.contrib.hooks.aws_hook import AwsHook in Apache Airflow v1 has changed to from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook in Apache Airflow v2. To learn more, see Python API Reference in the Apache Airflow reference guide.
Testing DAGs using the Amazon MWAA CLI utility • The command line interface (CLI) utility replicates an Amazon Managed Workflows for Apache Airflow environment locally. • The CLI builds a Docker container image locally that’s similar to an Amazon MWAA production image. This allows you to run a local Apache Airflow environment to develop and test DAGs, custom plugins, and dependencies before deploying to Amazon MWAA. • To run the CLI, see the aws-mwaa-local-runner on GitHub.
Uploading DAG code to Amazon S3 You can use the Amazon S3 console or the AWS Command Line Interface (AWS CLI) to upload DAG code to your Amazon S3 bucket. The following steps assume you are uploading code (.py) to a folder named dags in your Amazon S3 bucket. Using the AWS CLI The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell.
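For reference, the dag_def.py file used in the following upload steps can be as small as a single task. The sketch below is an illustration only and is not prescribed by Amazon MWAA; the dag_id, schedule, and BashOperator task are assumptions, and it follows the Apache Airflow v2 import style described under What's changed in v2.

Example minimal dags/dag_def.py

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A hypothetical, minimal DAG matching the dag_def.py file name used in this topic.
with DAG(
    dag_id="dag_def",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,  # run only when triggered manually
    catchup=False,
) as dag:
    hello = BashOperator(
        task_id="hello",
        bash_command="echo 'Hello from Amazon MWAA'",
    )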
To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. To upload using the AWS CLI 1. Use the following command to list all of your Amazon S3 buckets. aws s3 ls 2. Use the following command to list the files and folders in the Amazon S3 bucket for your environment. Testing DAGs using the Amazon MWAA CLI utility 178 Amazon Managed Workflows for Apache Airflow User Guide aws s3 ls s3://YOUR_S3_BUCKET_NAME 3. The following command uploads a dag_def.py file to a dags folder. aws s3 cp dag_def.py s3://YOUR_S3_BUCKET_NAME/dags/ If a folder named dags does not already exist on your Amazon S3 bucket, this command creates the dags folder and uploads the file named dag_def.py to the new folder. Using the Amazon S3 console The Amazon S3 console is a web-based user interface that allows you to create and manage the resources in your Amazon S3 bucket. The following steps assume you have a DAGs folder named dags. To upload using the Amazon S3 console 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Select the S3 bucket link in the DAG code in S3 pane to open your storage bucket on the Amazon S3 console. 4. Choose the dags folder. 5. Choose Upload. 6. Choose Add file. 7. Select
the local copy of your dag_def.py, choose Upload.
Specifying the path to your DAGs folder on the Amazon MWAA console (the first time) The following steps assume you are specifying the path to a folder on your Amazon S3 bucket named dags. 1. Open the Environments page on the Amazon MWAA console. 2. Choose the environment where you want to run DAGs. 3. Choose Edit. 4. On the DAG code in Amazon S3 pane, choose Browse S3 next to the DAG folder field. 5. Select your dags folder. 6. Choose Choose. 7. Choose Next, Update environment.
Viewing changes on your Apache Airflow UI Logging into Apache Airflow You need Apache Airflow UI access policy: AmazonMWAAWebServerAccess permissions for your AWS account in AWS Identity and Access Management (IAM) to view your Apache Airflow UI. To access your Apache Airflow UI 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Open Airflow UI. What's next? • Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub.
Installing custom plugins Amazon Managed Workflows for Apache Airflow supports Apache Airflow's built-in plugin manager, allowing you to use custom Apache Airflow operators, hooks, sensors, or interfaces. This page describes the steps to install Apache Airflow custom plugins on your Amazon MWAA environment using a plugins.zip file. Contents • Prerequisites • How it works • When to use the plugins • Custom plugins overview • Custom plugins directory and size limits • Examples of custom plugins • Example using a flat directory structure in plugins.zip • Example using a nested directory structure in plugins.zip • Creating a plugins.zip file • Step one: Test custom plugins using the Amazon MWAA CLI utility • Step two: Create the plugins.zip file • Uploading plugins.zip to Amazon S3 • Using the AWS CLI • Using the Amazon S3 console • Installing custom plugins on your environment • Specifying the path to plugins.zip on the Amazon MWAA console (the first time) • Specifying the plugins.zip version on the Amazon MWAA console • Example use cases for plugins.zip • What's next?
Prerequisites You'll need the following before you can complete the steps on this page. • Permissions — Your AWS account must have been granted access by your administrator to the AmazonMWAAFullConsoleAccess access control policy for your environment. In addition, your Amazon MWAA environment must be permitted by your execution role to access the AWS resources used by your environment. • Access — If you require access to public repositories to install dependencies directly on the web server, your environment must be configured with public network web server access. For more information, see the section called “Apache Airflow access modes”.
• Amazon S3 configuration — The Amazon S3 bucket used to store your DAGs, custom plugins in plugins.zip, and Python dependencies in requirements.txt must be configured with Public Access Blocked and Versioning Enabled. Prerequisites 181 Amazon Managed Workflows for Apache Airflow User Guide How it works To run custom plugins on your environment, you must do three things: 1. Create a plugins.zip file locally. 2. Upload the local plugins.zip file to your Amazon S3 bucket. 3. Specify the version of this file in the Plugins file field on the Amazon MWAA console. Note If this is the first time you're uploading a plugins.zip to your Amazon S3 bucket, you also need to specify the path to the file on the Amazon MWAA console. You only need to complete this step once. When to use the plugins Plugins are required only for extending the Apache Airflow user interface, as outlined in the Apache Airflow documentation. Custom operators can be placed directly in the /dags folder alongside your DAG code. If you need to create your own integrations with external systems, place them in the /dags folder or a subfolder within it, but not in the plugins.zip folder. In Apache Airflow 2.x, plugins are primarily used for extending the UI. Similarly, other dependencies should not be placed in plugins.zip. Instead,
they can be stored in a location under the Amazon S3 /dags folder, where they will be synchronized to each Amazon MWAA container before Apache Airflow starts. Note Any file in the /dags folder or in plugins.zip that does not explicitly define an Apache Airflow DAG object must be listed in an .airflowignore file.
Custom plugins overview Apache Airflow's built-in plugin manager can integrate external features to its core by simply dropping files in an $AIRFLOW_HOME/plugins folder. It allows you to use custom Apache Airflow operators, hooks, sensors, or interfaces. The following section provides an example of flat and nested directory structures in a local development environment and the resulting import statements, which determines the directory structure within a plugins.zip.
Custom plugins directory and size limits The Apache Airflow Scheduler and the Workers look for custom plugins during startup on the AWS-managed Fargate container for your environment at /usr/local/airflow/plugins/*. • Directory structure. The directory structure (at /*) is based on the contents of your plugins.zip file. For example, if your plugins.zip contains the operators directory as a top-level directory, then the directory will be extracted to /usr/local/airflow/plugins/operators on your environment. • Size limit. We recommend a plugins.zip file less than 1 GB. The larger the size of a plugins.zip file, the longer the startup time on an environment. Although Amazon MWAA doesn't limit the size of a plugins.zip file explicitly, if dependencies can't be installed within ten minutes, the Fargate service will time out and attempt to roll back the environment to a stable state. Note For environments using Apache Airflow v1.10.12 or Apache Airflow v2.0.2, Amazon MWAA limits outbound traffic on the Apache Airflow web server, and does not allow you to install plugins nor Python dependencies directly on the web server. Starting with Apache Airflow v2.2.2, Amazon MWAA can install plugins and dependencies directly on the web server.
Examples of custom plugins The following section uses sample code in the Apache Airflow reference guide to show how to structure your local development environment. Example using a flat directory structure in plugins.zip Apache Airflow v2 The following example shows a plugins.zip file with a flat directory structure for Apache Airflow v2. Example flat directory with PythonVirtualenvOperator plugins.zip The following example shows the top-level tree of a plugins.zip file for the PythonVirtualenvOperator custom plugin in Creating a custom plugin for Apache Airflow PythonVirtualenvOperator. ### virtual_python_plugin.py Example plugins/virtual_python_plugin.py The following example shows the PythonVirtualenvOperator custom plugin.
""" Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow.plugins_manager import AirflowPlugin import airflow.utils.python_virtualenv from typing import List def _generate_virtualenv_cmd(tmp_dir: str, python_bin: str, system_site_packages: bool) -> List[str]: cmd = ['python3','/usr/local/airflow/.local/lib/python3.7/site-packages/ virtualenv', tmp_dir] Examples of custom plugins 184 Amazon Managed Workflows for Apache Airflow User Guide if system_site_packages: cmd.append('--system-site-packages') if python_bin is not None: cmd.append(f'--python={python_bin}') return cmd airflow.utils.python_virtualenv._generate_virtualenv_cmd=_generate_virtualenv_cmd class VirtualPythonPlugin(AirflowPlugin): name = 'virtual_python_plugin' Apache Airflow v1 The following example shows a plugins.zip file with a flat directory structure for Apache Airflow v1. Example flat directory with PythonVirtualenvOperator plugins.zip The following example shows the top-level tree of a plugins.zip file for the PythonVirtualenvOperator custom plugin in Creating a custom plugin for Apache Airflow PythonVirtualenvOperator. ### virtual_python_plugin.py Example plugins/virtual_python_plugin.py The following example shows the PythonVirtualenvOperator custom plugin. from airflow.plugins_manager import AirflowPlugin from airflow.operators.python_operator
import PythonVirtualenvOperator def _generate_virtualenv_cmd(self, tmp_dir): cmd = ['python3','/usr/local/airflow/.local/lib/python3.7/site-packages/ virtualenv', tmp_dir] if self.system_site_packages: cmd.append('--system-site-packages') if self.python_version is not None: cmd.append('--python=python{}'.format(self.python_version)) return cmd PythonVirtualenvOperator._generate_virtualenv_cmd=_generate_virtualenv_cmd class EnvVarPlugin(AirflowPlugin): name = 'virtual_python_plugin'
Example using a nested directory structure in plugins.zip Apache Airflow v2 The following example shows a plugins.zip file with separate directories for hooks, operators, and a sensors directory for Apache Airflow v2. Example plugins.zip __init__.py my_airflow_plugin.py hooks/ |-- __init__.py |-- my_airflow_hook.py operators/ |-- __init__.py |-- my_airflow_operator.py |-- hello_operator.py sensors/ |-- __init__.py |-- my_airflow_sensor.py The following example shows the import statements in the DAG (DAGs folder) that uses the custom plugins. Example dags/your_dag.py from airflow import DAG from datetime import datetime, timedelta from operators.my_airflow_operator import MyOperator from sensors.my_airflow_sensor import MySensor from operators.hello_operator import HelloOperator default_args = { 'owner': 'airflow', 'depends_on_past': False, 'start_date': datetime(2018, 1, 1), 'email_on_failure': False, 'email_on_retry': False, 'retries': 1, 'retry_delay': timedelta(minutes=5), } with DAG('customdag', max_active_runs=3, schedule_interval='@once', default_args=default_args) as dag: sens = MySensor( task_id='taskA' ) op = MyOperator( task_id='taskB', my_field='some text' ) hello_task = HelloOperator(task_id='sample-task', name='foo_bar') sens >> op >> hello_task Example plugins/my_airflow_plugin.py from airflow.plugins_manager import AirflowPlugin from hooks.my_airflow_hook import * from operators.my_airflow_operator import * class PluginName(AirflowPlugin): name = 'my_airflow_plugin' hooks = [MyHook] operators = [MyOperator] sensors = [MySensor] The following examples show each of the import statements needed in the custom plugin files.
Example hooks/my_airflow_hook.py from airflow.hooks.base import BaseHook Examples of custom plugins 187 Amazon Managed Workflows for Apache Airflow User Guide class MyHook(BaseHook): def my_method(self): print("Hello World") Example sensors/my_airflow_sensor.py from airflow.sensors.base import BaseSensorOperator from airflow.utils.decorators import apply_defaults class MySensor(BaseSensorOperator): @apply_defaults def __init__(self, *args, **kwargs): super(MySensor, self).__init__(*args, **kwargs) def poke(self, context): return True Example operators/my_airflow_operator.py from airflow.operators.bash import BaseOperator from airflow.utils.decorators import apply_defaults from hooks.my_airflow_hook import MyHook class MyOperator(BaseOperator): @apply_defaults def __init__(self, my_field, *args, **kwargs): super(MyOperator, self).__init__(*args, **kwargs) self.my_field = my_field def execute(self, context): hook = MyHook('my_conn') Examples of custom plugins 188 Amazon Managed Workflows for Apache Airflow User Guide hook.my_method() Example operators/hello_operator.py from airflow.models.baseoperator import BaseOperator from airflow.utils.decorators import apply_defaults class HelloOperator(BaseOperator): @apply_defaults def __init__( self, name: str, **kwargs) -> None: super().__init__(**kwargs) self.name = name def execute(self, context): message = "Hello {}".format(self.name) print(message) return message Follow the steps in Testing custom plugins using the Amazon MWAA CLI utility, and then Creating a plugins.zip file to zip the contents within your plugins directory. For example, cd plugins. Apache Airflow v1 The following example shows a plugins.zip file with separate directories for hooks, operators, and a sensors directory for Apache Airflow v1.10.12. Example plugins.zip __init__.py my_airflow_plugin.py hooks/ |-- __init__.py |-- my_airflow_hook.py operators/ |-- __init__.py |-- my_airflow_operator.py |-- hello_operator.py Examples of custom plugins 189 Amazon Managed Workflows for Apache Airflow User Guide sensors/ |-- __init__.py |-- my_airflow_sensor.py The following example shows the import statements in the DAG (DAGs folder) that uses the custom plugins. Example dags/your_dag.py from airflow import DAG from datetime import datetime, timedelta from operators.my_operator import MyOperator from sensors.my_sensor import MySensor from operators.hello_operator import HelloOperator default_args = { 'owner': 'airflow', 'depends_on_past': False, 'start_date': datetime(2018, 1, 1), 'email_on_failure': False, 'email_on_retry': False, 'retries': 1, 'retry_delay': timedelta(minutes=5), } with DAG('customdag', max_active_runs=3, schedule_interval='@once', default_args=default_args) as dag: sens = MySensor( task_id='taskA' ) op = MyOperator( task_id='taskB', my_field='some text' ) hello_task = HelloOperator(task_id='sample-task', name='foo_bar') Examples of custom plugins 190 Amazon Managed Workflows for Apache Airflow User Guide sens >> op >> hello_task Example plugins/my_airflow_plugin.py from airflow.plugins_manager import AirflowPlugin from hooks.my_airflow_hook import * from operators.my_airflow_operator import * from utils.my_utils import * class PluginName(AirflowPlugin): name = 'my_airflow_plugin' hooks = [MyHook] operators = [MyOperator] sensors = [MySensor] The following examples show each of the import statements needed in the custom plugin files. 
Example hooks/my_airflow_hook.py from airflow.hooks.base_hook import BaseHook class MyHook(BaseHook): def my_method(self): print("Hello World") Example sensors/my_airflow_sensor.py from airflow.sensors.base_sensor_operator import BaseSensorOperator from airflow.utils.decorators import apply_defaults class MySensor(BaseSensorOperator): @apply_defaults def __init__(self, *args, **kwargs): Examples of custom plugins 191 Amazon Managed Workflows for Apache Airflow User Guide super(MySensor, self).__init__(*args, **kwargs) def poke(self, context): return True Example operators/my_airflow_operator.py from airflow.operators.bash_operator import BaseOperator from airflow.utils.decorators import apply_defaults from hooks.my_hook import MyHook class MyOperator(BaseOperator): @apply_defaults def __init__(self, my_field, *args, **kwargs): super(MyOperator, self).__init__(*args, **kwargs) self.my_field = my_field def execute(self, context): hook = MyHook('my_conn') hook.my_method() Example operators/hello_operator.py from airflow.models.baseoperator import BaseOperator from airflow.utils.decorators
import apply_defaults class HelloOperator(BaseOperator): @apply_defaults def __init__( self, name: str, **kwargs) -> None: super().__init__(**kwargs) self.name = name def execute(self, context): message = "Hello {}".format(self.name) print(message) return message Follow the steps in Testing custom plugins using the Amazon MWAA CLI utility, and then Creating a plugins.zip file to zip the contents within your plugins directory. For example, cd plugins.
Creating a plugins.zip file The following steps describe how we recommend creating a plugins.zip file locally. Step one: Test custom plugins using the Amazon MWAA CLI utility • The command line interface (CLI) utility replicates an Amazon Managed Workflows for Apache Airflow environment locally. • The CLI builds a Docker container image locally that’s similar to an Amazon MWAA production image. This allows you to run a local Apache Airflow environment to develop and test DAGs, custom plugins, and dependencies before deploying to Amazon MWAA. • To run the CLI, see the aws-mwaa-local-runner on GitHub. Step two: Create the plugins.zip file You can use a built-in ZIP archive utility, or any other ZIP utility (such as 7zip) to create a .zip file. Note The built-in zip utility for Windows OS may add subfolders when you create a .zip file. We recommend verifying the contents of the plugins.zip file before uploading to your Amazon S3 bucket to ensure no additional directories were added. 1. Change directories to your local Airflow plugins directory. For example: myproject$ cd plugins 2. Run the following command to ensure that the contents have executable permissions (macOS and Linux only). plugins$ chmod -R 755 . 3. Zip the contents within your plugins folder. plugins$ zip -r plugins.zip .
Uploading plugins.zip to Amazon S3 You can use the Amazon S3 console or the AWS Command Line Interface (AWS CLI) to upload a plugins.zip file to your Amazon S3 bucket. Using the AWS CLI The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2.
• AWS CLI – Quick configuration with aws configure. To upload using the AWS CLI 1. In your command prompt, navigate to the directory where your plugins.zip file is stored. For example: cd plugins 2. Use the following command to list all of your Amazon S3 buckets. aws s3 ls 3. Use the following command to list the files and folders in the Amazon S3 bucket for your environment. aws s3 ls s3://YOUR_S3_BUCKET_NAME 4. Use the following command to upload the plugins.zip file to the Amazon S3 bucket for your environment. Uploading plugins.zip to Amazon S3 194 Amazon Managed Workflows for Apache Airflow User Guide aws s3 cp plugins.zip s3://YOUR_S3_BUCKET_NAME/plugins.zip Using the Amazon S3 console The Amazon S3 console is a web-based user interface that allows you to create and manage the resources in your Amazon S3 bucket. To upload using the Amazon S3 console 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Select the S3 bucket link in the DAG code in S3 pane to open your storage bucket on the Amazon S3 console. 4. Choose Upload. 5. Choose Add file. 6. Select the local copy of your plugins.zip, choose Upload. Installing custom plugins on your environment This section describes how to install the custom plugins you uploaded to your Amazon S3 bucket by specifying the path to the plugins.zip file, and specifying the version of the plugins.zip file each time the zip file is updated. Specifying the path to plugins.zip on the Amazon MWAA console (the first time) If this is the first time you're uploading a plugins.zip to your Amazon S3 bucket, you also need to specify the path to the file on the Amazon MWAA console. You only need to complete this step once. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit. 4. On the DAG code in Amazon S3 pane, choose Browse S3 next
to the Plugins file - optional field. 5. Select the plugins.zip file on your Amazon S3 bucket. 6. Choose Choose. 7. Choose Next, Update environment. Specifying the plugins.zip version on the Amazon MWAA console You need to specify the version of your plugins.zip file on the Amazon MWAA console each time you upload a new version of your plugins.zip in your Amazon S3 bucket. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit. 4. On the DAG code in Amazon S3 pane, choose a plugins.zip version in the dropdown list. 5. Choose Next.
Example use cases for plugins.zip • Learn how to create a custom plugin in Custom plugin with Apache Hive and Hadoop. • Learn how to create a custom plugin in Custom plugin to patch PythonVirtualenvOperator . • Learn how to create a custom plugin in Custom plugin with Oracle. • Learn how to create a custom plugin in the section called “Changing a DAG's timezone”. What's next? • Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub.
Installing Python dependencies A Python dependency is any package or distribution that is not included in the Apache Airflow base install for your Apache Airflow version on your Amazon Managed Workflows for Apache Airflow environment. This topic describes the steps to install Apache Airflow Python dependencies on your Amazon MWAA environment using a requirements.txt file in your Amazon S3 bucket. Contents • Prerequisites • How it works • Python dependencies overview • Python dependencies location and size limits • Creating a requirements.txt file • Step one: Test Python dependencies using the Amazon MWAA CLI utility • Step two: Create the requirements.txt • Uploading requirements.txt to Amazon S3 • Using the AWS CLI • Using the Amazon S3 console • Installing Python dependencies on your environment • Specifying the path to requirements.txt on the Amazon MWAA console (the first time) • Specifying the requirements.txt version on the Amazon MWAA console • Viewing logs for your requirements.txt • What's next?
Prerequisites You'll need the following before you can complete the steps on this page. • Permissions — Your AWS account must have been granted access by your administrator to the AmazonMWAAFullConsoleAccess access control policy for your environment. In addition, your Amazon MWAA environment must be permitted by your execution role to access the AWS resources used by your environment. • Access — If you require access to public repositories to install dependencies directly on the web server, your environment must be configured with public network web server access. For more information, see the section called “Apache Airflow access modes”.
• Amazon S3 configuration — The Amazon S3 bucket used to store your DAGs, custom plugins in plugins.zip, and Python dependencies in requirements.txt must be configured with Public Access Blocked and Versioning Enabled. Prerequisites 197 Amazon Managed Workflows for Apache Airflow User Guide How it works On Amazon MWAA, you install all Python dependencies by uploading a requirements.txt file to your Amazon S3 bucket, then specifying the version of the file on the Amazon MWAA console each time you update the file. Amazon MWAA runs pip3 install -r requirements.txt to install the Python dependencies on the Apache Airflow scheduler and each of the workers. To run Python dependencies on your environment, you must do three things: 1. Create a requirements.txt file locally. 2. Upload the local requirements.txt to your Amazon S3 bucket. 3. Specify the version of this file in the Requirements file field on the Amazon MWAA console. Note If this is the first time you're creating and uploading a requirements.txt to your Amazon S3 bucket, you also need to specify the path to the file on the Amazon MWAA console. You only need to complete this step once. Python dependencies overview You can install Apache Airflow extras and other Python dependencies from the Python Package Index (PyPi.org), Python wheels (.whl), or Python dependencies hosted on a private PyPi/PEP-503 Compliant Repo on your environment. Python dependencies location and size
limits The Apache Airflow Scheduler and the Workers look for the packages in the requirements.txt file and the packages are installed on the environment at /usr/local/airflow/.local/bin. • Size limit. We recommend a requirements.txt file that references libraries whose combined size is less than 1 GB. The more libraries Amazon MWAA needs to install, the longer the startup time on an environment. Although Amazon MWAA doesn't limit the size of installed libraries explicitly, if dependencies can't be installed within ten minutes, the Fargate service will time out and attempt to roll back the environment to a stable state.
Creating a requirements.txt file The following steps describe how we recommend creating a requirements.txt file locally. Step one: Test Python dependencies using the Amazon MWAA CLI utility • The command line interface (CLI) utility replicates an Amazon Managed Workflows for Apache Airflow environment locally. • The CLI builds a Docker container image locally that’s similar to an Amazon MWAA production image. This allows you to run a local Apache Airflow environment to develop and test DAGs, custom plugins, and dependencies before deploying to Amazon MWAA. • To run the CLI, see the aws-mwaa-local-runner on GitHub. Step two: Create the requirements.txt The following section describes how to specify Python dependencies from the Python Package Index in a requirements.txt file.
Apache Airflow v2 1. Test locally. Add additional libraries iteratively to find the right combination of packages and their versions, before creating a requirements.txt file. To run the Amazon MWAA CLI utility, see the aws-mwaa-local-runner on GitHub. 2. Review the Apache Airflow package extras. To view a list of the packages installed for Apache Airflow v2 on Amazon MWAA, see Amazon MWAA local runner requirements.txt on the GitHub website. 3. Add a constraints statement. Add the constraints file for your Apache Airflow v2 environment at the top of your requirements.txt file. Apache Airflow constraints files specify the provider versions available at the time of an Apache Airflow release. Beginning with Apache Airflow v2.7.2, your requirements file must include a --constraint statement. If you do not provide a constraint, Amazon MWAA will specify one for you to ensure the packages listed in your requirements are compatible with the version of Apache Airflow you are using. In the following example, replace {environment-version} with your environment's version number, and {Python-version} with the version of Python that's compatible with your environment. For information on the version of Python compatible with your Apache Airflow environment, see Apache Airflow Versions.
--constraint "https://raw.githubusercontent.com/apache/airflow/ constraints-{Airflow-version}/constraints-{Python-version}.txt" If the constraints file determines that xyz==1.0 package is not compatible with other packages in your environment, pip3 install will fail in order to prevent incompatible libraries from being installed to your environment. If installation fails for any packages, you can view error logs for each Apache Airflow component (the scheduler, worker, and web server) in the corresponding log stream on CloudWatch Logs. For more information on log types, see the section called “Viewing Airflow logs”. 4. Apache Airflow packages. Add the package extras and the version (==). This helps to prevent packages of the same name, but different version, from being installed on your environment. apache-airflow[package-extra]==2.5.1 5. Python libraries. Add the package name and the version (==) in your requirements.txt file. This helps to prevent a future breaking update from PyPi.org from being automatically applied. library == version Example Boto3 and psycopg2-binary This example is provided for demonstration purposes. The boto and psycopg2-binary libraries are included with the Apache Airflow v2 base install and don't need to be specified in a requirements.txt file. boto3==1.17.54 boto==2.49.0 botocore==1.20.54 psycopg2-binary==2.8.6 Creating a requirements.txt file 200 Amazon Managed Workflows for Apache Airflow User Guide If a package is specified without a version, Amazon MWAA installs the latest version of the package from PyPi.org. This version may conflict with other packages in your requirements.txt. Apache Airflow v1 1. Test locally. Add additional libraries iteratively to find the right combination of packages and their versions, before creating a requirements.txt file. To run the Amazon MWAA CLI utility, see the aws-mwaa-local-runner on GitHub. 2. Review the Airflow
package extras. Review the list of packages available for Apache Airflow v1.10.12 at https://raw.githubusercontent.com/apache/airflow/ constraints-1.10.12/constraints-3.7.txt. 3. Add the constraints file. Add the constraints file for Apache Airflow v1.10.12 to the top of your requirements.txt file. If the constraints file determines that xyz==1.0 package is not compatible with other packages on your environment, the pip3 install will fail to prevent incompatible libraries from being installed to your environment. --constraint "https://raw.githubusercontent.com/apache/airflow/ constraints-1.10.12/constraints-3.7.txt" 4. Apache Airflow v1.10.12 packages. Add the Airflow package extras and the Apache Airflow v1.10.12 version (==). This helps to prevent packages of the same name, but different version, from being installed on your environment. apache-airflow[package]==1.10.12 Example Secure Shell (SSH) The following example requirements.txt file installs SSH for Apache Airflow v1.10.12. apache-airflow[ssh]==1.10.12 5. Python libraries. Add the package name and the version (==) in your requirements.txt file. This helps to prevent a future breaking update from PyPi.org from being automatically applied. library == version Example Boto3 The following example requirements.txt file installs the Boto3 library for Apache Airflow v1.10.12. boto3 == 1.17.4 If a package is specified without a version, Amazon MWAA installs the latest version of the package from PyPi.org. This version may conflict with other packages in your requirements.txt.
Uploading requirements.txt to Amazon S3 You can use the Amazon S3 console or the AWS Command Line Interface (AWS CLI) to upload a requirements.txt file to your Amazon S3 bucket. Using the AWS CLI The AWS Command Line Interface (AWS CLI) is an open source tool that enables you to interact with AWS services using commands in your command-line shell. To complete the steps on this page, you need the following: • AWS CLI – Install version 2. • AWS CLI – Quick configuration with aws configure. To upload using the AWS CLI 1. Use the following command to list all of your Amazon S3 buckets. aws s3 ls 2. Use the following command to list the files and folders in the Amazon S3 bucket for your environment. aws s3 ls s3://YOUR_S3_BUCKET_NAME 3. The following command uploads a requirements.txt file to an Amazon S3 bucket. aws s3 cp requirements.txt s3://YOUR_S3_BUCKET_NAME/requirements.txt Using the Amazon S3 console The Amazon S3 console is a web-based user interface that allows you to create and manage the resources in your Amazon S3 bucket. To upload using the Amazon S3 console 1.
Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Select the S3 bucket link in the DAG code in S3 pane to open your storage bucket on the Amazon S3 console. 4. Choose Upload. 5. Choose Add file. 6. Select the local copy of your requirements.txt, choose Upload. Installing Python dependencies on your environment This section describes how to install the dependencies you uploaded to your Amazon S3 bucket by specifying the path to the requirements.txt file, and specifying the version of the requirements.txt file each time it's updated. Specifying the path to requirements.txt on the Amazon MWAA console (the first time) If this is the first time you're creating and uploading a requirements.txt to your Amazon S3 bucket, you also need to specify the path to the file on the Amazon MWAA console. You only need to complete this step once. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit. 4. On the DAG code in Amazon S3 pane, choose Browse S3 next to the Requirements file - optional field. Installing Python dependencies on your environment 203 Amazon Managed Workflows for Apache Airflow User Guide 5. Select the requirements.txt file on your Amazon S3 bucket. 6. Choose Choose. 7. Choose Next, Update environment. You can begin using the new packages immediately after your environment finishes updating. Specifying the requirements.txt version on the Amazon MWAA console You need to specify the version of your requirements.txt file on the Amazon MWAA console each time you upload a new version of your requirements.txt in your Amazon S3 bucket. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit.
4. On the DAG code in Amazon S3 pane, choose a requirements.txt version in the dropdown list. 5. Choose Next, Update environment. You can begin using the new packages immediately after your environment finishes updating.
Viewing logs for your requirements.txt You can view Apache Airflow logs for the Scheduler scheduling your workflows and parsing your dags folder. The following steps describe how to open the log group for the Scheduler on the Amazon MWAA console, and view Apache Airflow logs on the CloudWatch Logs console. To view logs for a requirements.txt 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose the Airflow scheduler log group on the Monitoring pane. 4. Choose the requirements_install_ip log in Log streams. 5. You should see the list of packages that were installed on the environment at /usr/local/ airflow/.local/bin. For example: Collecting appdirs==1.4.4 (from -r /usr/local/airflow/.local/bin (line 1)) Downloading https://files.pythonhosted.org/ packages/3b/00/2344469e2084fb28kjdsfiuyweb47389789vxbmnbjhsdgf5463acd6cf5e3db69324/ appdirs-1.4.4-py2.py3-none-any.whl Collecting astroid==2.4.2 (from -r /usr/local/airflow/.local/bin (line 2)) 6. Review the list of packages and whether any of these encountered an error during installation. If something went wrong, you may see an error similar to the following: 2021-03-05T14:34:42.731-07:00 No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) What's next? • Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub.
Deleting files on Amazon S3 This page describes how versioning works in an Amazon S3 bucket for an Amazon Managed Workflows for Apache Airflow environment, and the steps to delete a DAG, plugins.zip, or requirements.txt file. Contents • Prerequisites • Versioning overview • How it works • Deleting a DAG on Amazon S3 • Removing a "current" requirements.txt or plugins.zip from an environment • Deleting a "non-current" (previous) requirements.txt or plugins.zip version • Using lifecycles to delete "non-current" (previous) versions and delete markers automatically • Example lifecycle policy to delete requirements.txt "non-current" versions and delete markers automatically • What's next?
Prerequisites You'll need the following before you can complete the steps on this page. • Permissions — Your AWS account must have been granted access by your administrator to the AmazonMWAAFullConsoleAccess access control policy for your environment.
In addition, your Amazon MWAA environment must be permitted by your execution role to access the AWS resources used by your environment. • Access — If you require access to public repositories to install dependencies directly on the web server, your environment must be configured with public network web server access. For more information, see the section called “Apache Airflow access modes”. • Amazon S3 configuration — The Amazon S3 bucket used to store your DAGs, custom plugins in plugins.zip, and Python dependencies in requirements.txt must be configured with Public Access Blocked and Versioning Enabled. Versioning overview The requirements.txt and plugins.zip in your Amazon S3 bucket are versioned. When Amazon S3 bucket versioning is enabled for an object, and an artifact (for example, plugins.zip) is deleted from an Amazon S3 bucket, the file doesn't get deleted entirely. Anytime an artifact is deleted on Amazon S3, a new copy of the file is created that is a 404 (Object not found) error/0k file that says "I'm not here." Amazon S3 calls this a delete marker. A delete marker is a "null" version of the file with a key name (or key) and version ID like any other object. We recommend deleting file versions and delete markers periodically to reduce storage costs for your Amazon S3 bucket. To delete "non-current" (previous) file versions entirely, you must delete the versions of the file(s), and then the delete marker for the version. How it works Amazon MWAA runs a sync operation on your Amazon S3 bucket every thirty seconds. This causes any DAG deletions in an Amazon S3 bucket to be synced to the Airflow image of your Fargate container. For plugins.zip and requirements.txt files, changes occur only after an environment update when Amazon MWAA builds a new Airflow image of your Fargate container with the custom plugins and
Python dependencies. If you delete the current version of any of a requirements.txt or plugins.zip file, and then update your environment without providing a new version for the deleted file, then the update will fail with an error message, such as, "Unable to read version {version} of file {file}".
Deleting a DAG on Amazon S3 A DAG file (.py) is not versioned and can be deleted directly on the Amazon S3 console. The following steps describe how to delete a DAG on your Amazon S3 bucket. To delete a DAG 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Select the S3 bucket link in the DAG code in S3 pane to open your storage bucket on the Amazon S3 console. 4. Choose the dags folder. 5. Select the DAG, Delete. 6. Under Delete objects?, type delete. 7. Choose Delete objects. Note Apache Airflow preserves historical DAG runs. After a DAG has been run in Apache Airflow, it remains in the Airflow DAGs list regardless of the file status, until you delete it in Apache Airflow. To delete a DAG in Apache Airflow, choose the red "delete" button under the Links column.
Removing a "current" requirements.txt or plugins.zip from an environment Currently, there isn't a way to remove a plugins.zip or requirements.txt from an environment after they’ve been added, but we're working on the issue. In the interim, a workaround is to point to an empty text or zip file, respectively.
Deleting a "non-current" (previous) requirements.txt or plugins.zip version The requirements.txt and plugins.zip files in your Amazon S3 bucket are versioned on Amazon MWAA. If you want to delete these files on your Amazon S3 bucket entirely, you must retrieve the current version (121212) of the object (for example, plugins.zip), delete the version, and then remove the delete marker for the file version(s). You can also delete "non-current" (previous) file versions on the Amazon S3 console; however, you'll still need to delete the delete marker using one of the following options. • To retrieve the object version, see Retrieving object versions from a versioning-enabled bucket in the Amazon S3 guide. • To delete the object version, see Deleting object versions from a versioning-enabled bucket in the Amazon S3 guide. • To remove a delete marker, see Managing delete markers in the Amazon S3 guide.
Using lifecycles to delete "non-current" (previous) versions and delete markers automatically You can configure a lifecycle policy for your Amazon S3 bucket to delete "non-current" (previous) versions of the plugins.zip and requirements.txt files in your Amazon S3 bucket after a certain number of days, or to remove an expired object's delete marker. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3.
Under DAG code in Amazon S3, choose your Amazon S3 bucket. 4. Choose Create lifecycle rule. Example lifecycle policy to delete requirements.txt "non-current" versions and delete markers automatically The following example shows how to create a lifecycle rule that permanently deletes "non-current" versions of a requirements.txt file and their delete markers after thirty days. 1. Open the Environments page on the Amazon MWAA console. Delete "non-current" plugins.zip or requirements.txt 208 Amazon Managed Workflows for Apache Airflow 2. Choose an environment. 3. Under DAG code in Amazon S3, choose your Amazon S3 bucket. 4. Choose Create lifecycle rule. User Guide 5. 6. 7. 8. 9. In Lifecycle rule name, type Delete previous requirements.txt versions and delete markers after thirty days. In Prefix, requirements. In Lifecycle rule actions, choose Permanently delete previous versions of objects and Delete expired delete markers or incomplete multipart uploads. In Number of days after objects become previous versions, type 30. In Expired object delete markers, choose Delete expired object delete markers, objects are permanently deleted after 30 days. What's next? • Learn more about Amazon S3 delete markers in Managing delete markers. • Learn more about Amazon S3 lifecycles in Expiring objects. What's next? 209 Amazon Managed Workflows for Apache Airflow User Guide Networking This guide describes the Amazon VPC network setup you'll need for
an Amazon MWAA environment. Sections • About networking on Amazon MWAA • Security in your VPC on Amazon MWAA • Managing access to service-specific Amazon VPC endpoints on Amazon MWAA • Creating the required VPC service endpoints in an Amazon VPC with private routing • Managing your own Amazon VPC endpoints on Amazon MWAA
About networking on Amazon MWAA An Amazon VPC is a virtual network that is linked to your AWS account. It gives you cloud security and the ability to scale dynamically by providing fine-grained control over your virtual infrastructure and network traffic segmentation. This page describes the Amazon VPC infrastructure with public routing or private routing that's needed to support an Amazon Managed Workflows for Apache Airflow environment. Contents • Terms • What's supported • VPC infrastructure overview • Public routing over the Internet • Private routing without Internet access • Example use cases for an Amazon VPC and Apache Airflow access mode • Internet access is allowed - new Amazon VPC network • Internet access is not allowed - new Amazon VPC network • Internet access is not allowed - existing Amazon VPC network
Terms Public routing An Amazon VPC network that has access to the Internet. Private routing An Amazon VPC network without access to the Internet.
What's supported The following describes the types of Amazon VPCs Amazon MWAA supports. • An Amazon VPC owned by the account that is attempting to create the environment: supported. • A shared Amazon VPC where multiple AWS accounts create their AWS resources: supported.
VPC infrastructure overview When you create an Amazon MWAA environment, Amazon MWAA creates between one and two VPC endpoints for your environment based on the Apache Airflow access mode you chose for your environment. These endpoints appear as Elastic Network Interfaces (ENIs) with private IPs in your Amazon VPC. After these endpoints are created, any traffic destined to these IPs is privately or publicly routed to the corresponding AWS services used by your environment. The following section describes the Amazon VPC infrastructure required to route traffic publicly over the Internet, or privately within your Amazon VPC.
Public routing over the Internet This section describes the Amazon VPC infrastructure of an environment with public routing. You'll need the following VPC infrastructure: • One VPC security group. A VPC security group acts as a virtual firewall to control ingress (inbound) and egress (outbound) network traffic on an instance. • Up to 5 security groups can be specified. • The security group must specify a self-referencing inbound rule to itself. • The security group must specify an outbound rule for all traffic (0.0.0.0/0).
• The security group must allow all traffic in the self-referencing rule. For example, (Recommended) Example all access self-referencing security group . • The security group can optionally restrict traffic further by specifying the port range for HTTPS port range 443 and a TCP port range 5432. For example, (Optional) Example security group that restricts inbound access to port 5432 and (Optional) Example security group that restricts inbound access to port 443. • Two public subnets. A public subnet is a subnet that's associated with a route table that has a route to an Internet gateway. • Two public subnets are required. This allows Amazon MWAA to build a new container image for your environment in your other availability zone, if one container fails. • The subnets must be in different Availability Zones. For example, us-east-1a, us-east-1b. • The subnets must route to a NAT gateway (or NAT instance) with an Elastic IP Address (EIP). • The subnets must have a route table that directs internet-bound traffic to an Internet gateway. • Two private subnets. A private subnet is a subnet that's not associated with a route table that has a route to an Internet gateway. • Two private subnets are required. This allows Amazon MWAA to build a new container image for your environment in your other availability zone, if one container fails. • The subnets must be in different Availability Zones. For example, us-east-1a, us-east-1b. • The subnets
must route to a NAT gateway (or NAT instance) with an Elastic IP Address (EIP). • The subnets must have a route table that directs internet-bound traffic to an Internet gateway. • Two private subnets. A private subnet is a subnet that's not associated with a route table that has a route to an Internet gateway. • Two private subnets are required. This allows Amazon MWAA to build a new container image for your environment in your other availability zone, if one container fails. • The subnets must be in different Availability Zones. For example, us-east-1a, us-east-1b. • The subnets must have a route table to a NAT device (gateway or instance). • The subnets must not route to an Internet gateway. • A network access control list (ACL). An NACL manages (by allow or deny rules) inbound and outbound traffic at the subnet level. • The NACL must have an inbound rule that allows all traffic (0.0.0.0/0). • The NACL must have an outbound rule that allows all traffic (0.0.0.0/0). • For example, (Recommended) Example ACLs. VPC infrastructure overview 212 Amazon Managed Workflows for Apache Airflow User Guide • Two NAT gateways (or NAT instances). A NAT device forwards traffic from the instances in the private subnet to the Internet or other AWS services, and then routes the response back to the instances. • The NAT device must be attached to a public subnet. (One NAT device per public subnet.) • The NAT device must have an Elastic IPv4 Address (EIP) attached to each public subnet. • An Internet gateway. An Internet gateway connects an Amazon VPC to the Internet and other AWS services. • An Internet gateway must be attached to the Amazon VPC. Private routing without Internet access This section describes the Amazon VPC infrastructure of an environment with private routing. You'll need the following VPC infrastructure: • One VPC security group. A VPC security group acts as a virtual firewall to control ingress (inbound) and egress (outbound) network traffic on an instance. • Up to 5 security groups can be specified. • The security group must specify a self-referencing inbound rule to itself. • The security group must specify an outbound rule for all traffic (0.0.0.0/0). • The security group must allow all traffic in the self-referencing rule. For example, (Recommended) Example all access self-referencing security group . • The security group can optionally restrict traffic further by specifying the port range for HTTPS port range 443 and a TCP port range 5432. For example, (Optional) Example security group that restricts inbound access to port 5432 and (Optional) Example security group that restricts inbound access to port 443. • Two private subnets. A private subnet is a subnet that's not associated with a route table that has a route to an Internet gateway. • Two private subnets are required. This allows Amazon MWAA to build a new container image for your environment in your other availability zone, if one container fails. • The subnets must be in different Availability Zones. For example, us-east-1a, us-east-1b. • The subnets must have a route table to your VPC endpoints. • The subnets must not have a route table to a NAT device (gateway or instance), nor an Internet gateway. VPC infrastructure overview 213 Amazon Managed Workflows for Apache Airflow User Guide • A network access control list (ACL). An NACL manages (by allow or deny rules) inbound and outbound traffic at the subnet level. • The NACL must have an inbound rule that allows all traffic (0.0.0.0/0). 
• The NACL must have an outbound rule that allows all traffic (0.0.0.0/0). • For example, (Recommended) Example ACLs. • A local route table. A local route table is a default route for communication within the VPC. • The local route table must be associated to your private subnets. • The local route table must enable instances in your VPC to communicate with your own network. For example, if you're using an AWS Client VPN to access the VPC interface endpoint for your Apache Airflow Web server, the route table must route to the VPC endpoint. • VPC endpoints for each AWS service used by your environment, and Apache Airflow VPC endpoints
in the same AWS Region and Amazon VPC as your Amazon MWAA environment. • A VPC endpoint for each AWS service used by the environment and VPC endpoints for Apache Airflow. For example, (Required) VPC endpoints. • The VPC endpoints must have private DNS enabled. • The VPC endpoints must be associated to your environment's two private subnets. • The VPC endpoints must be associated to your environment's security group. • The VPC endpoint policy for each endpoint should be configured to allow access to AWS services used by the environment. For example, (Recommended) Example VPC endpoint policy to allow all access. • A VPC endpoint policy for Amazon S3 should be configured to allow bucket access. For example, (Recommended) Example Amazon S3 gateway endpoint policy to allow bucket access. Example use cases for an Amazon VPC and Apache Airflow access mode This section descibes the different use cases for network access in your Amazon VPC and the Apache Airflow Web server access mode you should choose on the Amazon MWAA console. Internet access is allowed - new Amazon VPC network If Internet access in your VPC is allowed by your organization, and you would like users to access your Apache Airflow Web server over the Internet: 1. Create an Amazon VPC network with Internet access. Example use cases for an Amazon VPC and Apache Airflow access mode 214 Amazon Managed Workflows for Apache Airflow User Guide 2. Create an environment with the Public network access mode for your Apache Airflow Web server. 3. What we recommend: We recommend using the AWS CloudFormation quick-start template that creates the Amazon VPC infrastructure, an Amazon S3 bucket, and an Amazon MWAA environment at the same time. To learn more, see Quick start tutorial for Amazon Managed Workflows for Apache Airflow. If Internet access in your VPC is allowed by your organization, and you would like to limit Apache Airflow Web server access to users within your VPC: 1. Create an Amazon VPC network with Internet access. 2. Create a mechanism to access the VPC interface endpoint for your Apache Airflow Web server from your computer. 3. Create an environment with the Private network access mode for your Apache Airflow Web server. 4. What we recommend: a. We recommend using the Amazon MWAA console in Option one: Creating the VPC network on the Amazon MWAA console, or the AWS CloudFormation template in Option two: Creating an Amazon VPC network with Internet access. b. We recommend configuring access using an AWS Client VPN to your Apache Airflow Web server in Tutorial: Configuring private network access using an AWS Client VPN. Internet access is not allowed - new Amazon VPC network If Internet access in your VPC is not allowed by your organization: 1. Create an Amazon VPC network without Internet access. 2. Create a mechanism to access the VPC interface endpoint for your Apache Airflow Web server from your computer. 3. Create VPC endpoints for each AWS service used by your environment. 4. Create an environment with the Private network access mode for your Apache Airflow Web server. 5. What we recommend: Example use cases for an Amazon VPC and Apache Airflow access mode 215 Amazon Managed Workflows for Apache Airflow User Guide a. We recommend using the AWS CloudFormation template to create an Amazon VPC without Internet access and the VPC endpoints for each AWS service used by Amazon MWAA in Option three: Creating an Amazon VPC network without Internet access. b. 
We recommend configuring access using an AWS Client VPN to your Apache Airflow Web server in Tutorial: Configuring private network access using an AWS Client VPN. Internet access is not allowed - existing Amazon VPC network If Internet access in your VPC is not allowed by your organization, and you already have the required Amazon VPC network without Internet access: 1. Create VPC endpoints for each AWS service used by your environment. 2. Create VPC endpoints for Apache Airflow. 3. Create a mechanism to access the VPC interface endpoint for your Apache Airflow Web server from your computer. 4. Create an environment with the Private network access mode for your Apache Airflow Web server. 5. What we recommend: a. We recommend
creating and attaching the VPC endpoints needed for each AWS service used by Amazon MWAA, and the VPC endpoints needed for Apache Airflow in Creating the required VPC service endpoints in an Amazon VPC with private routing. b. We recommend configuring access using an AWS Client VPN to your Apache Airflow Web server in Tutorial: Configuring private network access using an AWS Client VPN. Security in your VPC on Amazon MWAA This page describes the Amazon VPC components used to secure your Amazon Managed Workflows for Apache Airflow environment and the configurations needed for these components. Contents • Terms • Security overview • Network access control lists (ACLs) • (Recommended) Example ACLs • VPC security groups • (Recommended) Example all access self-referencing security group • (Optional) Example security group that restricts inbound access to port 5432 • (Optional) Example security group that restricts inbound access to port 443 • VPC endpoint policies (private routing only) • (Recommended) Example VPC endpoint policy to allow all access • (Recommended) Example Amazon S3 gateway endpoint policy to allow bucket access Terms Public routing An Amazon VPC network that has access to the Internet. Private routing An Amazon VPC network without access to the Internet. Security overview Security groups and access control lists (ACLs) provide ways to control the network traffic across the subnets and instances in your Amazon VPC using rules you specify. • Network traffic to and from a subnet can be controlled by Access Control Lists (ACLs). You only need one ACL, and the same ACL can be used on multiple environments. • Network traffic to and from an instance can be controlled by an Amazon VPC security group. You can use one to five security groups per environment. • Network traffic to and from an instance can also be controlled by VPC endpoint policies. If Internet access within your Amazon VPC is not allowed by your organization and you're using an Amazon VPC network with private routing, a VPC endpoint policy is required for the AWS VPC endpoints and Apache Airflow VPC endpoints. Network access control lists (ACLs) A network access control list (ACL) can manage (by allow or deny rules) inbound and outbound traffic at the subnet level. An ACL is stateless, which means that inbound and outbound rules must be specified separately and explicitly. It is used to specify the types of network traffic that are allowed in or out from the instances in a VPC network. Every Amazon VPC has a default ACL that allows all inbound and outbound traffic. You can edit the default ACL rules, or create a custom ACL and attach it to your subnets. A subnet can only have one ACL attached to it at any time, but one ACL can be attached to multiple subnets. (Recommended) Example ACLs The following example shows the inbound and outbound ACL rules that can be used for an Amazon VPC with public routing or private routing.
Rule number, Type, Protocol, Port range, Source, Allow/Deny
100, All IPv4 traffic, All, All, 0.0.0.0/0, Allow
*, All IPv4 traffic, All, All, 0.0.0.0/0, Deny
VPC security groups A VPC security group acts as a virtual firewall that controls the network traffic at the instance level. A security group is stateful, which means that when an inbound connection is permitted, it is allowed to reply.
It is used to specify the types of network traffic that are allowed in from the instances in a VPC network. Every Amazon VPC has a default security group. By default, it has no inbound rules. It has an outbound rule that allows all outbound traffic. You can edit the default security group rules, or create a custom security group and attach it to your Amazon VPC. On Amazon MWAA, you need to configure inbound and outbound rules to direct traffic on your NAT gateways. (Recommended) Example all access self-referencing security group The following example shows the inbound security group rules that allow all traffic for an Amazon
VPC with public routing or private routing. The security group in this example is a self-referencing rule to itself. Type Protocol Source Type Source All traffic All All sg-0909e8 e81919 / my-mwaa-v pc-security- group The following example shows the outbound security group rules. Type Protocol Source Type Source All traffic All All 0.0.0.0/0 (Optional) Example security group that restricts inbound access to port 5432 The following example shows the inbound security group rules that allow all HTTPS traffic on port 5432 for the Amazon Aurora PostgreSQL metadata database (owned by Amazon MWAA) for your environment. Note If you choose to restrict traffic using this rule, you'll need to add another rule to allow TCP traffic on port 443. Type Protocol Port range Source type Source Custom TCP TCP 5432 Custom sg-0909e8 e81919 / VPC security groups 219 Amazon Managed Workflows for Apache Airflow User Guide Type Protocol Port range Source type Source my-mwaa-v pc-security- group (Optional) Example security group that restricts inbound access to port 443 The following example shows the inbound security group rules that allow all TCP traffic on port 443 for the Apache Airflow Web server. Type Protocol Port range Source type Source HTTPS TCP 443 Custom sg-0909e8 e81919 / my-mwaa-v pc-security- group VPC endpoint policies (private routing only) A VPC endpoint (AWS PrivateLink) policy controls access to AWS services from your private subnet. A VPC endpoint policy is an IAM resource policy that you attach to your VPC gateway or interface endpoint. This section describes the permissions needed for the VPC endpoint policies for each VPC endpoint. We recommend using a VPC interface endpoint policy for each of the VPC endpoints you created that allows full access to all AWS services, and using your execution role exclusively for AWS permissions. (Recommended) Example VPC endpoint policy to allow all access The following example shows a VPC interface endpoint policy for an Amazon VPC with private routing. { "Statement": [ { VPC endpoint policies (private routing only) 220 Amazon Managed Workflows for Apache Airflow User Guide "Action": "*", "Effect": "Allow", "Resource": "*", "Principal": "*" } ] } (Recommended) Example Amazon S3 gateway endpoint policy to allow bucket access The following example shows a VPC gateway endpoint policy that provides access to the Amazon S3 buckets required for Amazon ECR operations for an Amazon VPC with private routing. This is required for your Amazon ECR image to be retrieved, in addition to the bucket where your DAGs and supporting files are stored. { "Statement": [ { "Sid": "Access-to-specific-bucket-only", "Principal": "*", "Action": [ "s3:GetObject" ], "Effect": "Allow", "Resource": ["arn:aws:s3:::prod-region-starport-layer-bucket/*"] } ] } Managing access to service-specific Amazon VPC endpoints on Amazon MWAA A VPC endpoint (AWS PrivateLink) enables you to privately connect your VPC to services hosted on AWS without requiring an Internet gateway, a NAT device, VPN, or firewall proxies. These endpoints are horizontally scalable and highly available virtual devices that allow communication between instances in your VPC and AWS services. This page describes the VPC endpoints created by Amazon MWAA, and how to access the VPC endpoint for your Apache Airflow Web server if you've chosen the Private network access mode on Amazon Managed Workflows for Apache Airflow. 
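If you prefer working from a terminal, you can list the endpoints that Amazon MWAA (or you) created in your environment's Amazon VPC at any time. The following AWS CLI call is a minimal sketch only; the VPC ID is a placeholder, not a value from your account.
# List every VPC endpoint in the environment's VPC, with its service name and state.
aws ec2 describe-vpc-endpoints \
    --filters Name=vpc-id,Values=vpc-0123456789abcdef0 \
    --query 'VpcEndpoints[].[VpcEndpointId,ServiceName,State]' \
    --output table
The endpoints reported for the Apache Airflow Web server and the Amazon Aurora PostgreSQL metadata database correspond to the Elastic Network Interfaces described in the overview below.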
Contents • Pricing • VPC endpoint overview • Public network access mode • Private network access mode • Permission to use other AWS services • Viewing VPC endpoints • Viewing VPC endpoints on the Amazon VPC console • Identifying the private IP addresses of your Apache Airflow Web server and its VPC endpoint • Accessing the VPC endpoint for your Apache Airflow Web server (private network access) • Using an AWS Client VPN • Using a Linux Bastion Host • Using a Load Balancer (advanced) Pricing • AWS PrivateLink Pricing VPC endpoint overview When you create an Amazon MWAA environment, Amazon MWAA creates one or two VPC endpoints for your environment. These endpoints appear as Elastic Network Interfaces (ENIs) with private IPs in your Amazon VPC. After these endpoints are created, any traffic destined to these IPs is privately or publicly routed to the corresponding AWS services used by your environment. Public network access mode
If you chose the Public network access mode for your Apache Airflow Web server, network traffic is publicly routed over the Internet. • Amazon MWAA creates a VPC interface endpoint for your Amazon Aurora PostgreSQL metadata database. The endpoint is created in the Availability Zones mapped to your private subnets and is independent from other AWS accounts. Pricing 222 Amazon Managed Workflows for Apache Airflow User Guide • Amazon MWAA then binds an IP address from your private subnets to the interface endpoints. This is designed to support the best practice of binding a single IP from each Availability Zone of the Amazon VPC. Private network access mode If you chose the Private network access mode for your Apache Airflow Web server, network traffic is privately routed within your Amazon VPC. • Amazon MWAA creates a VPC interface endpoint for your Apache Airflow Web server, and an interface endpoint for your Amazon Aurora PostgreSQL metadata database. The endpoints are created in the Availability Zones mapped to your private subnets and is independent from other AWS accounts. • Amazon MWAA then binds an IP address from your private subnets to the interface endpoints. This is designed to support the best practice of binding a single IP from each Availability Zone of the Amazon VPC. Permission to use other AWS services The interface endpoints use the execution role for your environment in AWS Identity and Access Management (IAM) to manage permission to AWS resources used by your environment. As more AWS services are enabled for an environment, each service will require you to configure permission using your environment's execution role. To add permissions, see Amazon MWAA execution role. If you've chosen the Private network access mode for your Apache Airflow Web server, you must also allow permission in the VPC endpoint policy for each endpoint. To learn more, see the section called “VPC endpoint policies (private routing only)”. Viewing VPC endpoints This section describes how to view the VPC endpoints created by Amazon MWAA, and how to identify the private IP addresses for your Apache Airflow VPC endpoint. Viewing VPC endpoints on the Amazon VPC console The following section shows the steps to view the VPC endpoint(s) created by Amazon MWAA, and any VPC endpoints you may have created if you're using private routing for your Amazon VPC. Permission to use other AWS services 223 Amazon Managed Workflows for Apache Airflow To view the VPC endpoint(s) User Guide 1. Open the Endpoints page on the Amazon VPC console. 2. Use the AWS Region selector to select your region. 3. You should see the VPC interface endpoint(s) created by Amazon MWAA, and any VPC endpoints you may have created if you're using private routing in your Amazon VPC. To learn more about the VPC service endpoints that are required for an Amazon VPC with private routing, see Creating the required VPC service endpoints in an Amazon VPC with private routing. Identifying the private IP addresses of your Apache Airflow Web server and its VPC endpoint The following steps describe how to retrieve the host name of your Apache Airflow Web server and its VPC interface endpoint, and their private IP addresses. 1. Use the following AWS Command Line Interface (AWS CLI) command to retrieve the host name for your Apache Airflow Web server. 
aws mwaa get-environment --name YOUR_ENVIRONMENT_NAME --query 'Environment.WebserverUrl' You should see something similar to the following response: "99aa99aa-55aa-44a1-a91f-f4552cf4e2f5-vpce.c10.us-west-2.airflow.amazonaws.com" 2. Run a dig command on the host name returned in the response of the previous command. For example: dig CNAME +short 99aa99aa-55aa-44a1-a91f-f4552cf4e2f5-vpce.c10.us-west-2.airflow.amazonaws.com You should see something similar to the following response: vpce-0699aa333a0a0a0-bf90xjtr.vpce-svc-00bb7c2ca2213bc37.us-west-2.vpce.amazonaws.com. 3. Use the following AWS Command Line Interface (AWS CLI) command to retrieve the VPC endpoint DNS name returned in the response of the previous command. For example: aws ec2 describe-vpc-endpoints | grep vpce-0699aa333a0a0a0-bf90xjtr.vpce-svc-00bb7c2ca2213bc37.us-west-2.vpce.amazonaws.com. You should see something similar to the following response: "DnsName": "vpce-066777a0a0a0-bf90xjtr.vpce-svc-00bb7c2ca2213bc37.us-west-2.vpce.amazonaws.com", 4. Run either an nslookup or dig command on your Apache Airflow host name and its VPC endpoint DNS name to retrieve the IP addresses. For example: dig +short YOUR_AIRFLOW_HOST_NAME YOUR_AIRFLOW_VPC_ENDPOINT_DNS You should see something similar to the following
response: 192.0.5.1 192.0.6.1 Accessing the VPC endpoint for your Apache Airflow Web server (private network access) If you've chosen the Private network access mode for your Apache Airflow Web server, you'll need to create a mechanism to access the VPC interface endpoint for your Apache Airflow Web server. You must use the same Amazon VPC, VPC security group, and private subnets as your Amazon MWAA environment for these resources. Using an AWS Client VPN AWS Client VPN is a managed client-based VPN service that enables you to securely access your AWS resources and resources in your on-premises network. It provides a secure TLS connection from any location using the OpenVPN client. We recommend following the Amazon MWAA tutorial to configure a Client VPN: Tutorial: Configuring private network access using an AWS Client VPN. Accessing the VPC endpoint for your Apache Airflow Web server (private network access) 225 Amazon Managed Workflows for Apache Airflow Using a Linux Bastion Host User Guide A bastion host is a server whose purpose is to provide access to a private network from an external network, such as over the Internet from your computer. Linux instances are in a public subnet, and they are set up with a security group that allows SSH access from the security group attached to the underlying Amazon EC2 instance running the bastion host. We recommend following the Amazon MWAA tutorial to configure a Linux Bastion Host: Tutorial: Configuring private network access using a Linux Bastion Host. Using a Load Balancer (advanced) The following section shows the configurations you'll need to apply to an Application Load Balancer. 1. Target groups. You'll need to use target groups that point to the private IP addresses for your Apache Airflow Web server, and its VPC interface endpoint. We recommend specifying both private IP addresses as your registered targets, as using only one can reduce availability. For more information on how to identify the private IP addresses, see the section called “Identifying the private IP addresses of your Apache Airflow Web server and its VPC endpoint”. 2. Status codes. We recommend using 200 and 302 status codes in your target group settings. Otherwise, the targets may be flagged as unhealthy if the VPC endpoint for the Apache Airflow Web server responds with a 302 Redirect error. 3. HTTPS Listener. You'll need to specify the target port for the Apache Airflow Web server. For example: Protocol HTTPS Port 443 4. ACM new domain. If you want to associate an SSL/TLS certificate in AWS Certificate Manager, you'll need to create a new domain for the HTTPS listener for your load balancer. 5. ACM certificate region. If you want to associate an SSL/TLS certificate in AWS Certificate Manager, you'll need to upload to the same AWS Region as your environment. For example: Accessing the VPC endpoint for your Apache Airflow Web server (private network access) 226 Amazon Managed Workflows for Apache Airflow User Guide • Example region to upload certificate aws acm import-certificate --certificate fileb://Certificate.pem --certificate- chain fileb://CertificateChain.pem --private-key fileb://PrivateKey.pem -- region us-west-2 Creating the required VPC service endpoints in an Amazon VPC with private routing An existing Amazon VPC network without Internet access needs additional VPC service endpoints (AWS PrivateLink) to use Apache Airflow on Amazon Managed Workflows for Apache Airflow. 
This page describes the VPC endpoints required for the AWS services used by Amazon MWAA, the VPC endpoints required for Apache Airflow, and how to create and attach the VPC endpoints to an existing Amazon VPC with private routing. Contents • Pricing • Private network and private routing • (Required) VPC endpoints • Attaching the required VPC endpoints • VPC endpoints required for AWS services • VPC endpoints required for Apache Airflow • (Optional) Enable private IP addresses for your Amazon S3 VPC interface endpoint • Using Route 53 • VPCs with custom DNS Pricing • AWS PrivateLink Pricing Private network and private routing The private network access mode limits access to the Apache Airflow UI to users within your Amazon VPC that have been granted access to the IAM policy for your environment. When you create an environment with
private web server access, you must package all of your dependencies in a Python wheel archive (.whl), then reference the .whl in your requirements.txt. For instructions on packaging and installing your dependencies using wheel, see Managing dependencies using Python wheel. The following image shows where to find the Private network option on the Amazon MWAA console. • Private routing. An Amazon VPC without Internet access limits network traffic within the VPC. This page assumes your Amazon VPC does not have Internet access and requires VPC endpoints Private network and private routing 228 Amazon Managed Workflows for Apache Airflow User Guide for each AWS service used by your environment, and VPC endpoints for Apache Airflow in the same AWS Region and Amazon VPC as your Amazon MWAA environment. (Required) VPC endpoints The following section shows the required VPC endpoints needed for an Amazon VPC without Internet access. It lists the VPC endpoints for each AWS service used by Amazon MWAA, including the VPC endpoints needed for Apache Airflow. com.amazonaws.YOUR_REGION.s3 com.amazonaws.YOUR_REGION.monitoring com.amazonaws.YOUR_REGION.logs com.amazonaws.YOUR_REGION.sqs com.amazonaws.YOUR_REGION.kms Note When using Transit Gateway or any other routing that does not go directly to the AWS API endpoints, we recommend you to add AWS PrivateLink to your Amazon MWAA private subnets for the following services: • Amazon S3 • Amazon SQS • CloudWatch Logs • CloudWatch metrics • AWS KMS (if applicable) This ensures that your Amazon MWAA environment can securely and efficiently communicate with these services without routing traffic through the public internet, thereby improving security and performance. Attaching the required VPC endpoints This section describes the steps to attach the required VPC endpoints for an Amazon VPC with private routing. (Required) VPC endpoints 229 Amazon Managed Workflows for Apache Airflow User Guide VPC endpoints required for AWS services The following section shows the steps to attach the VPC endpoints for the AWS services used by an environment to an existing Amazon VPC. To attach VPC endpoints to your private subnets 1. Open the Endpoints page on the Amazon VPC console. 2. Use the AWS Region selector to select your region. 3. Create the endpoint for Amazon S3: a. b. Choose Create Endpoint. In the Filter by attributes or search by keyword text field, type: .s3, then press Enter on your keyboard. c. We recommend choosing the service endpoint listed for the Gateway type. For example, com.amazonaws.us-west-2.s3 amazon Gateway d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that that private DNS is enabled by selecting Enable DNS name. f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint. 4. Create the endpoint for CloudWatch Logs: a. b. Choose Create Endpoint. In the Filter by attributes or search by keyword text field, type: .logs, then press Enter on your keyboard. c. Select the service endpoint. d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that Enable DNS name is enabled. f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint. Attaching the required VPC endpoints 230 Amazon Managed Workflows for Apache Airflow User Guide 5. 
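If you script your infrastructure instead of using the console, the same endpoints can be created with the AWS CLI. The loop below is a hedged sketch, not the documented procedure: the Region, VPC ID, subnet IDs, and security group ID are placeholders, it covers only the interface endpoints from the list above, and the Amazon S3 gateway endpoint is created separately against your route table. The console steps in the next two sections accomplish the same result.
# Create the interface endpoints for CloudWatch Logs, CloudWatch metrics, Amazon SQS, and AWS KMS.
for service in logs monitoring sqs kms; do
  aws ec2 create-vpc-endpoint \
      --vpc-endpoint-type Interface \
      --vpc-id vpc-0123456789abcdef0 \
      --service-name com.amazonaws.us-west-2.${service} \
      --subnet-ids subnet-aaaa1111 subnet-bbbb2222 \
      --security-group-ids sg-0909e8e81919 \
      --private-dns-enabled
done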
Create the endpoint for CloudWatch Monitoring: a. b. Choose Create Endpoint. In the Filter by attributes or search by keyword text field, type: .monitoring, then press Enter on your keyboard. c. Select the service endpoint. d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that Enable DNS name is enabled. f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint. 6. Create the endpoint for Amazon SQS: a. b. Choose Create Endpoint. In the Filter by attributes or search by keyword text field, type: .sqs, then press Enter on your keyboard. c. Select the service endpoint. d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that Enable DNS name is enabled. f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint. 7. Create the endpoint for AWS KMS: a. b. Choose Create Endpoint. In the Filter by attributes or search by keyword text field, type: .kms, then press Enter on your keyboard. c. Select the service endpoint. d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that Enable DNS name is enabled. 231 Attaching the required VPC endpoints Amazon Managed Workflows for Apache Airflow User Guide f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint.
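Before you create the Apache Airflow endpoints in the next section, you can confirm that the endpoints you just created are available and have private DNS enabled. A minimal AWS CLI sketch; the VPC ID is a placeholder.
# Report the state and private DNS setting of every endpoint in the VPC.
aws ec2 describe-vpc-endpoints \
    --filters Name=vpc-id,Values=vpc-0123456789abcdef0 \
    --query 'VpcEndpoints[].[ServiceName,State,PrivateDnsEnabled]' \
    --output table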
VPC endpoints required for Apache Airflow The following section shows the steps to attach the VPC endpoints for Apache Airflow to an existing Amazon VPC. To attach VPC endpoints to your private subnets 1. Open the Endpoints page on the Amazon VPC console. 2. Use the AWS Region selector to select your region. 3. Create the endpoint for the Apache Airflow API: a. Choose Create Endpoint. b. In the Filter by attributes or search by keyword text field, type: .airflow.api, then press Enter on your keyboard. c. Select the service endpoint. d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that Enable DNS name is enabled. f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint. 4. Create the first endpoint for the Apache Airflow environment: a. Choose Create Endpoint. b. In the Filter by attributes or search by keyword text field, type: .airflow.env, then press Enter on your keyboard. c. Select the service endpoint. d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that Enable DNS name is enabled. f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint. 5. Create the second endpoint for Apache Airflow operations: a. Choose Create Endpoint. b. In the Filter by attributes or search by keyword text field, type: .airflow.ops, then press Enter on your keyboard. c. Select the service endpoint. d. Choose your environment's Amazon VPC in VPC. e. Ensure that your two private subnets in different Availability Zones are selected, and that Enable DNS name is enabled. f. Choose your environment's Amazon VPC security group(s). g. Choose Full Access in Policy. h. Choose Create endpoint. (Optional) Enable private IP addresses for your Amazon S3 VPC interface endpoint Amazon S3 Interface endpoints don't support private DNS. The S3 endpoint requests still resolve to a public IP address. To resolve the S3 address to a private IP address, you need to add a private hosted zone in Route 53 for the S3 regional endpoint. Using Route 53 This section describes the steps to enable private IP addresses for an S3 Interface endpoint using Route 53. 1. Create a Private Hosted Zone for your Amazon S3 VPC interface endpoint (such as s3.eu-west-1.amazonaws.com) and associate it with your Amazon VPC. 2. Create an ALIAS A record for your Amazon S3 VPC interface endpoint (such as s3.eu-west-1.amazonaws.com) that resolves to your VPC Interface Endpoint DNS name. 3. Create an ALIAS A wildcard record for your Amazon S3 interface endpoint (such as *.s3.eu-west-1.amazonaws.com) that resolves to the VPC Interface Endpoint DNS name.
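The same private hosted zone and alias records can be created with the AWS CLI. This is a hedged sketch that reuses the eu-west-1 names from the steps above; the VPC ID, hosted zone ID, caller reference, and alias.json file name are placeholders, and the alias target values (DNS name and hosted zone ID) come from the DnsEntries of your S3 interface endpoint, for example in the output of aws ec2 describe-vpc-endpoints.
# Create the private hosted zone and associate it with your Amazon VPC.
aws route53 create-hosted-zone \
    --name s3.eu-west-1.amazonaws.com \
    --vpc VPCRegion=eu-west-1,VPCId=vpc-0123456789abcdef0 \
    --caller-reference mwaa-s3-private-dns-1

# alias.json holds a change batch with an ALIAS A record for s3.eu-west-1.amazonaws.com
# whose AliasTarget is the endpoint's DNS name and hosted zone ID; repeat the call with
# *.s3.eu-west-1.amazonaws.com for the wildcard record.
aws route53 change-resource-record-sets \
    --hosted-zone-id Z0123456789EXAMPLE \
    --change-batch file://alias.json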
VPCs with custom DNS If your Amazon VPC uses custom DNS routing, you need to make the changes in your DNS resolver (not Route 53, typically an EC2 instance running a DNS server) by creating a CNAME record. For example: Name: s3.us-west-2.amazonaws.com Type: CNAME Value: *.vpce-0f67d23e37648915c-e2q2e2j3.s3.us-west-2.vpce.amazonaws.com Managing your own Amazon VPC endpoints on Amazon MWAA Amazon MWAA uses Amazon VPC endpoints to integrate with various AWS services necessary to set up an Apache Airflow environment. Managing your own endpoints has two primary use cases: 1. It means you can create Apache Airflow environments in a shared Amazon VPC when you use AWS Organizations to manage
multiple AWS accounts and share resources. 2. It let's you use more restrictive access policies by narrowing down your permissions to the specific resources that use your endpoints. If you choose to manage your own VPC endpoints, you are responsible for creating your own endpoints for the environment RDS for PostgreSQL database, and for the environment web server. For more information about how Amazon MWAA deploys Apache Airflow in the cloud, see the Amazon MWAA architecture diagram. Creating an environment in a shared Amazon VPC If you use AWS Organizations to manage multiple AWS accounts that share resources, you can use customer managed VPC endpoints with Amazon MWAA to share environment resources with another account in your organization. When you configure shared VPC access, the account that owns the main Amazon VPC (owner) shares the two private subnets required by Amazon MWAA with other accounts (participants) that belong to the same organization. Participant accounts that share those subnets can view, create, modify, and delete environments in the shared Amazon VPC. Managing your own Amazon VPC endpoints 234 Amazon Managed Workflows for Apache Airflow User Guide Assume you have an account, Owner, which acts as the Root account in the organization and owns the Amazon VPC resources, and a participant account, Participant, a member of the same organization. When Participant creates a new Amazon MWAA in Amazon VPC it shares with Owner, Amazon MWAA will first create the service VPC resources, then enter a PENDING state for up to 72 hours. After the environment status changes from CREATING to PENDING, a principal acting on behalf of Owner creates the required endpoints. To do this, Amazon MWAA lists the database and web server endpoint in the Amazon MWAA console. You can also call the GetEnvironment API action to get the service endpoints. Note If the Amazon VPC you use to share resources is a private Amazon VPC, you must still complete the steps described in the section called “Managing access to VPC endpoints”. The topic covers setting up a different set of Amazon VPC endpoints related to other AWS services that AWS integrates with, such as Amazon ECR, Amazon ECS, and Amazon SQS. These services are essential in operating, and managing, your Apache Airflow environment in the cloud. Prerequisites Before you create an Amazon MWAA environment in a shared VPC, you need the following resources: • An AWS account, Owner to be used as the account that owns the Amazon VPC. • An AWS Organizations organization unit, MyOrganization created as a root. • A second AWS account, Participant, under MyOrganization to serve the participant account that creates the new environment. In addition, we recommend that you familiarize yourself with the responsibilities and permissions for owners and participants when sharing resources in Amazon VPC. Create the Amazon VPC First, create a new Amazon VPC that the owner and participant accounts will share: Creating an environment in a shared Amazon VPC 235 Amazon Managed Workflows for Apache Airflow User Guide 1. Sign in to the console using Owner, then, open the AWS CloudFormation console. Use the following template to create a stack. This stack provisions a number of networking resources including a Amazon VPC, and the subnets that the two accounts will share in this scenario. AWSTemplateFormatVersion: "2010-09-09" Description: >- This template deploys a VPC, with a pair of public and private subnets spread across two Availability Zones. 
It deploys an internet gateway, with a default route on the public subnets. It deploys a pair of NAT gateways (one in each AZ), and default routes for them in the private subnets. Parameters: EnvironmentName: Description: An environment name that is prefixed to resource names Type: String Default: mwaa- VpcCIDR: Description: Please enter the IP range (CIDR notation) for this VPC Type: String Default: 10.192.0.0/16 PublicSubnet1CIDR: Description: >- Please enter the IP range (CIDR notation) for the public subnet in the first Availability Zone Type: String Default: 10.192.10.0/24 PublicSubnet2CIDR: Description: >- Please enter the IP range (CIDR notation) for the public subnet in the second Availability Zone Type: String Default: 10.192.11.0/24 PrivateSubnet1CIDR: Description: >- Please enter the IP range (CIDR notation) for the private subnet in the first Availability Zone Type: String Default: 10.192.20.0/24 PrivateSubnet2CIDR: Description:
>- Please enter the IP range (CIDR notation) for the private subnet in the second Availability Zone Type: String Default: 10.192.21.0/24 Creating an environment in a shared Amazon VPC 236 Amazon Managed Workflows for Apache Airflow User Guide Resources: VPC: Type: 'AWS::EC2::VPC' Properties: CidrBlock: !Ref VpcCIDR EnableDnsSupport: true EnableDnsHostnames: true Tags: - Key: Name Value: !Ref EnvironmentName InternetGateway: Type: 'AWS::EC2::InternetGateway' Properties: Tags: - Key: Name Value: !Ref EnvironmentName InternetGatewayAttachment: Type: 'AWS::EC2::VPCGatewayAttachment' Properties: InternetGatewayId: !Ref InternetGateway VpcId: !Ref VPC PublicSubnet1: Type: 'AWS::EC2::Subnet' Properties: VpcId: !Ref VPC AvailabilityZone: !Select - 0 - !GetAZs '' CidrBlock: !Ref PublicSubnet1CIDR MapPublicIpOnLaunch: true Tags: - Key: Name Value: !Sub '${EnvironmentName} Public Subnet (AZ1)' PublicSubnet2: Type: 'AWS::EC2::Subnet' Properties: VpcId: !Ref VPC AvailabilityZone: !Select - 1 - !GetAZs '' CidrBlock: !Ref PublicSubnet2CIDR MapPublicIpOnLaunch: true Tags: - Key: Name Creating an environment in a shared Amazon VPC 237 Amazon Managed Workflows for Apache Airflow User Guide Value: !Sub '${EnvironmentName} Public Subnet (AZ2)' PrivateSubnet1: Type: 'AWS::EC2::Subnet' Properties: VpcId: !Ref VPC AvailabilityZone: !Select - 0 - !GetAZs '' CidrBlock: !Ref PrivateSubnet1CIDR MapPublicIpOnLaunch: false Tags: - Key: Name Value: !Sub '${EnvironmentName} Private Subnet (AZ1)' PrivateSubnet2: Type: 'AWS::EC2::Subnet' Properties: VpcId: !Ref VPC AvailabilityZone: !Select - 1 - !GetAZs '' CidrBlock: !Ref PrivateSubnet2CIDR MapPublicIpOnLaunch: false Tags: - Key: Name Value: !Sub '${EnvironmentName} Private Subnet (AZ2)' NatGateway1EIP: Type: 'AWS::EC2::EIP' DependsOn: InternetGatewayAttachment Properties: Domain: vpc NatGateway2EIP: Type: 'AWS::EC2::EIP' DependsOn: InternetGatewayAttachment Properties: Domain: vpc NatGateway1: Type: 'AWS::EC2::NatGateway' Properties: AllocationId: !GetAtt NatGateway1EIP.AllocationId SubnetId: !Ref PublicSubnet1 NatGateway2: Type: 'AWS::EC2::NatGateway' Properties: AllocationId: !GetAtt NatGateway2EIP.AllocationId Creating an environment in a shared Amazon VPC 238 Amazon Managed Workflows for Apache Airflow User Guide SubnetId: !Ref PublicSubnet2 PublicRouteTable: Type: 'AWS::EC2::RouteTable' Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub '${EnvironmentName} Public Routes' DefaultPublicRoute: Type: 'AWS::EC2::Route' DependsOn: InternetGatewayAttachment Properties: RouteTableId: !Ref PublicRouteTable DestinationCidrBlock: 0.0.0.0/0 GatewayId: !Ref InternetGateway PublicSubnet1RouteTableAssociation: Type: 'AWS::EC2::SubnetRouteTableAssociation' Properties: RouteTableId: !Ref PublicRouteTable SubnetId: !Ref PublicSubnet1 PublicSubnet2RouteTableAssociation: Type: 'AWS::EC2::SubnetRouteTableAssociation' Properties: RouteTableId: !Ref PublicRouteTable SubnetId: !Ref PublicSubnet2 PrivateRouteTable1: Type: 'AWS::EC2::RouteTable' Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub '${EnvironmentName} Private Routes (AZ1)' DefaultPrivateRoute1: Type: 'AWS::EC2::Route' Properties: RouteTableId: !Ref PrivateRouteTable1 DestinationCidrBlock: 0.0.0.0/0 NatGatewayId: !Ref NatGateway1 PrivateSubnet1RouteTableAssociation: Type: 'AWS::EC2::SubnetRouteTableAssociation' Properties: RouteTableId: !Ref PrivateRouteTable1 SubnetId: !Ref PrivateSubnet1 PrivateRouteTable2: Creating an environment in a shared Amazon VPC 239 Amazon Managed Workflows for Apache Airflow User Guide 
Type: 'AWS::EC2::RouteTable' Properties: VpcId: !Ref VPC Tags: - Key: Name Value: !Sub '${EnvironmentName} Private Routes (AZ2)' DefaultPrivateRoute2: Type: 'AWS::EC2::Route' Properties: RouteTableId: !Ref PrivateRouteTable2 DestinationCidrBlock: 0.0.0.0/0 NatGatewayId: !Ref NatGateway2 PrivateSubnet2RouteTableAssociation: Type: 'AWS::EC2::SubnetRouteTableAssociation' Properties: RouteTableId: !Ref PrivateRouteTable2 SubnetId: !Ref PrivateSubnet2 SecurityGroup: Type: 'AWS::EC2::SecurityGroup' Properties: GroupName: mwaa-security-group GroupDescription: Security group with a self-referencing inbound rule. VpcId: !Ref VPC SecurityGroupIngress: Type: 'AWS::EC2::SecurityGroupIngress' Properties: GroupId: !Ref SecurityGroup IpProtocol: '-1' SourceSecurityGroupId: !Ref SecurityGroup Outputs: VPC: Description: A reference to the created VPC Value: !Ref VPC PublicSubnets: Description: A list of the public subnets Value: !Join - ',' - - !Ref PublicSubnet1 - !Ref PublicSubnet2 PrivateSubnets: Description: A list of the private subnets Value: !Join - ',' - - !Ref PrivateSubnet1 Creating an environment in a shared Amazon VPC 240 Amazon Managed Workflows for Apache Airflow User Guide - !Ref PrivateSubnet2 PublicSubnet1: Description: A reference to the public subnet in the 1st Availability Zone Value: !Ref PublicSubnet1 PublicSubnet2: Description: A reference to the public subnet in the 2nd Availability Zone Value: !Ref PublicSubnet2 PrivateSubnet1: Description: A reference to the private subnet in the 1st Availability Zone Value: !Ref PrivateSubnet1 PrivateSubnet2: Description: A reference to the private subnet in the 2nd Availability Zone Value: !Ref PrivateSubnet2 SecurityGroupIngress: Description: Security group with self-referencing inbound rule Value: !Ref SecurityGroupIngress 2. After the new Amazon VPC resources have been provisioned, navigate to the AWS Resource Access Manager console, then choose Create resource share. 3. Choose the subnets you created in the first step from the list of available subnets you can share with Participant. Create the environment Complete the following steps to create an Amazon MWAA environment with customer-managed Amazon VPC endpoints. 1. Sign in using Participant, and open the Amazon MWAA console. Complete Step one: Specify details to specify an Amazon S3 bucket, a DAG folder, and dependencies for your new environment. For more information, see getting started. 2. On the Configure advanced settings page, under Networking, choose the subnets from the shared Amazon VPC. 3. Under Endpoint management choose CUSTOMER from the dropdown list. 4. Keep the default for the remaining options on the page, then, choose Create environment on the Review and create page. The environment begins in a CREATING state, then changes to PENDING. When the environment is PENDING, write down the Database endpoint service name and Web server endpoint service name (if you set up a private web server) using the console. Creating an environment in a shared Amazon VPC 241 Amazon Managed Workflows for Apache Airflow User Guide When you create a new environment using the Amazon MWAA console. Amazon MWAA creates a new
security group with the required inbound and outbound rules. Write down the security group ID. In the next section, Owner will use the service endpoints and the security group ID to create new Amazon VPC endpoints in the shared Amazon VPC. Create the Amazon VPC endpoints Complete the following steps to create the required Amazon VPC endpoints for your environment. 1. Sign in to the AWS Management Console using Owner, then open https://console.aws.amazon.com/vpc/. 2. Choose Security groups from the left navigation panel, then create a new security group in the shared Amazon VPC using the following inbound and outbound rules:
Type, Protocol, Source type, Source
Inbound: All traffic, All, All, Your environment security group
Outbound: All traffic, All, All, 0.0.0.0/0
Warning The Owner account must set up a security group in the Owner account to allow traffic from the new environment to the shared Amazon VPC. You can do this by creating a new security group in Owner, or editing an existing one. 3. Choose Endpoints, then create new endpoints for the environment database and the web server (if in private mode) using the endpoint service names from the previous steps. Choose the shared Amazon VPC, the subnets you used for the environment, and the environment's security group. If successful, the environment will change from PENDING back to CREATING, then finally to AVAILABLE. When it is AVAILABLE, you can sign in to the Apache Airflow console. Shared Amazon VPC Troubleshooting Use the following reference to resolve issues you encounter when creating environments in a shared Amazon VPC. Environment in CREATE_FAILED after PENDING status • Verify that Owner is sharing the subnets with Participant using AWS Resource Access Manager. • Verify that the Amazon VPC endpoints for the database and web server are created in the same subnets associated with the environment. • Verify that the security group used with your endpoints allows traffic from the security groups used for the environment. The Owner account creates rules that reference the security group in Participant as account-number/security-group-id. For example:
Type, Protocol, Source type, Source
All traffic, All, All, 123456789012/sg-0909e8e81919
For more information, see Responsibilities and permissions for owners and participants. Environment stuck in PENDING status Verify each VPC endpoint status to ensure it is Available. If you configure an environment with a private web server, you must also create an endpoint for the web server. If the environment is stuck in PENDING, this might indicate that the private web server endpoint is missing.
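One way to perform that check from the Owner account is the AWS CLI. A minimal sketch; the service name below is a placeholder for the Database or Web server endpoint service name you noted from the Amazon MWAA console.
# Show the endpoints created for a given endpoint service and their state.
aws ec2 describe-vpc-endpoints \
    --filters Name=service-name,Values=com.amazonaws.vpce.us-west-2.vpce-svc-0123456789example \
    --query 'VpcEndpoints[].[VpcEndpointId,ServiceName,State]'
Each endpoint should report a state of available; a missing web server endpoint is a common reason an environment stays in PENDING.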
Received The Vpc Endpoint Service 'vpce-service-name' does not exist error If you see the following error, verify that the account creating the endpoints in the Owner account that owns the shared VPC: ClientError: An error occurred (InvalidServiceName) when calling the CreateVpcEndpoint operation: The Vpc Endpoint Service 'vpce-service-name' does not exist Creating an environment in a shared Amazon VPC 243 Amazon Managed Workflows for Apache Airflow User Guide Tutorials for Amazon Managed Workflows for Apache Airflow This guide includes step-by-step tutorials to using and configuring an Amazon Managed Workflows for Apache Airflow environment. Topics • Tutorial: Configuring private network access using an AWS Client VPN • Tutorial: Configuring private network access using a Linux Bastion Host • Tutorial: Restricting an Amazon MWAA user's access to a subset of DAGs • Tutorial: Automate managing your own environment endpoints on Amazon MWAA Tutorial: Configuring private network access using an AWS Client VPN This tutorial walks you through the steps to create a VPN tunnel from your computer to the Apache Airflow Web server for your Amazon Managed Workflows for Apache Airflow environment. To connect to the Internet through a VPN tunnel, you'll first need to create a AWS Client VPN endpoint. Once set up, a Client VPN endpoint acts as a VPN server allowing a secure connection from your computer to the resources in your VPC. You'll then connect to the Client VPN from your computer using the AWS Client VPN for Desktop. Sections • Private network • Use cases • Before you begin • Objectives • (Optional) Step one: Identify your VPC, CIDR rules, and VPC security(s) • Step two: Create
server for your Amazon Managed Workflows for Apache Airflow environment. To connect to the Internet through a VPN tunnel, you'll first need to create a AWS Client VPN endpoint. Once set up, a Client VPN endpoint acts as a VPN server allowing a secure connection from your computer to the resources in your VPC. You'll then connect to the Client VPN from your computer using the AWS Client VPN for Desktop. Sections • Private network • Use cases • Before you begin • Objectives • (Optional) Step one: Identify your VPC, CIDR rules, and VPC security(s) • Step two: Create the server and client certificates • Step three: Save the AWS CloudFormation template locally • Step four: Create the Client VPN AWS CloudFormation stack • Step five: Associate subnets to your Client VPN Tutorial: AWS Client VPN 244 Amazon Managed Workflows for Apache Airflow User Guide • Step six: Add an authorization ingress rule to your Client VPN • Step seven: Download the Client VPN endpoint configuration file • Step eight: Connect to the AWS Client VPN • What's next? Private network This tutorial assumes you've chosen the Private network access mode for your Apache Airflow Web server. The private network access mode limits access to the Apache Airflow UI to users within your Amazon VPC that have been granted access to the IAM policy for your environment. When you create an environment with private web server access, you must package all of your dependencies in a Python wheel archive (.whl), then reference the .whl in your requirements.txt. For instructions on packaging and installing your dependencies using wheel, see Managing dependencies using Python wheel. The following image shows where to find the Private network option on the Amazon MWAA console. Private network 245 Amazon Managed Workflows for Apache Airflow User Guide Use cases You can use this tutorial before or after you've created an Amazon MWAA environment. You must use the same Amazon VPC, VPC security group(s), and private subnets as your environment. If you use this tutorial after you've created an Amazon MWAA environment, once you've completed the steps, you can return to the Amazon MWAA console and change your Apache Airflow Web server access mode to Private network. Before you begin 1. Check for user permissions. Be sure that your account in AWS Identity and Access Management (IAM) has sufficient permissions to create and manage VPC resources. 2. Use your Amazon MWAA VPC. This tutorial assumes that you are associating the Client VPN to an existing VPC. The Amazon VPC must be in the same AWS Region as an Amazon MWAA environment and have two private subnets. If you haven't created an Amazon VPC, use the AWS CloudFormation template in Option three: Creating an Amazon VPC network without Internet access. Objectives In this tutorial, you'll do the following: 1. Create a AWS Client VPN endpoint using a AWS CloudFormation template for an existing Amazon VPC. 2. Generate server and client certificates and keys, and then upload the server certificate and key to AWS Certificate Manager in the same AWS Region as an Amazon MWAA environment. 3. Download and modify a Client VPN endpoint configuration file for your Client VPN, and use the file to create a VPN profile to connect using the Client VPN for Desktop. 
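If you would rather gather the values that Step one asks you to identify programmatically, the following optional boto3 sketch lists the subnet CIDRs and security groups for a VPC. This is an illustration only; the tutorial itself uses the Amazon VPC console, and the VPC ID and Region shown are placeholders.

import boto3

# Optional sketch: list the subnet CIDRs and security groups for your Amazon MWAA VPC.
# Replace the placeholder VPC ID and Region with your own values.
VPC_ID = "vpc-010101010101"
ec2 = boto3.client("ec2", region_name="us-west-2")

subnets = ec2.describe_subnets(Filters=[{"Name": "vpc-id", "Values": [VPC_ID]}])
for subnet in subnets["Subnets"]:
    print("Subnet:", subnet["SubnetId"], subnet["CidrBlock"])

groups = ec2.describe_security_groups(Filters=[{"Name": "vpc-id", "Values": [VPC_ID]}])
for group in groups["SecurityGroups"]:
    print("Security group:", group["GroupId"], group["GroupName"])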
Use cases 246 Amazon Managed Workflows for Apache Airflow User Guide (Optional) Step one: Identify your VPC, CIDR rules, and VPC security(s) The following section describes how to find IDs for your Amazon VPC, VPC security group, and a way to identify the CIDR rules you'll need to create your Client VPN in subsequent steps. Identify your CIDR rules The following section shows how to identify the CIDR rules, which you'll need to create your Client VPN. To identify the CIDR for your Client VPN 1. Open the Your Amazon VPCs page on the Amazon VPC console. 2. Use the region selector in the navigation bar to choose the same AWS Region as an Amazon MWAA environment. 3. Choose your Amazon VPC. 4. Assuming the CIDRs for your private subnets are: • Private Subnet 1: 10.192.10.0/24 • Private Subnet 2: 10.192.11.0/24 If the CIDR for your Amazon VPC is 10.192.0.0/16, then the Client IPv4 CIDR you'd specify for your Client VPN would be 10.192.0.0/22. 5. Save this CIDR value, and the value of your VPC ID for subsequent steps. Identify your VPC and security group(s) The following section shows how to find the ID of your Amazon VPC and security group(s), which you'll need to create your Client VPN. Note You may be using more than one security group. You'll need to specify all of your VPC's security groups in subsequent steps. (Optional) Step one: Identify your VPC, CIDR rules, and VPC security(s) 247 Amazon Managed Workflows for Apache Airflow To identify
Client IPv4 CIDR you'd specify for your Client VPN would be 10.192.0.0/22. 5. Save this CIDR value, and the value of your VPC ID for subsequent steps. Identify your VPC and security group(s) The following section shows how to find the ID of your Amazon VPC and security group(s), which you'll need to create your Client VPN. Note You may be using more than one security group. You'll need to specify all of your VPC's security groups in subsequent steps. (Optional) Step one: Identify your VPC, CIDR rules, and VPC security(s) 247 Amazon Managed Workflows for Apache Airflow To identify the security group(s) User Guide 1. Open the Security Groups page on the Amazon VPC console. 2. Use the region selector in the navigation bar to choose the AWS Region. 3. 4. Look for the Amazon VPC in VPC ID, and identify the security groups associated with the VPC. Save the ID of your security group(s) and VPC for subsequent steps. Step two: Create the server and client certificates A Client VPN endpoint supports 1024-bit and 2048-bit RSA key sizes only. The following section shows how to use OpenVPN easy-rsa to generate the server and client certificates and keys, and then upload the certificates to ACM using the AWS Command Line Interface (AWS CLI). To create the client certificates 1. Follow these quick steps to create and upload the certificates to ACM via the AWS CLI in Client authentication and authorization: Mutual authentication. 2. In these steps, you must specify the same AWS Region as an Amazon MWAA environment in the AWS CLI command when uploading your server and client certificates. Here's some examples of how to specify the region in these commands: a. Example region for server certificate aws acm import-certificate --certificate fileb://server.crt --private-key fileb://server.key --certificate-chain fileb://ca.crt --region us-west-2 b. Example region for client certificate aws acm import-certificate --certificate fileb://client1.domain.tld.crt --private-key fileb://client1.domain.tld.key --certificate-chain fileb:// ca.crt --region us-west-2 c. After these steps, save the value returned in the AWS CLI response for the server certificate and client certificate ARNs. You'll be specifying these ARNs in your AWS CloudFormation template to create the Client VPN. 3. In these steps, a client certificate and a private key are saved to your computer. Here's an example of where to find these credentials: Step two: Create the server and client certificates 248 Amazon Managed Workflows for Apache Airflow User Guide a. Example on macOS On macOS the contents are saved at /Users/youruser/custom_folder. If you list all (ls -a) contents of this directory, you should see something similar to the following: . .. ca.crt client1.domain.tld.crt client1.domain.tld.key server.crt server.key b. After these steps, save the contents or note the location of the client certificate in client1.domain.tld.crt, and the private key in client1.domain.tld.key. You'll be adding these values to the configuration file for your Client VPN. Step three: Save the AWS CloudFormation template locally The following section contains the AWS CloudFormation template to create the Client VPN. You must specify the same Amazon VPC, VPC security group(s), and private subnets as your Amazon MWAA environment. • Copy the contents of the following template and save locally as mwaa_vpn_client.yaml. You can also download the template. 
Substitute the following values: • YOUR_CLIENT_ROOT_CERTIFICATE_ARN – The ARN for your client1.domain.tld certificate in ClientRootCertificateChainArn. • YOUR_SERVER_CERTIFICATE_ARN – The ARN for your server certificate in ServerCertificateArn. • The Client IPv4 CIDR rule in ClientCidrBlock. A CIDR rule of 10.192.0.0/22 is provided. • Your Amazon VPC ID in VpcId. A VPC of vpc-010101010101 is provided. • Your VPC security group ID(s) in SecurityGroupIds. A security group of sg-0101010101 is provided. Step three: Save the AWS CloudFormation template locally 249 Amazon Managed Workflows for Apache Airflow User Guide AWSTemplateFormatVersion: 2010-09-09 Description: This template deploys a VPN Client Endpoint. Resources: ClientVpnEndpoint: Type: 'AWS::EC2::ClientVpnEndpoint' Properties: AuthenticationOptions: - Type: "certificate-authentication" MutualAuthentication: ClientRootCertificateChainArn: "YOUR_CLIENT_ROOT_CERTIFICATE_ARN" ClientCidrBlock: 10.192.0.0/22 ClientConnectOptions: Enabled: false ConnectionLogOptions: Enabled: false Description: "MWAA Client VPN" DnsServers: [] SecurityGroupIds: - sg-0101010101 SelfServicePortal: '' ServerCertificateArn: "YOUR_SERVER_CERTIFICATE_ARN" SplitTunnel: true TagSpecifications: - ResourceType: "client-vpn-endpoint" Tags: - Key: Name Value: MWAA-Client-VPN TransportProtocol: udp VpcId: vpc-010101010101 VpnPort: 443 Note If you're using more than one security group for your environment, you can specify multiple security groups in the following format: SecurityGroupIds: - sg-0112233445566778b Step three: Save the AWS CloudFormation template locally 250 Amazon Managed Workflows for Apache Airflow User Guide - sg-0223344556677889f Step four: Create the Client VPN AWS CloudFormation stack To create the AWS Client VPN 1. Open the AWS CloudFormation console. 2. Choose Template is ready, Upload a template file. 3. Choose Choose file, and select your mwaa_vpn_client.yaml file. 4. 5. Choose Next, Next. 6. Select the acknowledgement, and then choose Create stack. Step five: Associate subnets to your Client VPN To associate private subnets to the AWS Client VPN 1. Open the Amazon VPC console. 2. Choose the Client VPN Endpoints
three: Save the AWS CloudFormation template locally 250 Amazon Managed Workflows for Apache Airflow User Guide - sg-0223344556677889f Step four: Create the Client VPN AWS CloudFormation stack To create the AWS Client VPN 1. Open the AWS CloudFormation console. 2. Choose Template is ready, Upload a template file. 3. Choose Choose file, and select your mwaa_vpn_client.yaml file. 4. 5. Choose Next, Next. 6. Select the acknowledgement, and then choose Create stack. Step five: Associate subnets to your Client VPN To associate private subnets to the AWS Client VPN 1. Open the Amazon VPC console. 2. Choose the Client VPN Endpoints page. 3. Select your Client VPN, and then choose the Associations tab, Associate. 4. Choose the following in the dropdown list: • Your Amazon VPC in VPC. • One of your private subnets in Choose a subnet to associate. 5. Choose Associate. Note It takes several minutes for the VPC and subnet to be associated to the Client VPN. Step four: Create the Client VPN AWS CloudFormation stack 251 Amazon Managed Workflows for Apache Airflow User Guide Step six: Add an authorization ingress rule to your Client VPN You need to add an authorization ingress rule using the CIDR rule for your VPC to your Client VPN. If you want to authorize specific users or groups from your Active Directory Group or SAML-based Identity Provider (IdP), see the Authorization rules in the Client VPN guide. To add the CIDR to the AWS Client VPN 1. Open the Amazon VPC console. 2. Choose the Client VPN Endpoints page. 3. 4. Select your Client VPN, and then choose the Authorization tab, Authorize Ingress. Specify the following: • Your Amazon VPC's CIDR rule in Destination network to enable. For example: 10.192.0.0/16 • Choose Allow access to all users in Grant access to. • Enter a descriptive name in Description. 5. Choose Add Authorization rule. Note Depending on the networking components for your Amazon VPC, you may also need to this authorization ingress rule to your network access control list (NACL). Step seven: Download the Client VPN endpoint configuration file To download the configuration file 1. Follow these quick steps to download the Client VPN configuration file at Download the Client VPN endpoint configuration file. 2. In these steps, you're asked to prepend a string to your Client VPN endpoint DNS name. Here's an example: Step six: Add an authorization ingress rule to your Client VPN 252 Amazon Managed Workflows for Apache Airflow User Guide • Example endpoint DNS name If your Client VPN endpoint DNS name looks like this: remote cvpn-endpoint-0909091212aaee1.prod.clientvpn.us-west-1.amazonaws.com 443 You can add a string to identify your Client VPN endpoint like this: remote mwaavpn.cvpn-endpoint-0909091212aaee1.prod.clientvpn.us- west-1.amazonaws.com 443 3. In these steps, you're asked to add the contents of the client certificate between a new set of <cert></cert> tags and the contents of the private key between a new set of <key></key> tags. Here's an example: a. Open a command prompt and change directories to the location of your client certificate and private key. b. Example macOS client1.domain.tld.crt To show the contents of the client1.domain.tld.crt file on macOS, you can use cat client1.domain.tld.crt. Copy the value from terminal and paste in downloaded-client-config.ovpn like this: ZZZ1111dddaBBB -----END CERTIFICATE----- </ca> <cert> -----BEGIN CERTIFICATE----- YOUR client1.domain.tld.crt -----END CERTIFICATE----- </cert> c. 
Example macOS client1.domain.tld.key To show the contents of the client1.domain.tld.key, you can use cat client1.domain.tld.key. Copy the value from terminal and paste in downloaded-client-config.ovpn like this: Step seven: Download the Client VPN endpoint configuration file 253 Amazon Managed Workflows for Apache Airflow User Guide ZZZ1111dddaBBB -----END CERTIFICATE----- </ca> <cert> -----BEGIN CERTIFICATE----- YOUR client1.domain.tld.crt -----END CERTIFICATE----- </cert> <key> -----BEGIN CERTIFICATE----- YOUR client1.domain.tld.key -----END CERTIFICATE----- </key> Step eight: Connect to the AWS Client VPN The client for AWS Client VPN is provided free of charge. You can connect your computer directly to AWS Client VPN for an end-to-end VPN experience. To connect to the Client VPN 1. Download and install the AWS Client VPN for Desktop. 2. Open the AWS Client VPN. 3. Choose File, Managed profiles in the VPN client menu. 4. Choose Add profile, and then choose the downloaded-client-config.ovpn. 5. Enter a descriptive name in Display Name. 6. Choose Add profile, Done. 7. Choose Connect. After you connect to the Client VPN, you'll need to disconnect from other VPNs to view any of the resources in your Amazon VPC. Note You may need to quit the client, and start again before you're able to get connected. Step eight: Connect to the AWS Client VPN 254 Amazon Managed Workflows for Apache Airflow User Guide What's next? • Learn how to create an Amazon MWAA environment in Get started with Amazon Managed Workflows for Apache Airflow. You must create an environment in the same AWS Region as the Client VPN, and using the same VPC, private subnets,
After you connect to the Client VPN, you'll need to disconnect from other VPNs to view any of the resources in your Amazon VPC. Note You may need to quit the client, and start again before you're able to get connected. Step eight: Connect to the AWS Client VPN 254 Amazon Managed Workflows for Apache Airflow User Guide What's next? • Learn how to create an Amazon MWAA environment in Get started with Amazon Managed Workflows for Apache Airflow. You must create an environment in the same AWS Region as the Client VPN, and using the same VPC, private subnets, and security group as the Client VPN. Tutorial: Configuring private network access using a Linux Bastion Host This tutorial walks you through the steps to create an SSH tunnel from your computer to the to the Apache Airflow Web server for your Amazon Managed Workflows for Apache Airflow environment. It assumes you've already created an Amazon MWAA environment. Once set up, a Linux Bastion Host acts as a jump server allowing a secure connection from your computer to the resources in your VPC. You'll then use a SOCKS proxy management add-on to control the proxy settings in your browser to access your Apache Airflow UI. Sections • Private network • Use cases • Before you begin • Objectives • Step one: Create the bastion instance • Step two: Create the ssh tunnel • Step three: Configure the bastion security group as an inbound rule • Step four: Copy the Apache Airflow URL • Step five: Configure proxy settings • Step six: Open the Apache Airflow UI • What's next? Private network This tutorial assumes you've chosen the Private network access mode for your Apache Airflow Web server. What's next? 255 Amazon Managed Workflows for Apache Airflow User Guide The private network access mode limits access to the Apache Airflow UI to users within your Amazon VPC that have been granted access to the IAM policy for your environment. When you create an environment with private web server access, you must package all of your dependencies in a Python wheel archive (.whl), then reference the .whl in your requirements.txt. For instructions on packaging and installing your dependencies using wheel, see Managing dependencies using Python wheel. The following image shows where to find the Private network option on the Amazon MWAA console. Use cases You can use this tutorial after you've created an Amazon MWAA environment. You must use the same Amazon VPC, VPC security group(s), and public subnets as your environment. Use cases 256 Amazon Managed Workflows for Apache Airflow User Guide Before you begin 1. Check for user permissions. Be sure that your account in AWS Identity and Access Management (IAM) has sufficient permissions to create and manage VPC resources. 2. Use your Amazon MWAA VPC. This tutorial assumes that you are associating the bastion host to an existing VPC. The Amazon VPC must be in the same region as your Amazon MWAA environment and have two private subnets, as defined in Create the VPC network. 3. Create an SSH key. You need to create an Amazon EC2 SSH key (.pem) in the same Region as your Amazon MWAA environment to connect to the virtual servers. If you don't have an SSH key, see Create or import a key pair in the Amazon EC2 User Guide. Objectives In this tutorial, you'll do the following: 1. Create a Linux Bastion Host instance using a AWS CloudFormation template for an existing VPC. 2. Authorize inbound traffic to the bastion instance's security group using an ingress rule on port 22. 3. 
Authorize inbound traffic from an Amazon MWAA environment's security group to the bastion instance's security group. 4. Create an SSH tunnel to the bastion instance. 5. Install and configure the FoxyProxy add-on for the Firefox browser to view the Apache Airflow UI. Step one: Create the bastion instance The following section describes the steps to create the linux bastion instance using a AWS CloudFormation template for an existing VPC on the AWS CloudFormation console. To create the Linux Bastion Host 1. Open the Deploy Quick Start page on the AWS CloudFormation console. 2. Use the region selector in the navigation bar to choose the same AWS Region as your Amazon MWAA environment. Before you begin 257 Amazon Managed Workflows for Apache Airflow User Guide 3. Choose Next. 4. Type a name in the Stack name text field, such as mwaa-linux-bastion. 5. On the Parameters, Network configuration pane, choose the following options: a. Choose your Amazon MWAA environment's VPC ID. b. Choose your Amazon MWAA environment's Public subnet 1 ID. c. d. Choose your Amazon MWAA environment's Public subnet 2 ID. Enter the narrowest possible address range (for example, an internal CIDR range) in Allowed bastion external access CIDR. Note The simplest way
choose the same AWS Region as your Amazon MWAA environment. Before you begin 257 Amazon Managed Workflows for Apache Airflow User Guide 3. Choose Next. 4. Type a name in the Stack name text field, such as mwaa-linux-bastion. 5. On the Parameters, Network configuration pane, choose the following options: a. Choose your Amazon MWAA environment's VPC ID. b. Choose your Amazon MWAA environment's Public subnet 1 ID. c. d. Choose your Amazon MWAA environment's Public subnet 2 ID. Enter the narrowest possible address range (for example, an internal CIDR range) in Allowed bastion external access CIDR. Note The simplest way to identify a range is to use the same CIDR range as your public subnets. For example, the public subnets in the AWS CloudFormation template on the Create the VPC network page are 10.192.10.0/24 and 10.192.11.0/24. 6. On the Amazon EC2 configuration pane, choose the following: a. b. c. Choose your SSH key in the dropdown list in Key pair name. Enter a name in Bastion Host Name. Choose true for TCP forwarding. Warning TCP forwarding must be set to true in this step. Otherwise, you won't be able to create an SSH tunnel in the next step. 7. Choose Next, Next. 8. Select the acknowledgement, and then choose Create stack. To learn more about the architecture of your Linux Bastion Host, see Linux Bastion Hosts on the AWS Cloud: Architecture. Step one: Create the bastion instance 258 Amazon Managed Workflows for Apache Airflow User Guide Step two: Create the ssh tunnel The following steps describe how to create the ssh tunnel to your linux bastion. An SSH tunnel recieves the request from your local IP address to the linux bastion, which is why TCP forwarding for the linux bastion was set to true in previous steps. macOS/Linux To create a tunnel via command line 1. Open the Instances page on the Amazon EC2 console. 2. Choose an instance. 3. Copy the address in Public IPv4 DNS. For example, ec2-4-82-142-1.compute-1.amazonaws.com. 4. In your command prompt, navigate to the directory where your SSH key is stored. 5. Run the following command to connect to the bastion instance using ssh. Substitute the sample value with your SSH key name in mykeypair.pem. ssh -i mykeypair.pem -N -D 8157 ec2-user@YOUR_PUBLIC_IPV4_DNS Windows (PuTTY) To create a tunnel using PuTTY 1. Open the Instances page on the Amazon EC2 console. 2. Choose an instance. 3. Copy the address in Public IPv4 DNS. For example, ec2-4-82-142-1.compute-1.amazonaws.com. 4. Open PuTTY, select Session. 5. 6. Enter the host name in Host Name as ec2-user@YOUR_PUBLIC_IPV4_DNS and the port as 22. Expand the SSH tab, select Auth. In Private Key file for authentication, choose your local "ppk" file. 7. Under SSH, choose the Tunnels tab, and then select the Dynamic and Auto options. Step two: Create the ssh tunnel 259 Amazon Managed Workflows for Apache Airflow User Guide 8. In Source Port, add the 8157 port (or any other unused port), and then leave the Destination port blank. Choose Add. 9. Choose the Session tab and enter a session name. For example SSH Tunnel. 10. Choose Save, Open. Note You may need to enter a pass phrase for your public key. Note If you receive a Permission denied (publickey) error, we recommend using the AWSSupport-TroubleshootSSH tool, and choose Run this Automation (console) to troubleshoot your SSH setup. 
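Before you continue to the next step, you can optionally confirm that the tunnel's local SOCKS port is listening. The following Python sketch is an illustration only and assumes you used port 8157, as in the examples above.

import socket

# Optional check: confirm the SSH tunnel is listening on the local SOCKS port (8157).
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(2)
    result = s.connect_ex(("localhost", 8157))
print("Tunnel port is open" if result == 0 else "Tunnel port is not reachable")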
Step three: Configure the bastion security group as an inbound rule Access to the servers and regular internet access from the servers is allowed with a special maintenance security group attached to those servers. The following steps describe how to configure the bastion security group as an inbound source of traffic to an environment's VPC security group. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. On the Networking pane, choose VPC security group. 4. Choose Edit inbound rules. 5. Choose Add rule. 6. Choose your VPC security group ID in the Source dropdown list. 7. Leave the remaining options blank, or set to their default values. 8. Choose Save rules. Step three: Configure the bastion security group as an inbound rule 260 Amazon Managed Workflows for Apache Airflow User Guide Step four: Copy the Apache Airflow URL The following steps describe how to open the Amazon MWAA console and copy the URL to the Apache Airflow UI. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Copy the URL in Airflow UI for subsequent steps. Step five: Configure proxy settings If you use an SSH tunnel with dynamic port forwarding, you must use a SOCKS proxy management add-on to control the proxy settings in your browser. For example, you can use the --proxy- server feature of Chromium to kick off a browser session, or use the FoxyProxy extension in the Mozilla
The following steps describe how to open the Amazon MWAA console and copy the URL to the Apache Airflow UI. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Copy the URL in Airflow UI for subsequent steps. Step five: Configure proxy settings If you use an SSH tunnel with dynamic port forwarding, you must use a SOCKS proxy management add-on to control the proxy settings in your browser. For example, you can use the --proxy- server feature of Chromium to kick off a browser session, or use the FoxyProxy extension in the Mozilla FireFox browser. Option one: Setup an SSH Tunnel using local port forwarding If you do not wish to use a SOCKS proxy, you can set up an SSH tunnel using local port forwarding. The following example command accesses the Amazon EC2 ResourceManager web interface by forwarding traffic on local port 8157. 1. Open a new command prompt window. 2. Type the following command to open an SSH tunnel. ssh -i mykeypair.pem -N -L 8157:YOUR_VPC_ENDPOINT_ID- vpce.YOUR_REGION.airflow.amazonaws.com:443 ubuntu@YOUR_PUBLIC_IPV4_DNS.YOUR_REGION.compute.amazonaws.com -L signifies the use of local port forwarding which allows you to specify a local port used to forward data to the identified remote port on the node's local web server. 3. Type http://localhost:8157/ in your browser. Note You may need to use https://localhost:8157/. Step four: Copy the Apache Airflow URL 261 Amazon Managed Workflows for Apache Airflow User Guide Option two: Proxies via command line Most web browsers allow you to configure proxies via a command line or configuration parameter. For example, with Chromium you can start the browser with the following command: chromium --proxy-server="socks5://localhost:8157" This starts a browser session which uses the ssh tunnel you created in previous steps to proxy its requests. You can open your Private Amazon MWAA environment URL (with https://) as follows: https://YOUR_VPC_ENDPOINT_ID-vpce.YOUR_REGION.airflow.amazonaws.com/home. Option three: Proxies using FoxyProxy for Mozilla Firefox The following example demonstrates a FoxyProxy Standard (version 7.5.1) configuration for Mozilla Firefox. FoxyProxy provides a set of proxy management tools. It lets you use a proxy server for URLs that match patterns corresponding to domains used by the Apache Airflow UI. 1. In Firefox, open the FoxyProxy Standard extension page. 2. Choose Add to Firefox. 3. Choose Add. 4. Choose the FoxyProxy icon in your browser's toolbar, choose Options. 5. Copy the following code and save locally as mwaa-proxy.json. Substitute the sample value in YOUR_HOST_NAME with your Apache Airflow URL. 
{ "e0b7kh1606694837384": { "type": 3, "color": "#66cc66", "title": "airflow", "active": true, "address": "localhost", "port": 8157, "proxyDNS": false, "username": "", "password": "", "whitePatterns": [ { Step five: Configure proxy settings 262 Amazon Managed Workflows for Apache Airflow User Guide "title": "airflow-ui", "pattern": "YOUR_HOST_NAME", "type": 1, "protocols": 1, "active": true } ], "blackPatterns": [], "pacURL": "", "index": -1 }, "k20d21508277536715": { "active": true, "title": "Default", "notes": "These are the settings that are used when no patterns match a URL.", "color": "#0055E5", "type": 5, "whitePatterns": [ { "title": "all URLs", "active": true, "pattern": "*", "type": 1, "protocols": 1 } ], "blackPatterns": [], "index": 9007199254740991 }, "logging": { "active": true, "maxSize": 500 }, "mode": "patterns", "browserVersion": "82.0.3", "foxyProxyVersion": "7.5.1", "foxyProxyEdition": "standard" } 6. On the Import Settings from FoxyProxy 6.0+ pane, choose Import Settings and select the mwaa-proxy.json file. 7. Choose OK. Step five: Configure proxy settings 263 Amazon Managed Workflows for Apache Airflow User Guide Step six: Open the Apache Airflow UI The following steps describe how to open your Apache Airflow UI. 1. Open the Environments page on the Amazon MWAA console. 2. Choose Open Airflow UI. What's next? • Learn how to run Airflow CLI commands on an SSH tunnel to a bastion host in Apache Airflow CLI command reference. • Learn how to upload DAG code to your Amazon S3 bucket in Adding or updating DAGs. Tutorial: Restricting an Amazon MWAA user's access to a subset of DAGs Amazon MWAA manages access to your environment by mapping your IAM principals to one or more of Apache Airflow's default roles. The following tutorial shows how you can restrict individual Amazon MWAA users to only view and interact with a specific DAG or a set of DAGs. Note The steps in this tutorial can be completed using federated access, as long as the IAM roles can be assumed. Topics • Prerequisites • Step one: Provide Amazon MWAA web server access to your IAM principal with the default Public Apache Airflow role. • Step two: Create a new Apache Airflow custom role • Step three: Assign the role you created to your Amazon MWAA user • Next steps • Related resources Step six: Open the Apache Airflow UI 264 Amazon Managed Workflows for Apache Airflow User Guide Prerequisites To complete the steps in this tutorial, you'll need the following: • An Amazon MWAA environment with multiple DAGs • An
access, as long as the IAM roles can be assumed. Topics • Prerequisites • Step one: Provide Amazon MWAA web server access to your IAM principal with the default Public Apache Airflow role. • Step two: Create a new Apache Airflow custom role • Step three: Assign the role you created to your Amazon MWAA user • Next steps • Related resources Step six: Open the Apache Airflow UI 264 Amazon Managed Workflows for Apache Airflow User Guide Prerequisites To complete the steps in this tutorial, you'll need the following: • An Amazon MWAA environment with multiple DAGs • An IAM principal, Admin with AdministratorAccess permissions, and an IAM user, MWAAUser, as the principal for which you can limit DAG access. For more information about admin roles, see Administrator job function in the IAM User Guide Note Do not attach permission policies directly to your IAM users. We recommend setting up IAM roles that users can assume to gain temporary access to your Amazon MWAA resources. • AWS Command Line Interface version 2 installed. Step one: Provide Amazon MWAA web server access to your IAM principal with the default Public Apache Airflow role. To grant permission using the AWS Management Console 1. 2. Sign in to your AWS account with an Admin role and open the IAM console. In the left navigation pane, choose Users, then choose your Amazon MWAA IAM user from the users table. 3. On the user details page, under Summary, choose the Permissions tab, then choose Permissions policies to expand the card and choose Add permissions. 4. In the Grant permissions section, choose Attach existing policies directly, then choose Create policy to create and attach your own custom permissions policy. 5. On the Create policy page, choose JSON, then copy and paste the following JSON permissions policy in the policy editor. Tha policy grants web server access to the user with the default Public Apache Airflow role. { "Version": "2012-10-17", "Statement": [ { Prerequisites 265 Amazon Managed Workflows for Apache Airflow User Guide "Effect": "Allow", "Action": "airflow:CreateWebLoginToken", "Resource": [ "arn:aws:airflow:YOUR_REGION:YOUR_ACCOUNT_ID:role/YOUR_ENVIRONMENT_NAME/Public" ] } ] } Step two: Create a new Apache Airflow custom role To create a new role using the Apache Airflow UI 1. Using your administrator IAM role, open the Amazon MWAA console and launch your environment's Apache Airflow UI. 2. 3. From the navigation pane at the top, hover on Security to open the dropdown list, then choose List Roles to view the default Apache Airflow roles. From the roles list, select User, then at the top of the page choose Actions to open the dropdown. Choose Copy Role, and confirm Ok Note Copy the Ops or Viewer roles to grant more or less access, respectively. 4. Locate the new role you created in the table and choose Edit record. 5. On the Edit Role page, do the following: • For Name, type a new name for the role in the text field. For example, Restricted. • For the list of Permissions, remove can read on DAGs and can edit on DAGs, then add read and write permissions for the set of DAGs you want to provide access to. For example, for a DAG, example_dag.py, add can read on DAG:example_dag and can edit on DAG:example_dag. Choose Save. You should now have a new role that limits access to a subset of DAGs available in your Amazon MWAA environment. You can now assign this role to any existing Apache Airflow users. 
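Before you assign the role in the next step, you can optionally confirm that the IAM policy from step one works by requesting a web login token as the restricted principal. The following boto3 sketch is an illustration only; the tutorial itself uses the console and the AWS CLI, and the environment name and Region are placeholders.

import boto3

# Optional sketch: exercise the airflow:CreateWebLoginToken permission granted in step one.
# Run this with the restricted principal's credentials, and replace the placeholder values.
mwaa = boto3.client("mwaa", region_name="us-west-2")
token = mwaa.create_web_login_token(Name="YOUR_ENVIRONMENT_NAME")
print("Web server hostname:", token["WebServerHostname"])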
Step two: Create a new Apache Airflow custom role 266 Amazon Managed Workflows for Apache Airflow User Guide Step three: Assign the role you created to your Amazon MWAA user To assign the new role 1. Using access credentials for MWAAUser, run the following CLI command to retrieve your environment's web server URL. $ aws mwaa get-environment --name YOUR_ENVIRONMENT_NAME | jq '.Environment.WebserverUrl' If successful, you'll see the following output: "ab1b2345-678a-90a1-a2aa-34a567a8a901.c13.us-west-2.airflow.amazonaws.com" 2. With MWAAUser signed in to the AWS Management Console, open a new browser window and access the following URl. Replace Webserver-URL with your information. https://<Webserver-URL>/home If successful, you'll see a Forbidden error page because MWAAUser has not been granted permission to access the Apache Airflow UI yet. 3. With Admin signed in to the AWS Management Console, open the Amazon MWAA console again and launch your environment's Apache Airflow UI. 4. 5. From the UI dashboard, expand the Security dropdown, and this time choose List Users. In the users table, find the new Apache Airflow user and choose Edit record. The user's first name will match your IAM user name in the following pattern: user/mwaa-user. 6. On the Edit User page, in the Role section, add the new custom role you created, then choose Save. Note The Last Name field is required, but a space satisfies the requirement. The IAM Public principal grants
the AWS Management Console, open the Amazon MWAA console again and launch your environment's Apache Airflow UI. 4. 5. From the UI dashboard, expand the Security dropdown, and this time choose List Users. In the users table, find the new Apache Airflow user and choose Edit record. The user's first name will match your IAM user name in the following pattern: user/mwaa-user. 6. On the Edit User page, in the Role section, add the new custom role you created, then choose Save. Note The Last Name field is required, but a space satisfies the requirement. The IAM Public principal grants the MWAAUser permission to access the Apache Airflow UI, while the new role provides the additional permissions needed to see their DAGs. Step three: Assign the role you created to your Amazon MWAA user 267 Amazon Managed Workflows for Apache Airflow User Guide Important Any of the 5 default roles (such as Admin) not authorized by IAM which are added using the Apache Airflow UI will be removed on next user login. Next steps • To learn more about managing access to your Amazon MWAA environment, and to see sample JSON IAM policies you can use for your environment users, see the section called “Accessing an Amazon MWAA environment” Related resources • Access Control (Apache Airflow Documentation) – Learn more about the default Apache Airflow roles on the Apache Airflow documentation website. Tutorial: Automate managing your own environment endpoints on Amazon MWAA If you use AWS Organizations to manage multiple AWS accounts that share resources, Amazon MWAA lets you create, and manage, your own Amazon VPC endpoints. This means you can use stricter security policies that allow access only the resources required by your environment. When you create an environment in a shared Amazon VPC, the account that owns the main Amazon VPC (owner) shares the two private subnets required by Amazon MWAA with other accounts (participants) that belong to the same organization. Participant accounts that share those subnets can then view, create, modify, and delete environments in the shared VPC. When you create an environment in a shared, or otherwise policy-restricted, Amazon VPC, Amazon MWAA will first create the service VPC resources, then enter a PENDING state for up to 72 hours. When the environment status changes from CREATING to PENDING, Amazon MWAA sends an Amazon EventBridge notification of the change in state. This lets the owner account create the required endpoints on behalf of participants based on endpoint service information from the Amazon MWAA console or API, or programmatically In the following, we create new Amazon VPC Next steps 268 Amazon Managed Workflows for Apache Airflow User Guide endpoints using an Lambda function and an EventBridge rule that listens to Amazon MWAA state change notifications. Here, we create the new endpoints in the same Amazon VPC as the environment. To set up a shared Amazon VPC, create the EventBridge rule and Lambda function would in the owner account, and the Amazon MWAA environment in the participant account. Topics • Prerequisites • Create the Amazon VPC • Create the Lambda function • Create the EventBridge rule • Create the Amazon MWAA environment Prerequisites To complete the steps in this tutorial, you will need the following: • ... Create the Amazon VPC Use the following AWS CloudFormation template and AWS CLI command to create a new Amazon VPC. The template sets up the Amazon VPC resources and modifies the endpoint policy to restrict access to a specific queue. 1. 
Download the AWS CloudFormation template, then unzip the .yml file. 2. In a new command prompt window, navigate to the folder where you saved the template, then use create-stack to create the stack. The --template-body flag specifies the path to the template. $ aws cloudformation create-stack --stack-name stack-name --template-body file:// cfn-vpc-private-network.yml In the next section, you'll create the Lambda function. Prerequisites 269 Amazon Managed Workflows for Apache Airflow User Guide Create the Lambda function Use the following Python code and IAM JSON policy to create a new Lambda function and execution role. This function creates Amazon VPC endpoints for a private Apache Airflow web server and an Amazon SQS queue. Amazon MWAA uses Amazon SQS to queue tasks with Celery among multiple workers when scaling your environment. 1. Download the Python function code. 2. Download the IAM permission policy, then unzip the file. 3. Open a command prompt, then navigate to the folder where you saved the JSON permission policy. Use the IAM create-role command to create the new role. $ aws iam create-role --role-name function-role \ --assume-role-policy-document file://lambda-mwaa-vpce-policy.json Note the role ARN from the AWS CLI response. In the next step, we specify this new role as the function's execution role using its ARN. 4. Navigate to the folder where you saved the function code, then use thecreate-function command to
multiple workers when scaling your environment. 1. Download the Python function code. 2. Download the IAM permission policy, then unzip the file. 3. Open a command prompt, then navigate to the folder where you saved the JSON permission policy. Use the IAM create-role command to create the new role. $ aws iam create-role --role-name function-role \ --assume-role-policy-document file://lambda-mwaa-vpce-policy.json Note the role ARN from the AWS CLI response. In the next step, we specify this new role as the function's execution role using its ARN. 4. Navigate to the folder where you saved the function code, then use thecreate-function command to create a new function. $ aws lambda create-function --function-name mwaa-vpce-lambda \ --zip-file file://mwaa-lambda-shared-vpc.zip --runtime python3.8 --role arn:aws:iam::123456789012:role/function-role --handler lambda_handler Note the function ARN from the AWS CLI response. In the next step we specify the ARN to configure the function as a target for a new EventBridge rule. In the next section, you will create the EventBridge rule that invokes this function when the environment enters a PENDING state. Create the EventBridge rule Do the following to create a new rule that listens for Amazon MWAA notifications and targets your new Lambda function. 1. Use the EventBridge put-rule command to create a new EventBridge rule. $ aws events put-rule --name "mwaa-lambda-rule" \ Create the Lambda function 270 Amazon Managed Workflows for Apache Airflow User Guide --event-pattern "{\"source\":[\"aws.airflow\"],\"detail-type\":[\"MWAA Environment Status Change\"]}" The event pattern listens for notifications that Amazon MWAA sends whenever an environment status changes. { "source": ["aws.airflow"], "detail-type": ["MWAA Environment Status Change"] } 2. Use the put-targets command to add the Lambda function as a target for the new rule. $ aws events put-targets --rule "mwaa-lambda-rule" \ --targets "Id"="1","Arn"="arn:aws::lambda:region:123456789012:function:mwaa-vpce- lambda" You're ready to create a new Amazon MWAA environment with customer-managed Amazon VPC endpoints. Create the Amazon MWAA environment Use the Amazon MWAA console to create a new environment with customer-managed Amazon VPC endpoints. 1. Open the Amazon MWAA console, and choose Create an environment. 2. 3. For Name enter a unique name. For Airflow version choose the latest version. 4. Choose an Amazon S3 bucket and a DAGs folder, such as dags/ to use with the environment, then choose Next. 5. On the Configure advanced settings page, do the following: a. b. c. For Virtual Private Cloud, choose the Amazon VPC you created in the previous step. For Web server access, choose Public network (Internet accessible). For Security groups, choose the security group you created with AWS CloudFormation. Because the security groups for the AWS PrivateLink endpoints from the earlier step are self-referencing, you must choose the same security group for your environment. Create the environment 271 Amazon Managed Workflows for Apache Airflow User Guide d. For Endpoint management, choose Customer managed endpoints. 6. Keep the remaining default settings, then choose Next. 7. Review your selections, then choose Create environment. Tip For more information about setting up a new environment, see Getting started with Amazon MWAA. When the environment is PENDING, Amazon MWAA sends a notification that matches the event pattern you set for your rule. The rule invokes your Lambda function. 
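As a rough illustration of this flow, a handler might look like the following sketch. This is not the downloadable function's code: the notification field names, the Environment response fields, and the resource IDs shown are assumptions you must verify against a real event and your own account, and the actual function also handles the Amazon SQS queue endpoint.

import boto3

ec2 = boto3.client("ec2")
mwaa = boto3.client("mwaa")

def lambda_handler(event, context):
    # Assumption: the notification detail includes the environment name and status.
    # Inspect a real event to confirm the field names before relying on them.
    detail = event.get("detail", {})
    if detail.get("status") != "PENDING":
        return
    environment = mwaa.get_environment(Name=detail["name"])["Environment"]
    # Endpoint service names published for customer-managed endpoints.
    services = [environment["DatabaseVpcEndpointService"]]
    if environment.get("WebserverVpcEndpointService"):
        services.append(environment["WebserverVpcEndpointService"])
    for service_name in services:
        ec2.create_vpc_endpoint(
            VpcEndpointType="Interface",
            VpcId="vpc-010101010101",                          # placeholder: your Amazon VPC
            ServiceName=service_name,
            SubnetIds=["subnet-11111111", "subnet-22222222"],  # placeholder: the environment's subnets
            SecurityGroupIds=["sg-0101010101"],                # placeholder: the environment's security group
        )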
The function parses the notification event and gets the required endpoint information for the web server and the Amazon SQS queue. It then creates the endpoints in your Amazon VPC. When the endpoints are available, Amazon MWAA resumes creating your environment. When ready, the environment status changes to AVAILABLE and you can access the Apache Airflow web server using the Amazon MWAA console. Create the environment 272 Amazon Managed Workflows for Apache Airflow User Guide Code examples for Amazon Managed Workflows for Apache Airflow This guide contains code samples, including DAGs and custom plugins, that you can use on an Amazon Managed Workflows for Apache Airflow environment. For more examples of using Apache Airflow with AWS services, see the dags directory in the Apache Airflow GitHub repository. Samples • Using a DAG to import variables in the CLI • Creating an SSH connection using the SSHOperator • Using a secret key in AWS Secrets Manager for an Apache Airflow Snowflake connection • Using a DAG to write custom metrics in CloudWatch • Aurora PostgreSQL database cleanup on an Amazon MWAA environment • Exporting environment metadata to CSV files on Amazon S3 • Using a secret key in AWS Secrets Manager for an Apache Airflow variable • Using a secret key in AWS Secrets Manager for an Apache Airflow connection • Creating a custom plugin with Oracle • Creating a custom plugin that generates runtime environment variables • Changing a DAG's timezone on Amazon MWAA • Refreshing a CodeArtifact token • Creating a custom plugin with Apache Hive and Hadoop • Creating a custom plugin for Apache Airflow PythonVirtualenvOperator • Invoking DAGs with a Lambda function • Invoking DAGs in different Amazon MWAA environments • Using Amazon
CSV files on Amazon S3 • Using a secret key in AWS Secrets Manager for an Apache Airflow variable • Using a secret key in AWS Secrets Manager for an Apache Airflow connection • Creating a custom plugin with Oracle • Creating a custom plugin that generates runtime environment variables • Changing a DAG's timezone on Amazon MWAA • Refreshing a CodeArtifact token • Creating a custom plugin with Apache Hive and Hadoop • Creating a custom plugin for Apache Airflow PythonVirtualenvOperator • Invoking DAGs with a Lambda function • Invoking DAGs in different Amazon MWAA environments • Using Amazon MWAA with Amazon RDS for Microsoft SQL Server • Using Amazon MWAA with Amazon EMR • Using Amazon MWAA with Amazon EKS • Connecting to Amazon ECS using the ECSOperator • Using dbt with Amazon MWAA • AWS blogs and tutorials 273 Amazon Managed Workflows for Apache Airflow User Guide Using a DAG to import variables in the CLI The following sample code imports variables using the CLI on Amazon Managed Workflows for Apache Airflow. Topics • Version • Prerequisites • Permissions • Dependencies • Code sample • What's next? Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites • No additional permissions are required to use the code example on this page. Permissions Your AWS account needs access to the AmazonMWAAAirflowCliAccess policy. To learn more, see Apache Airflow CLI policy: AmazonMWAAAirflowCliAccess. Dependencies • To use this code example with Apache Airflow v2, no additional dependencies are required. The code uses the Apache Airflow v2 base install on your environment. Import variables DAG 274 Amazon Managed Workflows for Apache Airflow User Guide Code sample The following sample code takes three inputs: your Amazon MWAA environment name (in mwaa_env), the AWS Region of your environment (in aws_region), and the local file that contains the variables you want to import (in var_file). 
import boto3 import json import requests import base64 import getopt import sys argv = sys.argv[1:] mwaa_env='' aws_region='' var_file='' try: opts, args = getopt.getopt(argv, 'e:v:r:', ['environment', 'variable- file','region']) #if len(opts) == 0 and len(opts) > 3: if len(opts) != 3: print ('Usage: -e MWAA environment -v variable file location and filename -r aws region') else: for opt, arg in opts: if opt in ("-e"): mwaa_env=arg elif opt in ("-r"): aws_region=arg elif opt in ("-v"): var_file=arg boto3.setup_default_session(region_name="{}".format(aws_region)) mwaa_env_name = "{}".format(mwaa_env) client = boto3.client('mwaa') mwaa_cli_token = client.create_cli_token( Name=mwaa_env_name ) with open ("{}".format(var_file), "r") as myfile: Code sample 275 Amazon Managed Workflows for Apache Airflow User Guide fileconf = myfile.read().replace('\n', '') json_dictionary = json.loads(fileconf) for key in json_dictionary: print(key, " ", json_dictionary[key]) val = (key + " " + json_dictionary[key]) mwaa_auth_token = 'Bearer ' + mwaa_cli_token['CliToken'] mwaa_webserver_hostname = 'https://{0}/aws_mwaa/ cli'.format(mwaa_cli_token['WebServerHostname']) raw_data = "variables set {0}".format(val) mwaa_response = requests.post( mwaa_webserver_hostname, headers={ 'Authorization': mwaa_auth_token, 'Content-Type': 'text/plain' }, data=raw_data ) mwaa_std_err_message = base64.b64decode(mwaa_response.json() ['stderr']).decode('utf8') mwaa_std_out_message = base64.b64decode(mwaa_response.json() ['stdout']).decode('utf8') print(mwaa_response.status_code) print(mwaa_std_err_message) print(mwaa_std_out_message) except: print('Use this script with the following options: -e MWAA environment -v variable file location and filename -r aws region') print("Unexpected error:", sys.exc_info()[0]) sys.exit(2) What's next? • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. Creating an SSH connection using the SSHOperator The following example describes how you can use the SSHOperator in a directed acyclic graph (DAG) to connect to a remote Amazon EC2 instance from your Amazon Managed Workflows for What's next? 276 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow environment. You can use a similar approach to connect to any remote instance with SSH access. In the following example, you upload a SSH secret key (.pem) to your environment's dags directory on Amazon S3. Then, you install the necessary dependencies using requirements.txt and create a new Apache Airflow connection in the UI. Finally, you write a DAG that creates an SSH connection to the remote instance. Topics • Version • Prerequisites • Permissions • Requirements • Copy your secret key to Amazon S3 • Create a new Apache Airflow connection • Code sample Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. • An SSH secret key. The code sample assumes you have an Amazon EC2 instance and a .pem in the same Region as your Amazon MWAA environment. If you don't have a key, see Create or import a key pair in the Amazon EC2 User Guide. Permissions • No additional permissions are required to use the code example on this page. Version 277 Amazon Managed Workflows for Apache Airflow User Guide Requirements Add the following parameter to requirements.txt to install the apache-airflow- providers-ssh package on
the sample code on this page, you'll need the following: • An Amazon MWAA environment. • An SSH secret key. The code sample assumes you have an Amazon EC2 instance and a .pem in the same Region as your Amazon MWAA environment. If you don't have a key, see Create or import a key pair in the Amazon EC2 User Guide. Permissions • No additional permissions are required to use the code example on this page. Version 277 Amazon Managed Workflows for Apache Airflow User Guide Requirements Add the following parameter to requirements.txt to install the apache-airflow- providers-ssh package on the web server. Once your environment updates and Amazon MWAA successfully installs the dependency, you will see a new SSH connection type in the UI. -c https://raw.githubusercontent.com/apache/airflow/constraints-Airflow-version/ constraints-Python-version.txt apache-airflow-providers-ssh Note -c defines the constraints URL in requirements.txt. This ensures that Amazon MWAA installs the correct package version for your environemnt. Copy your secret key to Amazon S3 Use the following AWS Command Line Interface command to copy your .pem key to your environment's dags directory in Amazon S3. $ aws s3 cp your-secret-key.pem s3://your-bucket/dags/ Amazon MWAA copies the content in dags, including the .pem key, to the local /usr/local/ airflow/dags/ directory, By doing this, Apache Airflow can access the key. Create a new Apache Airflow connection To create a new SSH connection using the Apache Airflow UI 1. Open the Environments page on the Amazon MWAA console. 2. From the list of environments, choose Open Airflow UI for your environment. 3. On the Apache Airflow UI page, choose Admin from the top navigation bar to expand the dropdown list, then choose Connections. 4. On the List Connections page, choose +, or Add a new record button to add a new connection. 5. On the Add Connection page, add the following information: Requirements 278 Amazon Managed Workflows for Apache Airflow User Guide a. b. c. d. For Connection Id, enter ssh_new. For Connection Type, choose SSH from the dropdown list. Note If the SSH connection type is not available in the list, Amazon MWAA hasn't installed the required apache-airflow-providers-ssh package. Update your requirements.txt file to include this package, then try again. For Host, enter the IP address for the Amazon EC2 instance that you want to connect to. For example, 12.345.67.89. For Username, enter ec2-user if you are connecting to an Amazon EC2 instance. Your username might be different, depending on the type of remote instance you want Apache Airflow to connect to. e. For Extra, enter the following key-value pair in JSON format: { "key_file": "/usr/local/airflow/dags/your-secret-key.pem" } This key-value pair instructs Apache Airflow to look for the secret key in the local /dags directory. Code sample The following DAG uses the SSHOperator to connect to your target Amazon EC2 instance, then runs the hostname Linux command to print the name of the instance. You can modify the DAG to run any command or script on the remote instance. 1. Open a terminal, and navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as ssh.py. 
from airflow.decorators import dag from datetime import datetime from airflow.providers.ssh.operators.ssh import SSHOperator Code sample 279 Amazon Managed Workflows for Apache Airflow User Guide @dag( dag_id="ssh_operator_example", schedule_interval=None, start_date=datetime(2022, 1, 1), catchup=False, ) def ssh_dag(): task_1=SSHOperator( task_id="ssh_task", ssh_conn_id='ssh_new', command='hostname', ) my_ssh_dag = ssh_dag() 3. Run the following AWS CLI command to copy the DAG to your environment's bucket, then trigger the DAG using the Apache Airflow UI. $ aws s3 cp your-dag.py s3://your-environment-bucket/dags/ 4. If successful, you'll see output similar to the following in the task logs for ssh_task in the ssh_operator_example DAG: [2022-01-01, 12:00:00 UTC] {{base.py:79}} INFO - Using connection to: id: ssh_new. Host: 12.345.67.89, Port: None, Schema: , Login: ec2-user, Password: None, extra: {'key_file': '/usr/local/airflow/ dags/your-secret-key.pem'} [2022-01-01, 12:00:00 UTC] {{ssh.py:264}} WARNING - Remote Identification Change is not verified. This won't protect against Man-In-The-Middle attacks [2022-01-01, 12:00:00 UTC] {{ssh.py:270}} WARNING - No Host Key Verification. This won't protect against Man-In-The-Middle attacks [2022-01-01, 12:00:00 UTC] {{transport.py:1819}} INFO - Connected (version 2.0, client OpenSSH_7.4) [2022-01-01, 12:00:00 UTC] {{transport.py:1819}} INFO - Authentication (publickey) successful! [2022-01-01, 12:00:00 UTC] {{ssh.py:139}} INFO - Running command: hostname [2022-01-01, 12:00:00 UTC]{{ssh.py:171}} INFO - ip-123-45-67-89.us- west-2.compute.internal [2022-01-01, 12:00:00 UTC] {{taskinstance.py:1280}} INFO - Marking task as SUCCESS. dag_id=ssh_operator_example, task_id=ssh_task, execution_date=20220712T200914, start_date=20220712T200915, end_date=20220712T200916 Code sample 280 Amazon Managed Workflows for Apache Airflow User Guide Using a secret key in AWS Secrets Manager for an Apache Airflow Snowflake connection The following sample calls AWS Secrets Manager to get a secret key for an Apache Airflow Snowflake connection on Amazon Managed Workflows for Apache Airflow. It assumes you've completed the steps in Configuring an Apache Airflow connection using a AWS Secrets Manager secret. Topics • Version • Prerequisites • Permissions • Requirements •
Using a secret key in AWS Secrets Manager for an Apache Airflow Snowflake connection

The following sample calls AWS Secrets Manager to get a secret key for an Apache Airflow Snowflake connection on Amazon Managed Workflows for Apache Airflow. It assumes you've completed the steps in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.

Topics
• Version
• Prerequisites
• Permissions
• Requirements
• Code sample
• What's next?

Version
• You can use the code example on this page with Apache Airflow v2 in Python 3.10.

Prerequisites
To use the sample code on this page, you'll need the following:
• The Secrets Manager backend as an Apache Airflow configuration option as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.
• An Apache Airflow connection string in Secrets Manager as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.

Permissions
• Secrets Manager permissions as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.

Requirements
To use the sample code on this page, add the following dependencies to your requirements.txt. To learn more, see Installing Python dependencies.

apache-airflow-providers-snowflake==1.3.0

Code sample
The following steps describe how to create the DAG code that calls Secrets Manager to get the secret.

1. In your command prompt, navigate to the directory where your DAG code is stored. For example:

cd dags

2. Copy the contents of the following code sample and save locally as snowflake_connection.py.

"""
Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
from airflow.utils.dates import days_ago

snowflake_query = [
    """use warehouse "MY_WAREHOUSE";""",
    """select * from "SNOWFLAKE_SAMPLE_DATA"."WEATHER"."WEATHER_14_TOTAL" limit 100;""",
]

with DAG(dag_id='snowflake_test', schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag:
    snowflake_select = SnowflakeOperator(
        task_id="snowflake_select",
        sql=snowflake_query,
        snowflake_conn_id="snowflake_conn",
    )
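If you prefer to work with query results in Python rather than only running statements, the provider package you added in Requirements also includes a SnowflakeHook. The following is a minimal sketch, not part of the documented sample. It assumes the same snowflake_conn connection, and that the connection defines a default warehouse, database, and schema.

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
from airflow.utils.dates import days_ago

def fetch_sample_rows():
    # Resolves the snowflake_conn connection (and its Secrets Manager-backed
    # credentials) the same way the SnowflakeOperator above does.
    hook = SnowflakeHook(snowflake_conn_id="snowflake_conn")
    rows = hook.get_records(
        'select * from "SNOWFLAKE_SAMPLE_DATA"."WEATHER"."WEATHER_14_TOTAL" limit 10'
    )
    for row in rows:
        print(row)

with DAG(dag_id="snowflake_hook_test", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag:
    snowflake_fetch = PythonOperator(
        task_id="snowflake_fetch",
        python_callable=fetch_sample_rows,
    )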
""" from airflow import DAG from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator from airflow.utils.dates import days_ago snowflake_query = [ Requirements 282 Amazon Managed Workflows for Apache Airflow User Guide """use warehouse "MY_WAREHOUSE";""", """select * from "SNOWFLAKE_SAMPLE_DATA"."WEATHER"."WEATHER_14_TOTAL" limit 100;""", ] with DAG(dag_id='snowflake_test', schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: snowflake_select = SnowflakeOperator( task_id="snowflake_select", sql=snowflake_query, snowflake_conn_id="snowflake_conn", ) What's next? • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. Using a DAG to write custom metrics in CloudWatch You can use the following code example to write a directed acyclic graph (DAG) that runs a PythonOperator to retrieve OS-level metrics for an Amazon MWAA environment. The DAG then publishes the data as custom metrics to Amazon CloudWatch. Custom OS-level metrics provide you with additional visibility about how your environment workers are utilizing resources such as virtual memory and CPU. You can use this information to select the environment class that best suits your workload. Topics • Version • Prerequisites • Permissions • Dependencies • Code example What's next? 283 Amazon Managed Workflows for Apache Airflow User Guide Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the code example on this page, you need the following: • An Amazon MWAA environment. Permissions • No additional permissions are required to use the code example on this page. Dependencies • No additional dependencies are required to use the code example on this page. Code example 1. In your command prompt, navigate to the folder where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code example and save it locally as dag-custom- metrics.py. Replace MWAA-ENV-NAME with your environment name. from airflow import DAG from airflow.operators.python_operator import PythonOperator from airflow.utils.dates import days_ago from datetime import datetime import os,json,boto3,psutil,socket def publish_metric(client,name,value,cat,unit='None'): environment_name = os.getenv("MWAA_ENV_NAME") value_number=float(value) hostname = socket.gethostname() ip_address = socket.gethostbyname(hostname) Version 284 Amazon Managed Workflows for Apache Airflow User Guide print('writing value',value_number,'to metric',name) response = client.put_metric_data( Namespace='MWAA-Custom', MetricData=[ { 'MetricName': name, 'Dimensions': [ { 'Name': 'Environment', 'Value': environment_name }, { 'Name': 'Category', 'Value':
prompt, navigate to the folder where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code example and save it locally as dag-custom- metrics.py. Replace MWAA-ENV-NAME with your environment name. from airflow import DAG from airflow.operators.python_operator import PythonOperator from airflow.utils.dates import days_ago from datetime import datetime import os,json,boto3,psutil,socket def publish_metric(client,name,value,cat,unit='None'): environment_name = os.getenv("MWAA_ENV_NAME") value_number=float(value) hostname = socket.gethostname() ip_address = socket.gethostbyname(hostname) Version 284 Amazon Managed Workflows for Apache Airflow User Guide print('writing value',value_number,'to metric',name) response = client.put_metric_data( Namespace='MWAA-Custom', MetricData=[ { 'MetricName': name, 'Dimensions': [ { 'Name': 'Environment', 'Value': environment_name }, { 'Name': 'Category', 'Value': cat }, { 'Name': 'Host', 'Value': ip_address }, ], 'Timestamp': datetime.now(), 'Value': value_number, 'Unit': unit }, ] ) print(response) return response def python_fn(**kwargs): client = boto3.client('cloudwatch') cpu_stats = psutil.cpu_stats() print('cpu_stats', cpu_stats) virtual = psutil.virtual_memory() cpu_times_percent = psutil.cpu_times_percent(interval=0) publish_metric(client=client, name='virtual_memory_total', cat='virtual_memory', value=virtual.total, unit='Bytes') publish_metric(client=client, name='virtual_memory_available', cat='virtual_memory', value=virtual.available, unit='Bytes') publish_metric(client=client, name='virtual_memory_used', cat='virtual_memory', value=virtual.used, unit='Bytes') Code example 285 Amazon Managed Workflows for Apache Airflow User Guide publish_metric(client=client, name='virtual_memory_free', cat='virtual_memory', value=virtual.free, unit='Bytes') publish_metric(client=client, name='virtual_memory_active', cat='virtual_memory', value=virtual.active, unit='Bytes') publish_metric(client=client, name='virtual_memory_inactive', cat='virtual_memory', value=virtual.inactive, unit='Bytes') publish_metric(client=client, name='virtual_memory_percent', cat='virtual_memory', value=virtual.percent, unit='Percent') publish_metric(client=client, name='cpu_times_percent_user', cat='cpu_times_percent', value=cpu_times_percent.user, unit='Percent') publish_metric(client=client, name='cpu_times_percent_system', cat='cpu_times_percent', value=cpu_times_percent.system, unit='Percent') publish_metric(client=client, name='cpu_times_percent_idle', cat='cpu_times_percent', value=cpu_times_percent.idle, unit='Percent') return "OK" with DAG(dag_id=os.path.basename(__file__).replace(".py", ""), schedule_interval='*/5 * * * *', catchup=False, start_date=days_ago(1)) as dag: t = PythonOperator(task_id="memory_test", python_callable=python_fn, provide_context=True) 3. Run the following AWS CLI command to copy the DAG to your environment's bucket, then trigger the DAG using the Apache Airflow UI. $ aws s3 cp your-dag.py s3://your-environment-bucket/dags/ 4. 
If the DAG runs successfully, you should see something similar to the following in your Apache Airflow logs: [2022-08-16, 10:54:46 UTC] {{logging_mixin.py:109}} INFO - cpu_stats scpustats(ctx_switches=3253992384, interrupts=1964237163, soft_interrupts=492328209, syscalls=0) [2022-08-16, 10:54:46 UTC] {{logging_mixin.py:109}} INFO - writing value 16024199168.0 to metric virtual_memory_total [2022-08-16, 10:54:46 UTC] {{logging_mixin.py:109}} INFO - {'ResponseMetadata': {'RequestId': 'fad289ac-aa51-46a9-8b18-24e4e4063f4d', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amzn-requestid': 'fad289ac-aa51-46a9-8b18-24e4e4063f4d', 'content-type': 'text/xml', 'content-length': '212', 'date': 'Tue, 16 Aug 2022 17:54:45 GMT'}, 'RetryAttempts': 0}} Code example 286 Amazon Managed Workflows for Apache Airflow User Guide [2022-08-16, 10:54:46 UTC] {{logging_mixin.py:109}} INFO - writing value 14356287488.0 to metric virtual_memory_available [2022-08-16, 10:54:46 UTC] {{logging_mixin.py:109}} INFO - {'ResponseMetadata': {'RequestId': '6ef60085-07ab-4865-8abf-dc94f90cab46', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amzn-requestid': '6ef60085-07ab-4865-8abf-dc94f90cab46', 'content-type': 'text/xml', 'content-length': '212', 'date': 'Tue, 16 Aug 2022 17:54:45 GMT'}, 'RetryAttempts': 0}} [2022-08-16, 10:54:46 UTC] {{logging_mixin.py:109}} INFO - writing value 1342296064.0 to metric virtual_memory_used [2022-08-16, 10:54:46 UTC] {{logging_mixin.py:109}} INFO - {'ResponseMetadata': {'RequestId': 'd5331438-5d3c-4df2-bc42-52dcf8d60a00', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amzn-requestid': 'd5331438-5d3c-4df2-bc42-52dcf8d60a00', 'content-type': 'text/xml', 'content-length': '212', 'date': 'Tue, 16 Aug 2022 17:54:45 GMT'}, 'RetryAttempts': 0}} ... [2022-08-16, 10:54:46 UTC] {{python.py:152}} INFO - Done. Returned value was: OK [2022-08-16, 10:54:46 UTC] {{taskinstance.py:1280}} INFO - Marking task as SUCCESS. dag_id=dag-custom-metrics, task_id=memory_test, execution_date=20220816T175444, start_date=20220816T175445, end_date=20220816T175446 [2022-08-16, 10:54:46 UTC] {{local_task_job.py:154}} INFO - Task exited with return code 0 Aurora PostgreSQL database cleanup on an Amazon MWAA environment Amazon Managed Workflows for Apache Airflow uses an Aurora PostgreSQL database as the Apache Airflow metadata database, where DAG runs and task instances are stored. The following sample code periodically clears out entries from the dedicated Aurora PostgreSQL database for your Amazon MWAA environment. Topics • Version • Prerequisites • Dependencies • Code sample Aurora PostgreSQL database cleanup 287 Amazon Managed Workflows for Apache Airflow User Guide Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. Dependencies • To use this code example with Apache Airflow v2, no additional dependencies are required. The code uses the Apache Airflow v2 base install on your environment. Code sample The following DAG cleans the metadata database for the tables specified in TABLES_TO_CLEAN. The example deletes data from the specified tables that is older than 30 days. To adjust how far back the entries are deleted, set MAX_AGE_IN_DAYS to a different value. 
Apache Airflow v2.4 and later

from airflow import DAG
from airflow.models.param import Param
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago
from datetime import datetime, timedelta

# Note: Database commands may time out if running longer than 5 minutes. If this occurs, please increase the MAX_AGE_IN_DAYS
# (or change the timestamp parameter to an earlier date) for initial runs, then reduce on subsequent runs until the desired retention is met.
MAX_AGE_IN_DAYS = 30
# To clean specific tables, please provide a comma-separated list per
# https://airflow.apache.org/docs/apache-airflow/stable/cli-and-env-variables-ref.html#clean
# A value of None will clean all tables
TABLES_TO_CLEAN = None

with DAG(
    dag_id="clean_db_dag",
    schedule_interval=None,
    catchup=False,
    start_date=days_ago(1),
    params={
        "timestamp": Param(
            default=(datetime.now()-timedelta(days=MAX_AGE_IN_DAYS)).strftime("%Y-%m-%d %H:%M:%S"),
            type="string",
            minLength=1,
            maxLength=255,
        ),
    }
) as dag:

    if TABLES_TO_CLEAN:
        bash_command="airflow db clean --clean-before-timestamp '{{ params.timestamp }}' --tables '"+TABLES_TO_CLEAN+"' --skip-archive --yes"
    else:
        bash_command="airflow db clean --clean-before-timestamp '{{ params.timestamp }}' --skip-archive --yes"

    cli_command = BashOperator(
        task_id="bash_command",
        bash_command=bash_command
    )
Apache Airflow v2.2 and earlier

from airflow import settings
from airflow.utils.dates import days_ago
from airflow.models import DagTag, DagModel, DagRun, ImportError, Log, SlaMiss, RenderedTaskInstanceFields, TaskInstance, TaskReschedule, XCom
from airflow.decorators import dag, task
from time import sleep
from airflow.version import version

major_version, minor_version = int(version.split('.')[0]), int(version.split('.')[1])
if major_version >= 2 and minor_version >= 6:
    from airflow.jobs.job import Job
else:
    # The BaseJob class was renamed as of Apache Airflow v2.6
    from airflow.jobs.base_job import BaseJob as Job

# Delete entries for the past 30 days. Adjust MAX_AGE_IN_DAYS to set how far back this DAG cleans the database.
MAX_AGE_IN_DAYS = 30
MIN_AGE_IN_DAYS = 0
DECREMENT = -7

# This is a list of (table, time) tuples.
# table = the table to clean in the metadata database
# time = the column in the table associated to the timestamp of an entry
#        or None if not applicable.
TABLES_TO_CLEAN = [[Job, Job.latest_heartbeat],
    [TaskInstance, TaskInstance.execution_date],
    [TaskReschedule, TaskReschedule.execution_date],
    [DagTag, None],
    [DagModel, DagModel.last_parsed_time],
    [DagRun, DagRun.execution_date],
    [ImportError, ImportError.timestamp],
    [Log, Log.dttm],
    [SlaMiss, SlaMiss.execution_date],
    [RenderedTaskInstanceFields, RenderedTaskInstanceFields.execution_date],
    [XCom, XCom.execution_date],
]

@task()
def cleanup_db_fn(x):
    session = settings.Session()

    if x[1]:
        for oldest_days_ago in range(MAX_AGE_IN_DAYS, MIN_AGE_IN_DAYS, DECREMENT):
            earliest_days_ago = max(oldest_days_ago + DECREMENT, MIN_AGE_IN_DAYS)
            print(f"deleting {str(x[0])} entries between {earliest_days_ago} and {oldest_days_ago} days old...")
            earliest_date = days_ago(earliest_days_ago)
            oldest_date = days_ago(oldest_days_ago)
            query = session.query(x[0]).filter(x[1] >= earliest_date).filter(x[1] <= oldest_date)
            query.delete(synchronize_session=False)
            session.commit()
            sleep(5)
    else:
        # No time column specified for the table. Delete all entries
        print("deleting", str(x[0]), "...")
        query = session.query(x[0])
        query.delete(synchronize_session=False)
        session.commit()

    session.close()

@dag(
    dag_id="cleanup_db",
    schedule_interval="@weekly",
    start_date=days_ago(7),
    catchup=False,
    is_paused_upon_creation=False
)
def clean_db_dag_fn():
    t_last=None
    for x in TABLES_TO_CLEAN:
        t=cleanup_db_fn(x)
        if t_last:
            t_last >> t
        t_last = t

clean_db_dag = clean_db_dag_fn()
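Regardless of which version of the sample you use, you might want to confirm the effect of a cleanup run. The following is a minimal, illustrative sketch — not part of the documented sample — that counts the DAG run entries older than the retention window and prints the result to the task log.

from datetime import timedelta
from airflow import settings
from airflow.decorators import dag, task
from airflow.models import DagRun
from airflow.utils import timezone
from airflow.utils.dates import days_ago

MAX_AGE_IN_DAYS = 30

@task()
def count_old_dag_runs():
    cutoff = timezone.utcnow() - timedelta(days=MAX_AGE_IN_DAYS)
    session = settings.Session()
    # Count DagRun rows with an execution date older than the retention window.
    old_rows = session.query(DagRun).filter(DagRun.execution_date <= cutoff).count()
    print(f"DagRun entries older than {MAX_AGE_IN_DAYS} days: {old_rows}")
    session.close()

@dag(
    dag_id="check_db_retention",
    schedule_interval=None,
    catchup=False,
    start_date=days_ago(1),
)
def check_db_retention():
    count_old_dag_runs()

check_db_retention_dag = check_db_retention()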
Exporting environment metadata to CSV files on Amazon S3

The following code example shows how you can create a directed acyclic graph (DAG) that queries the database for a range of DAG run information, and writes the data to .csv files stored on Amazon S3.

You might want to export information from your environment's Aurora PostgreSQL database in order to inspect the data locally, archive it in object storage, or combine it with tools like the Amazon S3 to Amazon Redshift operator and the database cleanup, in order to move Amazon MWAA metadata out of the environment, but preserve it for future analysis.

You can query the database for any of the objects listed in Apache Airflow models. This code sample uses three models, DagRun, TaskFail, and TaskInstance, which provide information relevant to DAG runs.

Topics
• Version
• Prerequisites
• Permissions
• Requirements
• Code sample

Version
• You can use the code example on this page with Apache Airflow v2 in Python 3.10.

Prerequisites
To use the sample code on this page, you'll need the following:
• An Amazon MWAA environment.
• A new Amazon S3 bucket where you want to export your metadata information.

Permissions
Amazon MWAA needs permission for the Amazon S3 action s3:PutObject to write the queried metadata information to Amazon S3. Add the following policy statement to your environment's execution role.

{
  "Effect": "Allow",
  "Action": "s3:PutObject*",
  "Resource": "arn:aws:s3:::your-new-export-bucket/*"
}

This policy limits write access to only your-new-export-bucket.

Requirements
• To use this code example with Apache Airflow v2, no additional dependencies are required. The code uses the Apache Airflow v2 base install on your environment.

Code sample
The following steps describe how you can create a DAG that queries the Aurora PostgreSQL database and writes the result to your new Amazon S3 bucket.

1.
In your terminal, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code example and save it locally as metadata_to_csv.py. You can change the value assigned to MAX_AGE_IN_DAYS to control the age of the oldest records your DAG queries from the metadata database. from airflow.decorators import dag, task from airflow import settings import os import boto3 from airflow.utils.dates import days_ago from airflow.models import DagRun, TaskFail, TaskInstance import csv, re from io import StringIO DAG_ID = os.path.basename(__file__).replace(".py", "") MAX_AGE_IN_DAYS = 30 S3_BUCKET = '<your-export-bucket>' S3_KEY = 'files/export/{0}.csv' # You can add other objects to export from the metadatabase, OBJECTS_TO_EXPORT = [ [DagRun,DagRun.execution_date], [TaskFail,TaskFail.execution_date], [TaskInstance, TaskInstance.execution_date], ] Requirements 293 Amazon Managed Workflows for Apache Airflow User Guide @task() def export_db_task(**kwargs): session = settings.Session() print("session: ",str(session)) oldest_date = days_ago(MAX_AGE_IN_DAYS) print("oldest_date: ",oldest_date) s3 = boto3.client('s3') for x in OBJECTS_TO_EXPORT: query = session.query(x[0]).filter(x[1] >= days_ago(MAX_AGE_IN_DAYS)) print("type",type(query)) allrows=query.all() name=re.sub("[<>']", "", str(x[0])) print(name,": ",str(allrows)) if len(allrows) > 0: outfileStr="" f = StringIO(outfileStr) w = csv.DictWriter(f, vars(allrows[0]).keys()) w.writeheader() for y in allrows: w.writerow(vars(y)) outkey = S3_KEY.format(name[6:]) s3.put_object(Bucket=S3_BUCKET, Key=outkey, Body=f.getvalue()) @dag( dag_id=DAG_ID, schedule_interval=None, start_date=days_ago(1), ) def export_db(): t = export_db_task() metadb_to_s3_test = export_db() 3. Run the following AWS CLI command to copy the DAG to your environment's bucket, then trigger the DAG using the Apache Airflow UI. $ aws s3 cp your-dag.py s3://your-environment-bucket/dags/ Code sample 294 Amazon Managed Workflows for Apache Airflow User Guide 4. If successful, you'll output similar to the following in the task logs for the export_db task: [2022-01-01, 12:00:00 PDT] {{logging_mixin.py:109}} INFO - type <class 'sqlalchemy.orm.query.Query'> [2022-01-01, 12:00:00 PDT] {{logging_mixin.py:109}} INFO - class airflow.models.dagrun.DagRun : [your-tasks] [2022-01-01, 12:00:00 PDT] {{logging_mixin.py:109}} INFO - type <class 'sqlalchemy.orm.query.Query'> [2022-01-01, 12:00:00 PDT] {{logging_mixin.py:109}} INFO - class airflow.models.taskfail.TaskFail : [your-tasks] [2022-01-01, 12:00:00 PDT] {{logging_mixin.py:109}} INFO - type <class 'sqlalchemy.orm.query.Query'> [2022-01-01, 12:00:00 PDT] {{logging_mixin.py:109}} INFO - class airflow.models.taskinstance.TaskInstance : [your-tasks] [2022-01-01, 12:00:00 PDT] {{python.py:152}} INFO - Done. Returned value was: OK [2022-01-01, 12:00:00 PDT] {{taskinstance.py:1280}} INFO - Marking task as SUCCESS. dag_id=metadb_to_s3, task_id=export_db, execution_date=20220101T000000, start_date=20220101T000000, end_date=20220101T000000 [2022-01-01, 12:00:00 PDT] {{local_task_job.py:154}} INFO - Task exited with return code 0 [2022-01-01, 12:00:00 PDT] {{local_task_job.py:264}} INFO - 0 downstream tasks scheduled from follow-on schedule check You can now access and download the exported .csv files in your new Amazon S3 bucket in / files/export/. 
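Beyond the Amazon S3 console, you can also inspect the exported files programmatically. The following is a minimal sketch, not part of the documented sample, that you can run locally with credentials for the export bucket; replace your-new-export-bucket with your bucket name.

import csv
import io

import boto3

# Replace with the bucket you used for the export.
S3_BUCKET = "your-new-export-bucket"
S3_PREFIX = "files/export/"

s3 = boto3.client("s3")

# List the exported .csv files. Key names are derived from the
# Apache Airflow model class names used in the DAG above.
exported = s3.list_objects_v2(Bucket=S3_BUCKET, Prefix=S3_PREFIX).get("Contents", [])
for obj in exported:
    print(obj["Key"], obj["Size"])

# Read the first export into memory and print its columns and first row.
if exported:
    key = exported[0]["Key"]
    body = s3.get_object(Bucket=S3_BUCKET, Key=key)["Body"].read().decode("utf-8")
    reader = csv.DictReader(io.StringIO(body))
    print("columns:", reader.fieldnames)
    print("first row:", next(reader, None))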
Using a secret key in AWS Secrets Manager for an Apache Airflow variable

The following sample calls AWS Secrets Manager to get a secret key for an Apache Airflow variable on Amazon Managed Workflows for Apache Airflow. It assumes you've completed the steps in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.

Topics
• Version
• Prerequisites
• Permissions
• Requirements
• Code sample
• What's next?

Version
• The sample code on this page can be used with Apache Airflow v1 in Python 3.7.
• You can use the code example on this page with Apache Airflow v2 in Python 3.10.

Prerequisites
To use the sample code on this page, you'll need the following:
• The Secrets Manager backend as an Apache Airflow configuration option as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.
• An Apache Airflow variable string in Secrets Manager as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.

Permissions
• Secrets Manager permissions as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret.

Requirements
• To use this code example with Apache Airflow v1, no additional dependencies are required. The code uses the Apache Airflow v1 base install on your environment.
• To use this code example with Apache Airflow v2, no additional dependencies are required. The code uses the Apache Airflow v2 base install on your environment.

Code sample
The following steps describe how to create the DAG code that calls Secrets Manager to get the secret.

1. In your command prompt, navigate to the directory where your DAG code is stored. For example:

cd dags

2. Copy the contents of the following code sample and save locally
as secrets-manager- var.py. from airflow import DAG from airflow.operators.python_operator import PythonOperator from airflow.models import Variable from airflow.utils.dates import days_ago from datetime import timedelta import os DAG_ID = os.path.basename(__file__).replace(".py", "") DEFAULT_ARGS = { 'owner': 'airflow', 'depends_on_past': False, 'email': ['airflow@example.com'], 'email_on_failure': False, 'email_on_retry': False, } def get_variable_fn(**kwargs): my_variable_name = Variable.get("test-variable", default_var="undefined") print("my_variable_name: ", my_variable_name) return my_variable_name with DAG( dag_id=DAG_ID, default_args=DEFAULT_ARGS, dagrun_timeout=timedelta(hours=2), start_date=days_ago(1), schedule_interval='@once', tags=['variable'] ) as dag: get_variable = PythonOperator( task_id="get_variable", python_callable=get_variable_fn, Code sample 297 Amazon Managed Workflows for Apache Airflow User Guide provide_context=True ) What's next? • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. Using a secret key in AWS Secrets Manager for an Apache Airflow connection The following sample calls AWS Secrets Manager to get a secret key for an Apache Airflow connection on Amazon Managed Workflows for Apache Airflow. It assumes you've completed the steps in Configuring an Apache Airflow connection using a AWS Secrets Manager secret. Topics • Version • Prerequisites • Permissions • Requirements • Code sample • What's next? Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: What's next? 298 Amazon Managed Workflows for Apache Airflow User Guide • The Secrets Manager backend as an Apache Airflow configuration option as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret. • An Apache Airflow connection string in Secrets Manager as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret. Permissions • Secrets Manager permissions as shown in Configuring an Apache Airflow connection using a AWS Secrets Manager secret. Requirements • To use this code example with Apache Airflow v1, no additional dependencies are required. The code uses the Apache Airflow v1 base install on your environment. • To use this code example with Apache Airflow v2, no additional dependencies are required. The code uses the Apache Airflow v2 base install on your environment. Code sample The following steps describe how to create the DAG code that calls Secrets Manager to get the secret. Apache Airflow v2 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as secrets- manager.py. """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of Permissions 299 Amazon Managed Workflows for Apache Airflow User Guide this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow import DAG, settings, secrets from airflow.operators.python import PythonOperator from airflow.utils.dates import days_ago from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook from datetime import timedelta import os ### The steps to create this secret key can be found at: https:// docs.aws.amazon.com/mwaa/latest/userguide/connections-secrets-manager.html sm_secretId_name = 'airflow/connections/myconn' default_args = { 'owner': 'airflow', 'start_date': days_ago(1), 'depends_on_past': False } ### Gets the secret myconn from Secrets Manager def read_from_aws_sm_fn(**kwargs): ### set up Secrets Manager hook = AwsBaseHook(client_type='secretsmanager') client = hook.get_client_type(region_name='us-east-1') response = client.get_secret_value(SecretId=sm_secretId_name) myConnSecretString = response["SecretString"] return myConnSecretString ### 'os.path.basename(__file__).replace(".py", "")' uses the file name secrets- manager.py for a DAG ID of secrets-manager with DAG( Code sample 300 Amazon Managed Workflows for Apache Airflow User Guide dag_id=os.path.basename(__file__).replace(".py", ""), default_args=default_args, dagrun_timeout=timedelta(hours=2), start_date=days_ago(1), schedule_interval=None ) as dag: write_all_to_aws_sm = PythonOperator( task_id="read_from_aws_sm", python_callable=read_from_aws_sm_fn, provide_context=True ) Apache Airflow v1 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as secrets- manager.py. from airflow import DAG, settings, secrets from airflow.operators.python_operator import PythonOperator from airflow.utils.dates import days_ago from airflow.contrib.hooks.aws_hook import AwsHook from datetime import timedelta import os ### The steps to create this secret key can be found at: https:// docs.aws.amazon.com/mwaa/latest/userguide/connections-secrets-manager.html sm_secretId_name = 'airflow/connections/myconn' default_args = { 'owner': 'airflow', 'start_date': days_ago(1), 'depends_on_past': False } ### Gets the secret myconn from Secrets Manager Code sample 301 Amazon
Apache Airflow v1 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as secrets- manager.py. from airflow import DAG, settings, secrets from airflow.operators.python_operator import PythonOperator from airflow.utils.dates import days_ago from airflow.contrib.hooks.aws_hook import AwsHook from datetime import timedelta import os ### The steps to create this secret key can be found at: https:// docs.aws.amazon.com/mwaa/latest/userguide/connections-secrets-manager.html sm_secretId_name = 'airflow/connections/myconn' default_args = { 'owner': 'airflow', 'start_date': days_ago(1), 'depends_on_past': False } ### Gets the secret myconn from Secrets Manager Code sample 301 Amazon Managed Workflows for Apache Airflow User Guide def read_from_aws_sm_fn(**kwargs): ### set up Secrets Manager hook = AwsHook() client = hook.get_client_type('secretsmanager') response = client.get_secret_value(SecretId=sm_secretId_name) myConnSecretString = response["SecretString"] return myConnSecretString ### 'os.path.basename(__file__).replace(".py", "")' uses the file name secrets- manager.py for a DAG ID of secrets-manager with DAG( dag_id=os.path.basename(__file__).replace(".py", ""), default_args=default_args, dagrun_timeout=timedelta(hours=2), start_date=days_ago(1), schedule_interval=None ) as dag: write_all_to_aws_sm = PythonOperator( task_id="read_from_aws_sm", python_callable=read_from_aws_sm_fn, provide_context=True ) What's next? • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. Creating a custom plugin with Oracle The following sample walks you through the steps to create a custom plugin using Oracle for Amazon MWAA and can be combined with other custom plugins and binaries in your plugins.zip file. Contents • Version • Prerequisites • Permissions What's next? 302 Amazon Managed Workflows for Apache Airflow User Guide • Requirements • Code sample • Create the custom plugin • Download dependencies • Custom plugin • Plugins.zip • Airflow configuration options • What's next? Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. • Worker logging enabled at any log level, CRITICAL or above, for your environment. For more information about Amazon MWAA log types and how to manage your log groups, see the section called “Viewing Airflow logs” Permissions • No additional permissions are required to use the code example on this page. Requirements To use the sample code on this page, add the following dependencies to your requirements.txt. To learn more, see Installing Python dependencies. Version 303 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow v2 -c https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/ constraints-3.7.txt cx_Oracle apache-airflow-providers-oracle Apache Airflow v1 cx_Oracle==8.1.0 apache-airflow[oracle]==1.10.12 Code sample The following steps describe how to create the DAG code that will test the custom plugin. 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as oracle.py. 
from airflow import DAG from airflow.operators.python_operator import PythonOperator from airflow.utils.dates import days_ago import os import cx_Oracle DAG_ID = os.path.basename(__file__).replace(".py", "") def testHook(**kwargs): cx_Oracle.init_oracle_client() version = cx_Oracle.clientversion() print("cx_Oracle.clientversion",version) return version with DAG(dag_id=DAG_ID, schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: hook_test = PythonOperator( task_id="hook_test", Code sample 304 Amazon Managed Workflows for Apache Airflow User Guide python_callable=testHook, provide_context=True ) Create the custom plugin This section describes how to download the dependencies, create the custom plugin and the plugins.zip. Download dependencies Amazon MWAA will extract the contents of plugins.zip into /usr/local/airflow/plugins on each Amazon MWAA scheduler and worker container. This is used to add binaries to your environment. The following steps describe how to assemble the files needed for the custom plugin. Pull the Amazon Linux container image 1. In your command prompt, pull the Amazon Linux container image, and run the container locally. For example: docker pull amazonlinux docker run -it amazonlinux:latest /bin/bash Your command prompt should invoke a bash command line. For example: bash-4.2# 2. Install the Linux-native asynchronous I/O facility (libaio). yum -y install libaio 3. Keep this window open for subsequent steps. We'll be copying the following files locally: lib64/libaio.so.1, lib64/libaio.so.1.0.0, lib64/libaio.so.1.0.1. Download client folder 1. Install the unzip package locally. For example: sudo yum install unzip Create the custom plugin 305 Amazon Managed Workflows for Apache Airflow User Guide 2. Create an oracle_plugin directory. For example: mkdir oracle_plugin cd oracle_plugin 3. Use the following curl command to download the instantclient-basic- linux.x64-18.5.0.0.0dbru.zip from Oracle Instant Client Downloads for Linux x86-64 (64-bit). curl https://download.oracle.com/otn_software/linux/instantclient/185000/ instantclient-basic-linux.x64-18.5.0.0.0dbru.zip > client.zip 4. Unzip the client.zip file. For example: unzip *.zip Extract files from Docker 1. In a new command prompt, display and write down your Docker container ID. For example: docker container ls Your command prompt should return all containers and their IDs. For example: debc16fd6970 2. In your oracle_plugin directory, extract the lib64/libaio.so.1, lib64/ libaio.so.1.0.0, lib64/libaio.so.1.0.1 files to the local instantclient_18_5 folder. For example: docker cp debc16fd6970:/lib64/libaio.so.1 instantclient_18_5/ docker cp debc16fd6970:/lib64/libaio.so.1.0.0 instantclient_18_5/ docker cp debc16fd6970:/lib64/libaio.so.1.0.1 instantclient_18_5/
Custom plugin

Apache Airflow will execute the contents of Python files in the plugins folder at startup. This is used to set and modify environment variables. The following steps describe the sample code for the custom plugin.

• Copy the contents of the following code sample and save locally as env_var_plugin_oracle.py.

from airflow.plugins_manager import AirflowPlugin
import os

os.environ["LD_LIBRARY_PATH"]='/usr/local/airflow/plugins/instantclient_18_5'
os.environ["DPI_DEBUG_LEVEL"]="64"

class EnvVarPlugin(AirflowPlugin):
    name = 'env_var_plugin'

Plugins.zip

The following steps show how to create the plugins.zip. The contents of this example can be combined with your other plugins and binaries into a single plugins.zip file.

Zip the contents of the plugin directory

1. In your command prompt, navigate to the oracle_plugin directory. For example:

cd oracle_plugin

2. Zip the instantclient_18_5 directory in plugins.zip. For example:

zip -r ../plugins.zip ./

3. You should see the following in your command prompt:

oracle_plugin$ ls
client.zip instantclient_18_5

4. Remove the client.zip file. For example:

rm client.zip

Zip the env_var_plugin_oracle.py file

1. Add the env_var_plugin_oracle.py file to the root of the plugins.zip. For example:

zip plugins.zip env_var_plugin_oracle.py

2. Your plugins.zip should now include the following:

env_var_plugin_oracle.py
instantclient_18_5/

Airflow configuration options

If you're using Apache Airflow v2, add core.lazy_load_plugins : False as an Apache Airflow configuration option. To learn more, see Using configuration options to load plugins in Apache Airflow v2.
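Once the plugin, binaries, and requirements are in place, your DAGs can talk to an Oracle database through the provider package you added in the Requirements section. The following is a minimal Apache Airflow v2 sketch, not part of the documented sample; it assumes you have created an Apache Airflow connection named oracle_conn (a hypothetical ID) that points to your database.

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.oracle.hooks.oracle import OracleHook
from airflow.utils.dates import days_ago

def query_oracle(**kwargs):
    # oracle_conn is a hypothetical connection ID; create it in the
    # Apache Airflow UI (or your secrets backend) before running this DAG.
    hook = OracleHook(oracle_conn_id="oracle_conn")
    result = hook.get_first("SELECT sysdate FROM dual")
    print("sysdate:", result)

with DAG(dag_id="oracle_query_example", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag:
    oracle_test = PythonOperator(
        task_id="oracle_test",
        python_callable=query_oracle,
        provide_context=True,
    )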
What's next?
• Learn how to upload the requirements.txt file in this example to your Amazon S3 bucket in Installing Python dependencies.
• Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs.
• Learn more about how to upload the plugins.zip file in this example to your Amazon S3 bucket in Installing custom plugins.

Creating a custom plugin that generates runtime environment variables

The following sample walks you through the steps to create a custom plugin that generates environment variables at runtime on an Amazon Managed Workflows for Apache Airflow environment.

Topics
• Version
• Prerequisites
• Permissions
• Requirements
• Custom plugin
• Plugins.zip
• Airflow configuration options
• What's next?

Version
• The sample code on this page can be used with Apache Airflow v1 in Python 3.7.

Prerequisites
To use the sample code on this page, you'll need the following:
• An Amazon MWAA environment.

Permissions
• No additional permissions are required to use the code example on this page.

Requirements
• To use this code example with Apache Airflow v1, no additional dependencies are required. The code uses the Apache Airflow v1 base install on your environment.

Custom plugin

Apache Airflow will execute the contents of Python files in the plugins folder at startup. This is used to set and modify environment variables. The following steps describe the sample code for the custom plugin.

1. In your command prompt, navigate to the directory where your plugins are stored. For example:

cd plugins

2. Copy the contents of the following code sample and save locally as env_var_plugin.py in the above folder.

from airflow.plugins_manager import AirflowPlugin
import os

os.environ["PATH"] = os.getenv("PATH") + ":/usr/local/airflow/.local/lib/python3.7/site-packages"
os.environ["JAVA_HOME"]="/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.272.b10-1.amzn2.0.1.x86_64"

class EnvVarPlugin(AirflowPlugin):
    name = 'env_var_plugin'

Plugins.zip

The following steps show how to create plugins.zip. The contents of this example can be combined with other plugins and binaries into a single plugins.zip file.

1. In your command prompt, navigate to the plugins directory from the previous step. For example:

cd plugins

2. Zip the contents within your plugins folder.

zip -r ../plugins.zip ./

Airflow configuration options

If you're using Apache Airflow v2, add core.lazy_load_plugins : False as an Apache Airflow configuration option. To learn more, see Using configuration options to load plugins in Apache Airflow v2.

What's next?
• Learn how to upload the requirements.txt file in this example to your Amazon S3 bucket in Installing Python dependencies.
• Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs.