amazon-mwaa-user-guide-092 | amazon-mwaa-user-guide.pdf | 92 | cd plugins 2. Zip the contents within your plugins folder. zip -r ../plugins.zip ./ Airflow configuration options If you're using Apache Airflow v2, add core.lazy_load_plugins : False as an Apache Airflow configuration option. To learn more, see Using configuration options to load plugins in 2. What's next? • Learn how to upload the requirements.txt file in this example to your Amazon S3 bucket in Installing Python dependencies. • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. Plugins.zip 310 Amazon Managed Workflows for Apache Airflow User Guide • Learn more about how to upload the plugins.zip file in this example to your Amazon S3 bucket in Installing custom plugins. Changing a DAG's timezone on Amazon MWAA Apache Airflow schedules your directed acyclic graph (DAG) in UTC+0 by default. The following steps show how you can change the timezone in which Amazon MWAA runs your DAGs with Pendulum. Optionally, this topic demonstrates how you can create a custom plugin to change the timezone for your environment's Apache Airflow logs. Topics • Version • Prerequisites • Permissions • Create a plugin to change the timezone in Airflow logs • Create a plugins.zip • Code sample • What's next? Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. Permissions • No additional permissions are required to use the code example on this page. Changing a DAG's timezone 311 Amazon Managed Workflows for Apache Airflow User Guide Create a plugin to change the timezone in Airflow logs Apache Airflow will run the Python files in the plugins directory at start-up. With the following plugin, you can override the executor's timezone, which modifies the timezone in which Apache Airflow writes logs. 1. Create a directory named plugins for your custom plugin, and navigate to the directory. For example: $ mkdir plugins $ cd plugins 2. Copy the contents of the following code sample and save locally as dag-timezone- plugin.py in the plugins folder. import time import os os.environ['TZ'] = 'America/Los_Angeles' time.tzset() 3. In the plugins directory, create an empty Python file named __init__.py. Your plugins directory should be similar to the following: plugins/ |-- __init__.py |-- dag-timezone-plugin.py Create a plugins.zip The following steps show how to create plugins.zip. The content of this example can be combined with other plugins and binaries into a single plugins.zip file. 1. In your command prompt, navigate to the plugins directory from the previous step. For example: cd plugins 2. Zip the contents within your plugins directory. Create a plugin to change the timezone in Airflow logs 312 Amazon Managed Workflows for Apache Airflow User Guide zip -r ../plugins.zip ./ 3. Upload plugins.zip to your S3 bucket $ aws s3 cp plugins.zip s3://your-mwaa-bucket/ Code sample To change the default timezone (UTC+0) in which the DAG runs, we'll use a library called Pendulum, a Python library for working with timezone-aware datetime. 1. In your command prompt, navigate to the directory where your DAGs are stored. For example: $ cd dags 2. Copy the content of the following example and save as tz-aware-dag.py. 
from airflow import DAG from airflow.operators.bash_operator import BashOperator from datetime import datetime, timedelta # Import the Pendulum library. import pendulum # Instantiate Pendulum and set your timezone. local_tz = pendulum.timezone("America/Los_Angeles") with DAG( dag_id = "tz_test", schedule_interval="0 12 * * *", catchup=False, start_date=datetime(2022, 1, 1, tzinfo=local_tz) ) as dag: bash_operator_task = BashOperator( task_id="tz_aware_task", dag=dag, bash_command="date" ) 3. Run the following AWS CLI command to copy the DAG to your environment's bucket, then trigger the DAG using the Apache Airflow UI. Code sample 313 Amazon Managed Workflows for Apache Airflow User Guide $ aws s3 cp your-dag.py s3://your-environment-bucket/dags/ 4. If successful, you'll output similar to the following in the task logs for the tz_aware_task in the tz_test DAG: [2022-08-01, 12:00:00 PDT] {{subprocess.py:74}} INFO - Running command: ['bash', '- c', 'date'] [2022-08-01, 12:00:00 PDT] {{subprocess.py:85}} INFO - Output: [2022-08-01, 12:00:00 PDT] {{subprocess.py:89}} INFO - Mon Aug 1 12:00:00 PDT 2022 [2022-08-01, 12:00:00 PDT] {{subprocess.py:93}} INFO - Command exited with return code 0 [2022-08-01, 12:00:00 PDT] {{taskinstance.py:1280}} INFO - Marking task as SUCCESS. dag_id=tz_test, task_id=tz_aware_task, execution_date=20220801T190033, start_date=20220801T190035, end_date=20220801T190035 [2022-08-01, 12:00:00 PDT] {{local_task_job.py:154}} INFO - Task exited with return code 0 [2022-08-01, 12:00:00 PDT] {{local_task_job.py:264}} INFO - 0 downstream tasks scheduled from follow-on schedule check What's next? • Learn more about how to upload the plugins.zip file in this example to your Amazon S3 bucket in Installing custom plugins. Refreshing a CodeArtifact token If you're using CodeArtifact to install Python dependencies, Amazon MWAA requires an active token. To allow Amazon MWAA to access a CodeArtifact repository at runtime, you can use a |
startup script and set the PIP_EXTRA_INDEX_URL with the token. The following topic describes how you can create a startup script that uses the get_authorization_token CodeArtifact API operation to retrieve a fresh token every time your environment starts up, or updates. Topics • Version • Prerequisites • Permissions • Code sample • What's next? Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. • A CodeArtifact repository where you store dependencies for your environment. Permissions To refresh the CodeArtifact token and write the result to Amazon S3, Amazon MWAA must have the following permissions in the execution role. • The codeartifact:GetAuthorizationToken action allows Amazon MWAA to retrieve a new token from CodeArtifact. The following policy grants permission for every CodeArtifact domain you create. You can further restrict access to your domains by modifying the resource value in the statement, and specifying only the domains that you want your environment to access. { "Effect": "Allow", "Action": "codeartifact:GetAuthorizationToken", "Resource": "arn:aws:codeartifact:us-west-2:*:domain/*" } • The sts:GetServiceBearerToken action is required to call the CodeArtifact GetAuthorizationToken API operation. This operation returns a token that must be used when using a package manager such as pip with CodeArtifact. To use a package manager with a CodeArtifact repository, your environment's execution role must allow sts:GetServiceBearerToken as shown in the following policy statement. { "Sid": "AllowServiceBearerToken", "Effect": "Allow", "Action": "sts:GetServiceBearerToken", "Resource": "*" } Code sample The following steps describe how you can create a startup script that updates the CodeArtifact token. 1. Copy the contents of the following code sample and save locally as code_artifact_startup_script.sh.
#!/bin/sh # Startup script for MWAA, see https://docs.aws.amazon.com/mwaa/latest/userguide/ using-startup-script.html set -eu # setup code artifact endpoint and token # https://pip.pypa.io/en/stable/cli/pip_install/#cmdoption-0 # https://docs.aws.amazon.com/mwaa/latest/userguide/samples-code-artifact.html DOMAIN="amazon" DOMAIN_OWNER="112233445566" REGION="us-west-2" REPO_NAME="MyRepo" echo "Getting token for CodeArtifact with args: --domain $DOMAIN --region $REGION --domain-owner $DOMAIN_OWNER" TOKEN=$(aws codeartifact get-authorization-token --domain $DOMAIN --region $REGION --domain-owner $DOMAIN_OWNER | jq -r '.authorizationToken') echo "Setting Pip env var for '--index-url' to point to CodeArtifact" export PIP_EXTRA_INDEX_URL="https://aws:$TOKEN@$DOMAIN- $DOMAIN_OWNER.d.codeartifact.$REGION.amazonaws.com/pypi/$REPO_NAME/simple/" echo "CodeArtifact startup setup complete" 2. Navigate to the folder where you saved the script. Use cp in a new prompt window to upload the script to your bucket. Replace your-s3-bucket with your information. Code sample 316 Amazon Managed Workflows for Apache Airflow User Guide $ aws s3 cp code_artifact_startup_script.sh s3://your-s3-bucket/ code_artifact_startup_script.sh If successful, Amazon S3 outputs the URL path to the object: upload: ./code_artifact_startup_script.sh to s3://your-s3-bucket/ code_artifact_startup_script.sh After you upload the script, your environment updates and runs the script at startup. What's next? • Learn how to use startup scripts to customize your environment in the section called “Using a startup script”. • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. • Learn more about how to upload the plugins.zip file in this example to your Amazon S3 bucket in Installing custom plugins. Creating a custom plugin with Apache Hive and Hadoop Amazon MWAA extracts the contents of a plugins.zip to /usr/local/airflow/plugins. This can be used to add binaries to your containers. In addition, Apache Airflow executes the contents of Python files in the plugins folder at startup—enabling you to set and modify environment variables. The following sample walks you through the steps to create a custom plugin using Apache Hive and Hadoop on an Amazon Managed Workflows for Apache Airflow environment and can be combined with other custom plugins and binaries. Topics • Version • Prerequisites • Permissions • Requirements • Download dependencies What's next? 317 Amazon Managed Workflows for Apache Airflow User Guide • Custom plugin • Plugins.zip • Code sample • Airflow configuration options • What's next? Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA |
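As a companion to the shell startup script above — not part of the original guide — the same token retrieval can be done with boto3, which is handy for testing the CodeArtifact call outside the environment. This is a minimal sketch; the domain, owner, Region, and repository names reuse the placeholder values from the script.

import boto3

# Placeholder values copied from the startup script above.
DOMAIN = "amazon"
DOMAIN_OWNER = "112233445566"
REGION = "us-west-2"
REPO_NAME = "MyRepo"

client = boto3.client("codeartifact", region_name=REGION)
response = client.get_authorization_token(domain=DOMAIN, domainOwner=DOMAIN_OWNER)
token = response["authorizationToken"]

# Same URL shape that the script exports as PIP_EXTRA_INDEX_URL.
index_url = (
    f"https://aws:{token}@{DOMAIN}-{DOMAIN_OWNER}.d.codeartifact."
    f"{REGION}.amazonaws.com/pypi/{REPO_NAME}/simple/"
)
print(index_url.split("@")[1])  # print the endpoint without exposing the token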
amazon-mwaa-user-guide-094 | amazon-mwaa-user-guide.pdf | 94 | and can be combined with other custom plugins and binaries. Topics • Version • Prerequisites • Permissions • Requirements • Download dependencies What's next? 317 Amazon Managed Workflows for Apache Airflow User Guide • Custom plugin • Plugins.zip • Code sample • Airflow configuration options • What's next? Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. Permissions • No additional permissions are required to use the code example on this page. Requirements To use the sample code on this page, add the following dependencies to your requirements.txt. To learn more, see Installing Python dependencies. Apache Airflow v2 -c https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/ constraints-3.7.txt apache-airflow-providers-amazon[apache.hive] Version 318 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow v1 apache-airflow[hive]==1.10.12 Download dependencies Amazon MWAA will extract the contents of plugins.zip into /usr/local/airflow/plugins on each Amazon MWAA scheduler and worker container. This is used to add binaries to your environment. The following steps describe how to assemble the files needed for the custom plugin. 1. In your command prompt, navigate to the directory where you would like to create your plugin. For example: cd plugins 2. Download Hadoop from a mirror, for example: wget https://downloads.apache.org/hadoop/common/hadoop-3.3.0/hadoop-3.3.0.tar.gz 3. Download Hive from a mirror, for example: wget https://downloads.apache.org/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz 4. Create a directory. For example: mkdir hive_plugin 5. Extract Hadoop. tar -xvzf hadoop-3.3.0.tar.gz -C hive_plugin 6. Extract Hive. tar -xvzf apache-hive-3.1.2-bin.tar.gz -C hive_plugin Download dependencies 319 Amazon Managed Workflows for Apache Airflow User Guide Custom plugin Apache Airflow will execute the contents of Python files in the plugins folder at startup. This is used to set and modify environment variables. The following steps describe the sample code for the custom plugin. 1. In your command prompt, navigate to the hive_plugin directory. For example: cd hive_plugin 2. Copy the contents of the following code sample and save locally as hive_plugin.py in the hive_plugin directory. from airflow.plugins_manager import AirflowPlugin import os os.environ["JAVA_HOME"]="/usr/lib/jvm/jre" os.environ["HADOOP_HOME"]='/usr/local/airflow/plugins/hadoop-3.3.0' os.environ["HADOOP_CONF_DIR"]='/usr/local/airflow/plugins/hadoop-3.3.0/etc/hadoop' os.environ["HIVE_HOME"]='/usr/local/airflow/plugins/apache-hive-3.1.2-bin' os.environ["PATH"] = os.getenv("PATH") + ":/usr/local/airflow/plugins/ hadoop-3.3.0:/usr/local/airflow/plugins/apache-hive-3.1.2-bin/bin:/usr/local/ airflow/plugins/apache-hive-3.1.2-bin/lib" os.environ["CLASSPATH"] = os.getenv("CLASSPATH") + ":/usr/local/airflow/plugins/ apache-hive-3.1.2-bin/lib" class EnvVarPlugin(AirflowPlugin): name = 'hive_plugin' 3. Cope the content of the following text and save locally as .airflowignore in the hive_plugin directory. hadoop-3.3.0 apache-hive-3.1.2-bin Plugins.zip The following steps show how to create plugins.zip. The contents of this example can be combined with other plugins and binaries into a single plugins.zip file. 1. 
In your command prompt, navigate to the hive_plugin directory from the previous step. For example: cd hive_plugin 2. Zip the contents within your plugins folder. zip -r ../hive_plugin.zip ./ Code sample The following steps describe how to create the DAG code that will test the custom plugin. 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as hive.py. from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago with DAG(dag_id="hive_test_dag", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: hive_test = BashOperator( task_id="hive_test", bash_command='hive --help' ) Airflow configuration options If you're using Apache Airflow v2, add core.lazy_load_plugins : False as an Apache Airflow configuration option. To learn more, see Using configuration options to load plugins in Apache Airflow v2. What's next? • Learn how to upload the requirements.txt file in this example to your Amazon S3 bucket in Installing Python dependencies. • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. • Learn more about how to upload the plugins.zip file in this example to your Amazon S3 bucket in Installing custom plugins. Creating a custom plugin for Apache Airflow PythonVirtualenvOperator The following sample shows how to patch the Apache Airflow PythonVirtualenvOperator with a custom plugin on Amazon Managed Workflows for Apache Airflow. Topics • Version • Prerequisites • Permissions • Requirements • Custom plugin sample code • Plugins.zip • Code sample • Airflow configuration options • What's next? Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. Permissions • No additional permissions are required to use the code example on this page.
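Once the patch plugin in this section is in place, PythonVirtualenvOperator can also pass arguments into the callable. The following minimal sketch — not part of the guide — complements the virtualenv_test.py sample shown later in this section; it assumes Apache Airflow v2, and the DAG ID and pinned boto3 version are illustrative.

from airflow import DAG
from airflow.operators.python import PythonVirtualenvOperator
from airflow.utils.dates import days_ago

def print_version(package_name):
    # Runs inside the temporary virtualenv that the operator creates.
    import boto3
    print(package_name, boto3.__version__)

with DAG(dag_id="virtualenv_args_test", schedule_interval=None,
         catchup=False, start_date=days_ago(1)) as dag:
    check_boto3 = PythonVirtualenvOperator(
        task_id="check_boto3",
        python_callable=print_version,
        op_args=["boto3"],
        requirements=["boto3>=1.17.43"],
        system_site_packages=False,
    )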
amazon-mwaa-user-guide-095 | amazon-mwaa-user-guide.pdf | 95 | • Airflow configuration options • What's next? Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. Custom plugin to patch PythonVirtualenvOperator 322 Amazon Managed Workflows for Apache Airflow User Guide Permissions • No additional permissions are required to use the code example on this page. Requirements To use the sample code on this page, add the following dependencies to your requirements.txt. To learn more, see Installing Python dependencies. virtualenv Custom plugin sample code Apache Airflow will execute the contents of Python files in the plugins folder at startup. This plugin will patch the built-in PythonVirtualenvOperator during that startup process to make it compatible with Amazon MWAA. The following steps show the sample code for the custom plugin. Apache Airflow v2 1. In your command prompt, navigate to the plugins directory above. For example: cd plugins 2. Copy the contents of the following code sample and save locally as virtual_python_plugin.py. """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN Permissions 323 Amazon Managed Workflows for Apache Airflow User Guide CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow.plugins_manager import AirflowPlugin import airflow.utils.python_virtualenv from typing import List def _generate_virtualenv_cmd(tmp_dir: str, python_bin: str, system_site_packages: bool) -> List[str]: cmd = ['python3','/usr/local/airflow/.local/lib/python3.7/site-packages/ virtualenv', tmp_dir] if system_site_packages: cmd.append('--system-site-packages') if python_bin is not None: cmd.append(f'--python={python_bin}') return cmd airflow.utils.python_virtualenv._generate_virtualenv_cmd=_generate_virtualenv_cmd class VirtualPythonPlugin(AirflowPlugin): name = 'virtual_python_plugin' Apache Airflow v1 1. In your command prompt, navigate to the plugins directory above. For example: cd plugins 2. Copy the contents of the following code sample and save locally as virtual_python_plugin.py. 
from airflow.plugins_manager import AirflowPlugin from airflow.operators.python_operator import PythonVirtualenvOperator def _generate_virtualenv_cmd(self, tmp_dir): cmd = ['python3','/usr/local/airflow/.local/lib/python3.7/site-packages/ virtualenv', tmp_dir] if self.system_site_packages: cmd.append('--system-site-packages') if self.python_version is not None: cmd.append('--python=python{}'.format(self.python_version)) return cmd PythonVirtualenvOperator._generate_virtualenv_cmd=_generate_virtualenv_cmd Custom plugin sample code 324 Amazon Managed Workflows for Apache Airflow User Guide class EnvVarPlugin(AirflowPlugin): name = 'virtual_python_plugin' Plugins.zip The following steps show how to create the plugins.zip. 1. In your command prompt, navigate to the directory containing virtual_python_plugin.py above. For example: cd plugins 2. Zip the contents within your plugins folder. zip plugins.zip virtual_python_plugin.py Code sample The following steps describe how to create the DAG code for the custom plugin. Apache Airflow v2 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as virtualenv_test.py. """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. Plugins.zip 325 Amazon Managed Workflows for Apache Airflow User Guide THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow import DAG from airflow.operators.python import PythonVirtualenvOperator from airflow.utils.dates import days_ago import os os.environ["PATH"] = os.getenv("PATH") + ":/usr/local/airflow/.local/bin" def virtualenv_fn(): import boto3 print("boto3 version ",boto3.__version__) with DAG(dag_id="virtualenv_test", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: virtualenv_task = PythonVirtualenvOperator( task_id="virtualenv_task", python_callable=virtualenv_fn, requirements=["boto3>=1.17.43"], system_site_packages=False, dag=dag, ) Apache Airflow v1 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as virtualenv_test.py. """ Copyright Amazon.com, Inc. or |
amazon-mwaa-user-guide-096 | amazon-mwaa-user-guide.pdf | 96 | OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow import DAG from airflow.operators.python import PythonVirtualenvOperator from airflow.utils.dates import days_ago import os os.environ["PATH"] = os.getenv("PATH") + ":/usr/local/airflow/.local/bin" def virtualenv_fn(): import boto3 print("boto3 version ",boto3.__version__) with DAG(dag_id="virtualenv_test", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: virtualenv_task = PythonVirtualenvOperator( task_id="virtualenv_task", python_callable=virtualenv_fn, requirements=["boto3>=1.17.43"], system_site_packages=False, dag=dag, ) Apache Airflow v1 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as virtualenv_test.py. """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Code sample 326 Amazon Managed Workflows for Apache Airflow User Guide Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow import DAG from airflow.operators.python_operator import PythonVirtualenvOperator from airflow.utils.dates import days_ago import os os.environ["PATH"] = os.getenv("PATH") + ":/usr/local/airflow/.local/bin" def virtualenv_fn(): import boto3 print("boto3 version ",boto3.__version__) with DAG(dag_id="virtualenv_test", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: virtualenv_task = PythonVirtualenvOperator( task_id="virtualenv_task", python_callable=virtualenv_fn, requirements=["boto3>=1.17.43"], system_site_packages=False, dag=dag, ) Airflow configuration options If you're using Apache Airflow v2, add core.lazy_load_plugins : False as an Apache Airflow configuration option. To learn more, see Using configuration options to load plugins in 2. Airflow configuration options 327 Amazon Managed Workflows for Apache Airflow User Guide What's next? • Learn how to upload the requirements.txt file in this example to your Amazon S3 bucket in Installing Python dependencies. • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. • Learn more about how to upload the plugins.zip file in this example to your Amazon S3 bucket in Installing custom plugins. Invoking DAGs with a Lambda function The following code example uses an AWS Lambda function to get an Apache Airflow CLI token and invoke a directed acyclic graph (DAG) in an Amazon MWAA environment. 
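The Airflow configuration option mentioned above (core.lazy_load_plugins : False) is normally set on the environment through the Amazon MWAA console. It can also be applied programmatically; the following boto3 sketch is not part of the guide, and the environment name and Region are placeholders.

import boto3

mwaa = boto3.client("mwaa", region_name="us-west-2")

# Apply the configuration option that the plugin samples above rely on.
mwaa.update_environment(
    Name="MyAirflowEnvironment",  # placeholder environment name
    AirflowConfigurationOptions={"core.lazy_load_plugins": "False"},
)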
Topics • Version • Prerequisites • Permissions • Dependencies • Code example Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use this code example, you must: • Use the public network access mode for your Amazon MWAA environment. • Have a Lambda function using the latest Python runtime. What's next? 328 Amazon Managed Workflows for Apache Airflow User Guide Note If the Lambda function and your Amazon MWAA environment are in the same VPC, you can use this code on a private network. For this configuration, the Lambda function's execution role needs permission to call the Amazon Elastic Compute Cloud (Amazon EC2) CreateNetworkInterface API operation. You can provide this permission using the AWSLambdaVPCAccessExecutionRole AWS managed policy. Permissions To use the code example on this page, your Amazon MWAA environment's execution role needs access to perform the airflow:CreateCliToken action. You can provide this permission using the AmazonMWAAAirflowCliAccess AWS managed policy: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "airflow:CreateCliToken" ], "Resource": "*" } ] } For more information, see Apache Airflow CLI policy: AmazonMWAAAirflowCliAccess. Dependencies • To use this code example with Apache Airflow v2, no additional dependencies are required. The code uses the Apache Airflow v2 base install on your environment. Code example 1. Open the AWS Lambda console at https://console.aws.amazon.com/lambda/. 2. Choose your Lambda function from the Functions list. Permissions 329 Amazon Managed Workflows for Apache Airflow User Guide 3. On the function page, copy the following code and replace the following with the names of your resources: • YOUR_ENVIRONMENT_NAME – The name of your Amazon MWAA environment. • YOUR_DAG_NAME – The name of the DAG that you want to invoke. import boto3 import http.client import base64 import ast mwaa_env_name = 'YOUR_ENVIRONMENT_NAME' dag_name = 'YOUR_DAG_NAME' mwaa_cli_command = 'dags trigger' client = boto3.client('mwaa') def lambda_handler(event, context): # get web token mwaa_cli_token = client.create_cli_token( Name=mwaa_env_name ) conn = http.client.HTTPSConnection(mwaa_cli_token['WebServerHostname']) payload = mwaa_cli_command + " " + dag_name headers = { 'Authorization': 'Bearer ' + mwaa_cli_token['CliToken'], 'Content-Type': 'text/plain' } conn.request("POST", "/aws_mwaa/cli", payload, headers) res = conn.getresponse() data = res.read() dict_str = data.decode("UTF-8") mydata = ast.literal_eval(dict_str) return |
amazon-mwaa-user-guide-097 | amazon-mwaa-user-guide.pdf | 97 | the names of your resources: • YOUR_ENVIRONMENT_NAME – The name of your Amazon MWAA environment. • YOUR_DAG_NAME – The name of the DAG that you want to invoke. import boto3 import http.client import base64 import ast mwaa_env_name = 'YOUR_ENVIRONMENT_NAME' dag_name = 'YOUR_DAG_NAME' mwaa_cli_command = 'dags trigger' client = boto3.client('mwaa') def lambda_handler(event, context): # get web token mwaa_cli_token = client.create_cli_token( Name=mwaa_env_name ) conn = http.client.HTTPSConnection(mwaa_cli_token['WebServerHostname']) payload = mwaa_cli_command + " " + dag_name headers = { 'Authorization': 'Bearer ' + mwaa_cli_token['CliToken'], 'Content-Type': 'text/plain' } conn.request("POST", "/aws_mwaa/cli", payload, headers) res = conn.getresponse() data = res.read() dict_str = data.decode("UTF-8") mydata = ast.literal_eval(dict_str) return base64.b64decode(mydata['stdout']) 4. Choose Deploy. 5. Choose Test to invoke your function using the Lambda console. 6. To verify that your Lambda successfully invoked your DAG, use the Amazon MWAA console to navigate to your environment's Apache Airflow UI, then do the following: a. On the DAGs page, locate your new target DAG in the list of DAGs. Code example 330 Amazon Managed Workflows for Apache Airflow User Guide b. Under Last Run, check the timestamp for the latest DAG run. This timestamp should closely match the latest timestamp for invoke_dag in your other environment. c. Under Recent Tasks, check that the last run was successful. Invoking DAGs in different Amazon MWAA environments The following code example creates an Apache Airflow CLI token. The code then uses a directed acyclic graph (DAG) in one Amazon MWAA environment to invoke a DAG in a different Amazon MWAA environment. Topics • Version • Prerequisites • Permissions • Dependencies • Code example Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the code example on this page, you need the following: • Two Amazon MWAA environments with public network web server access, including your current environment. • A sample DAG uploaded to your target environment's Amazon Simple Storage Service (Amazon S3) bucket. Invoking DAGs in different environments 331 Amazon Managed Workflows for Apache Airflow User Guide Permissions To use the code example on this page, your environment's execution role must have permission to create an Apache Airflow CLI token. You can attach the AWS managed policy AmazonMWAAAirflowCliAccess to grant this permission. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "airflow:CreateCliToken" ], "Resource": "*" } ] For more information, see Apache Airflow CLI policy: AmazonMWAAAirflowCliAccess. Dependencies • To use this code example with Apache Airflow v2, no additional dependencies are required. The code uses the Apache Airflow v2 base install on your environment. Code example The following code example assumes that you're using a DAG in your current environment to invoke a DAG in another environment. 1. In your terminal, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the content of the following code example and save it locally as invoke_dag.py. Replace the following values with your information. • your-new-environment-name – The name of the other environment where you want to invoke the DAG. 
Permissions 332 Amazon Managed Workflows for Apache Airflow User Guide • your-target-dag-id – The ID of the DAG in the other environment that you want to invoke. from airflow.decorators import dag, task import boto3 from datetime import datetime, timedelta import os, requests DAG_ID = os.path.basename(__file__).replace(".py", "") @task() def invoke_dag_task(**kwargs): client = boto3.client('mwaa') token = client.create_cli_token(Name='your-new-environment-name') url = f"https://{token['WebServerHostname']}/aws_mwaa/cli" body = 'dags trigger your-target-dag-id' headers = { 'Authorization' : 'Bearer ' + token['CliToken'], 'Content-Type': 'text/plain' } requests.post(url, data=body, headers=headers) @dag( dag_id=DAG_ID, schedule_interval=None, start_date=datetime(2022, 1, 1), dagrun_timeout=timedelta(minutes=60), catchup=False ) def invoke_dag(): t = invoke_dag_task() invoke_dag_test = invoke_dag() 3. Run the following AWS CLI command to copy the DAG to your environment's bucket, then trigger the DAG using the Apache Airflow UI. $ aws s3 cp your-dag.py s3://your-environment-bucket/dags/ 4. If the DAG runs successfully, you'll see output similar to the following in the task logs for invoke_dag_task. Code example 333 Amazon Managed Workflows for Apache Airflow User Guide [2022-01-01, 12:00:00 PDT] {{python.py:152}} INFO - Done. Returned value was: None [2022-01-01, 12:00:00 PDT] {{taskinstance.py:1280}} INFO - Marking task as SUCCESS. dag_id=invoke_dag, task_id=invoke_dag_task, execution_date=20220101T120000, start_date=20220101T120000, end_date=20220101T120000 [2022-01-01, 12:00:00 PDT] {{local_task_job.py:154}} INFO - Task exited with return code 0 [2022-01-01, 12:00:00 PDT] {{local_task_job.py:264}} INFO - 0 downstream tasks scheduled from follow-on schedule check To verify that your DAG was successfully invoked, navigate to the Apache Airflow UI for your new environment, then do the following: a. On the DAGs page, locate your new target DAG in the list of DAGs. b. Under Last Run, check the timestamp for the latest DAG run. This timestamp should closely match the latest timestamp for invoke_dag in your other environment. c. Under Recent Tasks, check that the last run was successful. Using Amazon MWAA with Amazon RDS for Microsoft SQL Server You can use Amazon Managed Workflows for Apache |
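The invoke_dag_task shown above sends the POST request but never checks the HTTP response. A small variant — not part of the guide — that surfaces failures can look like the following; the environment and DAG names are the same placeholders used above.

import boto3
import requests

def trigger_remote_dag(env_name="your-new-environment-name", dag_id="your-target-dag-id"):
    # Same call pattern as invoke_dag_task above, but raises on HTTP errors.
    client = boto3.client("mwaa")
    token = client.create_cli_token(Name=env_name)
    url = f"https://{token['WebServerHostname']}/aws_mwaa/cli"
    headers = {
        "Authorization": "Bearer " + token["CliToken"],
        "Content-Type": "text/plain",
    }
    response = requests.post(url, data="dags trigger " + dag_id, headers=headers)
    response.raise_for_status()
    return response.status_code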
amazon-mwaa-user-guide-098 | amazon-mwaa-user-guide.pdf | 98 | 0 downstream tasks scheduled from follow-on schedule check To verify that your DAG was successfully invoked, navigate to the Apache Airflow UI for your new environment, then do the following: a. On the DAGs page, locate your new target DAG in the list of DAGs. b. Under Last Run, check the timestamp for the latest DAG run. This timestamp should closely match the latest timestamp for invoke_dag in your other environment. c. Under Recent Tasks, check that the last run was successful. Using Amazon MWAA with Amazon RDS for Microsoft SQL Server You can use Amazon Managed Workflows for Apache Airflow to connect to an RDS for SQL Server. The following sample code uses DAGs on an Amazon Managed Workflows for Apache Airflow environment to connect to and execute queries on an Amazon RDS for Microsoft SQL Server. Topics • Version • Prerequisites • Dependencies • Apache Airflow v2 connection • Code sample • What's next? Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. Amazon RDS server 334 Amazon Managed Workflows for Apache Airflow User Guide • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. • Amazon MWAA and the RDS for SQL Server are running in the same Amazon VPC/ • VPC security groups of Amazon MWAA and the server are configured with the following connections: • An inbound rule for the port 1433 open for Amazon RDS in Amazon MWAA's security group • Or an outbound rule for the port of 1433 open from Amazon MWAA to RDS • Apache Airflow Connection for RDS for SQL Server reflects the hostname, port, username and password from the Amazon RDS SQL server database created in previous process. Dependencies To use the sample code in this section, add the following dependency to your requirements.txt. To learn more, see Installing Python dependencies Apache Airflow v2 apache-airflow-providers-microsoft-mssql==1.0.1 apache-airflow-providers-odbc==1.0.1 pymssql==2.2.1 Apache Airflow v1 apache-airflow[mssql]==1.10.12 Apache Airflow v2 connection If you're using a connection in Apache Airflow v2, ensure the Airflow connection object includes the following key-value pairs: 1. Conn Id: mssql_default Prerequisites 335 Amazon Managed Workflows for Apache Airflow User Guide 2. Conn Type: Amazon Web Services 3. Host: YOUR_DB_HOST 4. Schema: 5. Login: admin 6. Password: 7. Port: 1433 8. Extra: Code sample 1. In your command prompt, navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as sql-server.py. """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ import pymssql import logging import sys from airflow import DAG from datetime import datetime from airflow.operators.mssql_operator import MsSqlOperator from airflow.operators.python_operator import PythonOperator Code sample 336 Amazon Managed Workflows for Apache Airflow User Guide default_args = { 'owner': 'aws', 'depends_on_past': False, 'start_date': datetime(2019, 2, 20), 'provide_context': True } dag = DAG( 'mssql_conn_example', default_args=default_args, schedule_interval=None) drop_db = MsSqlOperator( task_id="drop_db", sql="DROP DATABASE IF EXISTS testdb;", mssql_conn_id="mssql_default", autocommit=True, dag=dag ) create_db = MsSqlOperator( task_id="create_db", sql="create database testdb;", mssql_conn_id="mssql_default", autocommit=True, dag=dag ) create_table = MsSqlOperator( task_id="create_table", sql="CREATE TABLE testdb.dbo.pet (name VARCHAR(20), owner VARCHAR(20));", mssql_conn_id="mssql_default", autocommit=True, dag=dag ) insert_into_table = MsSqlOperator( task_id="insert_into_table", sql="INSERT INTO testdb.dbo.pet VALUES ('Olaf', 'Disney');", mssql_conn_id="mssql_default", autocommit=True, dag=dag ) def select_pet(**kwargs): try: Code sample 337 Amazon Managed Workflows for Apache Airflow User Guide conn = pymssql.connect( server='sampledb.<xxxxxx>.<region>.rds.amazonaws.com', user='admin', password='<yoursupersecretpassword>', database='testdb' ) # Create a cursor from the connection cursor = conn.cursor() cursor.execute("SELECT * from testdb.dbo.pet") row = cursor.fetchone() if row: print(row) except: logging.error("Error when creating pymssql database connection: %s", sys.exc_info()[0]) select_query = PythonOperator( task_id='select_query', python_callable=select_pet, dag=dag, ) drop_db >> create_db >> create_table >> insert_into_table >> select_query What's next? • Learn how to upload the requirements.txt file in this example to your Amazon S3 bucket in Installing Python dependencies. • Learn how to upload the DAG code in this example to the dags folder in your Amazon |
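The select_pet function above hard-codes the server name and password in pymssql.connect. A minimal sketch — not part of the guide — that reads the same values from the mssql_default Airflow connection instead, assuming that connection is populated with the host, login, password, and port described earlier in this section:

from airflow.hooks.base_hook import BaseHook
import pymssql

def connect_from_airflow_connection(conn_id="mssql_default", database="testdb"):
    # Pull the host and credentials from the Airflow connection instead of hard-coding them.
    conn = BaseHook.get_connection(conn_id)
    return pymssql.connect(
        server=conn.host,
        user=conn.login,
        password=conn.password,
        database=database,
        port=conn.port or 1433,
    )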
amazon-mwaa-user-guide-099 | amazon-mwaa-user-guide.pdf | 99 | Managed Workflows for Apache Airflow User Guide conn = pymssql.connect( server='sampledb.<xxxxxx>.<region>.rds.amazonaws.com', user='admin', password='<yoursupersecretpassword>', database='testdb' ) # Create a cursor from the connection cursor = conn.cursor() cursor.execute("SELECT * from testdb.dbo.pet") row = cursor.fetchone() if row: print(row) except: logging.error("Error when creating pymssql database connection: %s", sys.exc_info()[0]) select_query = PythonOperator( task_id='select_query', python_callable=select_pet, dag=dag, ) drop_db >> create_db >> create_table >> insert_into_table >> select_query What's next? • Learn how to upload the requirements.txt file in this example to your Amazon S3 bucket in Installing Python dependencies. • Learn how to upload the DAG code in this example to the dags folder in your Amazon S3 bucket in Adding or updating DAGs. • Explore example scripts and other pymssql module examples. • Learn more about executing SQL code in a specific Microsoft SQL database using the mssql_operator in the Apache Airflow reference guide. What's next? 338 Amazon Managed Workflows for Apache Airflow User Guide Using Amazon MWAA with Amazon EMR The following code sample demonstrates how to enable an integration using Amazon EMR and Amazon Managed Workflows for Apache Airflow. Topics • Version • Code sample Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. Code sample """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
""" from airflow import DAG from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor from airflow.providers.amazon.aws.operators.emr import EmrCreateJobFlowOperator from airflow.utils.dates import days_ago from datetime import timedelta import os Amazon EMR integration 339 Amazon Managed Workflows for Apache Airflow User Guide DAG_ID = os.path.basename(__file__).replace(".py", "") DEFAULT_ARGS = { 'owner': 'airflow', 'depends_on_past': False, 'email': ['airflow@example.com'], 'email_on_failure': False, 'email_on_retry': False, } SPARK_STEPS = [ { 'Name': 'calculate_pi', 'ActionOnFailure': 'CONTINUE', 'HadoopJarStep': { 'Jar': 'command-runner.jar', 'Args': ['/usr/lib/spark/bin/run-example', 'SparkPi', '10'], }, } ] JOB_FLOW_OVERRIDES = { 'Name': 'my-demo-cluster', 'ReleaseLabel': 'emr-5.30.1', 'Applications': [ { 'Name': 'Spark' }, ], 'Instances': { 'InstanceGroups': [ { 'Name': "Master nodes", 'Market': 'ON_DEMAND', 'InstanceRole': 'MASTER', 'InstanceType': 'm5.xlarge', 'InstanceCount': 1, }, { 'Name': "Slave nodes", 'Market': 'ON_DEMAND', 'InstanceRole': 'CORE', 'InstanceType': 'm5.xlarge', Code sample 340 Amazon Managed Workflows for Apache Airflow User Guide 'InstanceCount': 2, } ], 'KeepJobFlowAliveWhenNoSteps': False, 'TerminationProtected': False, 'Ec2KeyName': 'mykeypair', }, 'VisibleToAllUsers': True, 'JobFlowRole': 'EMR_EC2_DefaultRole', 'ServiceRole': 'EMR_DefaultRole' } with DAG( dag_id=DAG_ID, default_args=DEFAULT_ARGS, dagrun_timeout=timedelta(hours=2), start_date=days_ago(1), schedule_interval='@once', tags=['emr'], ) as dag: cluster_creator = EmrCreateJobFlowOperator( task_id='create_job_flow', job_flow_overrides=JOB_FLOW_OVERRIDES ) step_adder = EmrAddStepsOperator( task_id='add_steps', job_flow_id="{{ task_instance.xcom_pull(task_ids='create_job_flow', key='return_value') }}", aws_conn_id='aws_default', steps=SPARK_STEPS, ) step_checker = EmrStepSensor( task_id='watch_step', job_flow_id="{{ task_instance.xcom_pull('create_job_flow', key='return_value') }}", step_id="{{ task_instance.xcom_pull(task_ids='add_steps', key='return_value')[0] }}", aws_conn_id='aws_default', ) Code sample 341 Amazon Managed Workflows for Apache Airflow User Guide cluster_creator >> step_adder >> step_checker Using Amazon MWAA with Amazon EKS The following sample demonstrates how to use Amazon Managed Workflows for Apache Airflow with Amazon EKS. Topics • Version • Prerequisites • Create a public key for Amazon EC2 • Create the cluster • Create a mwaa namespace • Create a role for the mwaa namespace • Create and attach an IAM role for the Amazon EKS cluster • Create the requirements.txt file • Create an identity mapping for Amazon EKS • Create the kubeconfig • Create a DAG • Add the DAG and kube_config.yaml to the Amazon S3 bucket • Enable and trigger the example Version • The sample code on this page can be used with Apache Airflow v1 in Python 3.7. • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the example in this topic, you'll need the following: • An Amazon MWAA environment. Amazon EKS (eksctl) 342 Amazon Managed Workflows for Apache Airflow User Guide • eksctl. To learn more, see Install eksctl. • kubectl. To learn more, see Install and Set Up kubectl. In some case this is installed with eksctl. • An EC2 key pair in the Region where you create your Amazon MWAA environment. To learn more, see Creating or importing a key pair. 
Note When you use an eksctl command, you can include a --profile to specify a profile other than the default. Create a public key for Amazon EC2 Use the following |
amazon-mwaa-user-guide-100 | amazon-mwaa-user-guide.pdf | 100 | following: • An Amazon MWAA environment. Amazon EKS (eksctl) 342 Amazon Managed Workflows for Apache Airflow User Guide • eksctl. To learn more, see Install eksctl. • kubectl. To learn more, see Install and Set Up kubectl. In some case this is installed with eksctl. • An EC2 key pair in the Region where you create your Amazon MWAA environment. To learn more, see Creating or importing a key pair. Note When you use an eksctl command, you can include a --profile to specify a profile other than the default. Create a public key for Amazon EC2 Use the following command to create a public key from your private key pair. ssh-keygen -y -f myprivatekey.pem > mypublickey.pub To learn more, see Retrieving the public key for your key pair. Create the cluster Use the following command to create the cluster. If you want a custom name for the cluster or to create it in a different Region, replace the name and Region values. You must create the cluster in the same Region where you create the Amazon MWAA environment. Replace the values for the subnets to match the subnets in your Amazon VPC network that you use for Amazon MWAA. Replace the value for the ssh-public-key to match the key you use. You can use an existing key from Amazon EC2 that is in the same Region, or create a new key in the same Region where you create your Amazon MWAA environment. eksctl create cluster \ --name mwaa-eks \ --region us-west-2 \ --version 1.18 \ --nodegroup-name linux-nodes \ --nodes 3 \ --nodes-min 1 \ --nodes-max 4 \ --with-oidc \ --ssh-access \ Create a public key for Amazon EC2 343 Amazon Managed Workflows for Apache Airflow User Guide --ssh-public-key MyPublicKey \ --managed \ --vpc-public-subnets "subnet-11111111111111111, subnet-2222222222222222222" \ --vpc-private-subnets "subnet-33333333333333333, subnet-44444444444444444" It takes some time to complete creating the cluster. Once complete, you can verify that the cluster was created successfully and has the IAM OIDC Provider configured by using the following command: eksctl utils associate-iam-oidc-provider \ --region us-west-2 \ --cluster mwaa-eks \ --approve Create a mwaa namespace After confirming that the cluster was successfully created, use the following command to create a namespace for the pods. kubectl create namespace mwaa Create a role for the mwaa namespace After you create the namespace, create a role and role-binding for an Amazon MWAA user on EKS that can run pods in a the MWAA namespace. If you used a different name for the namespace, replace mwaa in -n mwaa with the name that you used. cat << EOF | kubectl apply -f - -n mwaa kind: Role apiVersion: rbac.authorization.k8s.io/v1 metadata: name: mwaa-role rules: - apiGroups: - "" - "apps" - "batch" - "extensions" resources: - "jobs" - "pods" Create a mwaa namespace 344 Amazon Managed Workflows for Apache Airflow User Guide - "pods/attach" - "pods/exec" - "pods/log" - "pods/portforward" - "secrets" - "services" verbs: - "create" - "delete" - "describe" - "get" - "list" - "patch" - "update" --- kind: RoleBinding apiVersion: rbac.authorization.k8s.io/v1 metadata: name: mwaa-role-binding subjects: - kind: User name: mwaa-service roleRef: kind: Role name: mwaa-role apiGroup: rbac.authorization.k8s.io EOF Confirm that the new role can access the Amazon EKS cluster by running the following command. 
Be sure to use the correct name if you did not use mwaa: kubectl get pods -n mwaa --as mwaa-service You should see a message returned that says: No resources found in mwaa namespace. Create and attach an IAM role for the Amazon EKS cluster You must create an IAM role and then bind it to the Amazon EKS (k8s) cluster so that it can be used for authentication through IAM. The role is used only to log in to the cluster, and does not have any permissions for the console or API calls. Create and attach an IAM role for the Amazon EKS cluster 345 Amazon Managed Workflows for Apache Airflow User Guide Create a new role for the Amazon MWAA environment using the steps in Amazon MWAA execution role. However, instead of creating and attaching the policies described in that topic, attach the following policy: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "airflow:PublishMetrics", "Resource": "arn:aws:airflow:${MWAA_REGION}:${ACCOUNT_NUMBER}:environment/ ${MWAA_ENV_NAME}" }, { "Effect": "Deny", "Action": "s3:ListAllMyBuckets", "Resource": [ "arn:aws:s3:::{MWAA_S3_BUCKET}", "arn:aws:s3:::{MWAA_S3_BUCKET}/*" ] }, { "Effect": "Allow", "Action": [ "s3:GetObject*", "s3:GetBucket*", "s3:List*" ], "Resource": [ "arn:aws:s3:::{MWAA_S3_BUCKET}", "arn:aws:s3:::{MWAA_S3_BUCKET}/*" ] }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:CreateLogGroup", "logs:PutLogEvents", "logs:GetLogEvents", "logs:GetLogRecord", "logs:GetLogGroupFields", "logs:GetQueryResults", Create and attach an IAM role for the Amazon EKS cluster 346 Amazon Managed Workflows for Apache Airflow User Guide "logs:DescribeLogGroups" ], "Resource": [ "arn:aws:logs:${MWAA_REGION}:${ACCOUNT_NUMBER}:log-group:airflow- ${MWAA_ENV_NAME}-*" ] }, { "Effect": "Allow", "Action": "cloudwatch:PutMetricData", "Resource": "*" }, { "Effect": "Allow", "Action": [ "sqs:ChangeMessageVisibility", "sqs:DeleteMessage", "sqs:GetQueueAttributes", "sqs:GetQueueUrl", "sqs:ReceiveMessage", "sqs:SendMessage" ], "Resource": "arn:aws:sqs:${MWAA_REGION}:*:airflow-celery-*" }, { "Effect": "Allow", "Action": [ "kms:Decrypt", "kms:DescribeKey", "kms:GenerateDataKey*", "kms:Encrypt" ], "NotResource": "arn:aws:kms:*:${ACCOUNT_NUMBER}:key/*", "Condition": { "StringLike": { |
amazon-mwaa-user-guide-101 | amazon-mwaa-user-guide.pdf | 101 | "Resource": [ "arn:aws:s3:::{MWAA_S3_BUCKET}", "arn:aws:s3:::{MWAA_S3_BUCKET}/*" ] }, { "Effect": "Allow", "Action": [ "s3:GetObject*", "s3:GetBucket*", "s3:List*" ], "Resource": [ "arn:aws:s3:::{MWAA_S3_BUCKET}", "arn:aws:s3:::{MWAA_S3_BUCKET}/*" ] }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:CreateLogGroup", "logs:PutLogEvents", "logs:GetLogEvents", "logs:GetLogRecord", "logs:GetLogGroupFields", "logs:GetQueryResults", Create and attach an IAM role for the Amazon EKS cluster 346 Amazon Managed Workflows for Apache Airflow User Guide "logs:DescribeLogGroups" ], "Resource": [ "arn:aws:logs:${MWAA_REGION}:${ACCOUNT_NUMBER}:log-group:airflow- ${MWAA_ENV_NAME}-*" ] }, { "Effect": "Allow", "Action": "cloudwatch:PutMetricData", "Resource": "*" }, { "Effect": "Allow", "Action": [ "sqs:ChangeMessageVisibility", "sqs:DeleteMessage", "sqs:GetQueueAttributes", "sqs:GetQueueUrl", "sqs:ReceiveMessage", "sqs:SendMessage" ], "Resource": "arn:aws:sqs:${MWAA_REGION}:*:airflow-celery-*" }, { "Effect": "Allow", "Action": [ "kms:Decrypt", "kms:DescribeKey", "kms:GenerateDataKey*", "kms:Encrypt" ], "NotResource": "arn:aws:kms:*:${ACCOUNT_NUMBER}:key/*", "Condition": { "StringLike": { "kms:ViaService": [ "sqs.${MWAA_REGION}.amazonaws.com" ] } } }, { "Effect": "Allow", "Action": [ Create and attach an IAM role for the Amazon EKS cluster 347 Amazon Managed Workflows for Apache Airflow User Guide "eks:DescribeCluster" ], "Resource": "arn:aws:eks:${MWAA_REGION}:${ACCOUNT_NUMBER}:cluster/ ${EKS_CLUSTER_NAME}" } ] } After you create role, edit your Amazon MWAA environment to use the role you created as the execution role for the environment. To change the role, edit the environment to use. You select the execution role under Permissions. Known issues: • There is a known issue with role ARNs with subpaths not being able to authenticate with Amazon EKS. The workaround for this is to create the service role manually rather than using the one created by Amazon MWAA itself. To learn more, see Roles with paths do not work when the path is included in their ARN in the aws-auth configmap • If Amazon MWAA service listing is not available in IAM you need to choose an alternate service policy, such as Amazon EC2, and then update the role’s trust policy to match the following: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": [ "airflow-env.amazonaws.com", "airflow.amazonaws.com" ] }, "Action": "sts:AssumeRole" } ] } To learn more, see How to use trust policies with IAM roles. Create and attach an IAM role for the Amazon EKS cluster 348 Amazon Managed Workflows for Apache Airflow User Guide Create the requirements.txt file To use the sample code in this section, ensure you've added one of the following database options to your requirements.txt. To learn more, see Installing Python dependencies. Apache Airflow v2 kubernetes apache-airflow[cncf.kubernetes]==3.0.0 Apache Airflow v1 awscli kubernetes==12.0.1 Create an identity mapping for Amazon EKS Use the ARN for the role you created in the following command to create an identity mapping for Amazon EKS. Change the Region your-region to the Region where you created the environment. Replace the ARN for the role, and finally, replace mwaa-execution-role with your environment's execution role. 
eksctl create iamidentitymapping \ --region your-region \ --cluster mwaa-eks \ --arn arn:aws:iam::111222333444:role/mwaa-execution-role \ --username mwaa-service Create the kubeconfig Use the following command to create the kubeconfig: aws eks update-kubeconfig \ --region us-west-2 \ --kubeconfig ./kube_config.yaml \ --name mwaa-eks \ --alias aws Create the requirements.txt file 349 Amazon Managed Workflows for Apache Airflow User Guide If you used a specific profile when you ran update-kubeconfig you need to remove the env: section added to the kube_config.yaml file so that it works correctly with Amazon MWAA. To do so, delete the following from the file and then save it: env: - name: AWS_PROFILE value: profile_name Create a DAG Use the following code example to create a Python file, such as mwaa_pod_example.py for the DAG. Apache Airflow v2 """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow import DAG from datetime import datetime from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator default_args = { 'owner': 'aws', 'depends_on_past': False, 'start_date': datetime(2019, 2, 20), 'provide_context': True } dag = DAG( Create a DAG 350 Amazon Managed Workflows for Apache Airflow User Guide 'kubernetes_pod_example', default_args=default_args, schedule_interval=None) #use a kube_config stored in s3 dags folder for now kube_config_path = '/usr/local/airflow/dags/kube_config.yaml' podRun = KubernetesPodOperator( namespace="mwaa", image="ubuntu:18.04", cmds=["bash"], arguments=["-c", "ls"], labels={"foo": "bar"}, name="mwaa-pod-test", task_id="pod-task", get_logs=True, dag=dag, is_delete_operator_pod=False, config_file=kube_config_path, in_cluster=False, cluster_context='aws' ) Apache Airflow v1 """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of |
amazon-mwaa-user-guide-102 | amazon-mwaa-user-guide.pdf | 102 | DAG from datetime import datetime from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator default_args = { 'owner': 'aws', 'depends_on_past': False, 'start_date': datetime(2019, 2, 20), 'provide_context': True } dag = DAG( Create a DAG 350 Amazon Managed Workflows for Apache Airflow User Guide 'kubernetes_pod_example', default_args=default_args, schedule_interval=None) #use a kube_config stored in s3 dags folder for now kube_config_path = '/usr/local/airflow/dags/kube_config.yaml' podRun = KubernetesPodOperator( namespace="mwaa", image="ubuntu:18.04", cmds=["bash"], arguments=["-c", "ls"], labels={"foo": "bar"}, name="mwaa-pod-test", task_id="pod-task", get_logs=True, dag=dag, is_delete_operator_pod=False, config_file=kube_config_path, in_cluster=False, cluster_context='aws' ) Apache Airflow v1 """ Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ from airflow import DAG from datetime import datetime from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator default_args = { 'owner': 'aws', Create a DAG 351 Amazon Managed Workflows for Apache Airflow User Guide 'depends_on_past': False, 'start_date': datetime(2019, 2, 20), 'provide_context': True } dag = DAG( 'kubernetes_pod_example', default_args=default_args, schedule_interval=None) #use a kube_config stored in s3 dags folder for now kube_config_path = '/usr/local/airflow/dags/kube_config.yaml' podRun = KubernetesPodOperator( namespace="mwaa", image="ubuntu:18.04", cmds=["bash"], arguments=["-c", "ls"], labels={"foo": "bar"}, name="mwaa-pod-test", task_id="pod-task", get_logs=True, dag=dag, is_delete_operator_pod=False, config_file=kube_config_path, in_cluster=False, cluster_context='aws' ) Add the DAG and kube_config.yaml to the Amazon S3 bucket Put the DAG you created and the kube_config.yaml file into the Amazon S3 bucket for the Amazon MWAA environment. You can put files into your bucket using either the Amazon S3 console or the AWS Command Line Interface. Enable and trigger the example In Apache Airflow, enable the example and then trigger it. After it runs and completes successfully, use the following command to verify the pod: kubectl get pods -n mwaa Add the DAG and kube_config.yaml to the Amazon S3 bucket 352 Amazon Managed Workflows for Apache Airflow User Guide You should see output similar to the following: NAME READY STATUS RESTARTS AGE mwaa-pod-test-aa11bb22cc3344445555666677778888 0/1 Completed 0 2m23s You can then verify the output of the pod with the following command. 
Replace the name value with the value returned from the previous command: kubectl logs -n mwaa mwaa-pod-test-aa11bb22cc3344445555666677778888 Connecting to Amazon ECS using the ECSOperator The topic describes how you can use the ECSOperator to connect to an Amazon Elastic Container Service (Amazon ECS) container from Amazon MWAA. In the following steps, you'll add the required permissions to your environment's execution role, use a AWS CloudFormation template to create an Amazon ECS Fargate cluster, and finally create and upload a DAG that connects to your new cluster. Topics • Version • Prerequisites • Permissions • Create an Amazon ECS cluster • Code sample Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites To use the sample code on this page, you'll need the following: • An Amazon MWAA environment. Using the ECSOperator 353 Amazon Managed Workflows for Apache Airflow User Guide Permissions • The execution role for your environment needs permission to run tasks in Amazon ECS. You can either attach the AmazonECS_FullAccess AWS-managed policy to your execution role, or create and attach the following policy to your execution role. { "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ "ecs:RunTask", "ecs:DescribeTasks" ], "Resource": "*" }, { "Action": "iam:PassRole", "Effect": "Allow", "Resource": [ "*" ], "Condition": { "StringLike": { "iam:PassedToService": "ecs-tasks.amazonaws.com" } } } ] } • In addition to adding the required premissions to run tasks in Amazon ECS, you must also modify the CloudWatch Logs policy statement in your Amazon MWAA execution role to allow access to the Amazon ECS task log group as shown in the following. The Amazon ECS log group is created by the AWS CloudFormation template in the section called “Create an Amazon ECS cluster”. { "Effect": "Allow", "Action": [ "logs:CreateLogStream", Permissions 354 Amazon Managed Workflows for Apache Airflow User Guide "logs:CreateLogGroup", "logs:PutLogEvents", "logs:GetLogEvents", "logs:GetLogRecord", "logs:GetLogGroupFields", "logs:GetQueryResults" ], "Resource": [ "arn:aws:logs:region:account-id:log-group:airflow-environment-name-*", "arn:aws:logs:*:*:log-group:ecs-mwaa-group:*" ] } For more information about the Amazon MWAA execution role, and how to attach a policy, see Execution role. Create an Amazon ECS cluster Using the following AWS CloudFormation template, you will build an Amazon ECS Fargate cluster to use with your Amazon MWAA workflow. For more information, |
amazon-mwaa-user-guide-103 | amazon-mwaa-user-guide.pdf | 103 | in the following. The Amazon ECS log group is created by the AWS CloudFormation template in the section called “Create an Amazon ECS cluster”. { "Effect": "Allow", "Action": [ "logs:CreateLogStream", Permissions 354 Amazon Managed Workflows for Apache Airflow User Guide "logs:CreateLogGroup", "logs:PutLogEvents", "logs:GetLogEvents", "logs:GetLogRecord", "logs:GetLogGroupFields", "logs:GetQueryResults" ], "Resource": [ "arn:aws:logs:region:account-id:log-group:airflow-environment-name-*", "arn:aws:logs:*:*:log-group:ecs-mwaa-group:*" ] } For more information about the Amazon MWAA execution role, and how to attach a policy, see Execution role. Create an Amazon ECS cluster Using the following AWS CloudFormation template, you will build an Amazon ECS Fargate cluster to use with your Amazon MWAA workflow. For more information, see Creating a task definition in the Amazon Elastic Container Service Developer Guide. 1. Create a JSON file with the following code and save it as ecs-mwaa-cfn.json. { "AWSTemplateFormatVersion": "2010-09-09", "Description": "This template deploys an ECS Fargate cluster with an Amazon Linux image as a test for MWAA.", "Parameters": { "VpcId": { "Type": "AWS::EC2::VPC::Id", "Description": "Select a VPC that allows instances access to ECR, as used with MWAA." }, "SubnetIds": { "Type": "List<AWS::EC2::Subnet::Id>", "Description": "Select at two private subnets in your selected VPC, as used with MWAA." }, "SecurityGroups": { "Type": "List<AWS::EC2::SecurityGroup::Id>", Create an Amazon ECS cluster 355 Amazon Managed Workflows for Apache Airflow User Guide "Description": "Select at least one security group in your selected VPC, as used with MWAA." } }, "Resources": { "Cluster": { "Type": "AWS::ECS::Cluster", "Properties": { "ClusterName": { "Fn::Sub": "${AWS::StackName}-cluster" } } }, "LogGroup": { "Type": "AWS::Logs::LogGroup", "Properties": { "LogGroupName": { "Ref": "AWS::StackName" }, "RetentionInDays": 30 } }, "ExecutionRole": { "Type": "AWS::IAM::Role", "Properties": { "AssumeRolePolicyDocument": { "Statement": [ { "Effect": "Allow", "Principal": { "Service": "ecs-tasks.amazonaws.com" }, "Action": "sts:AssumeRole" } ] }, "ManagedPolicyArns": [ "arn:aws:iam::aws:policy/service-role/ AmazonECSTaskExecutionRolePolicy" ] } }, "TaskDefinition": { "Type": "AWS::ECS::TaskDefinition", Create an Amazon ECS cluster 356 Amazon Managed Workflows for Apache Airflow User Guide "Properties": { "Family": { "Fn::Sub": "${AWS::StackName}-task" }, "Cpu": 2048, "Memory": 4096, "NetworkMode": "awsvpc", "ExecutionRoleArn": { "Ref": "ExecutionRole" }, "ContainerDefinitions": [ { "Name": { "Fn::Sub": "${AWS::StackName}-container" }, "Image": "137112412989.dkr.ecr.us-east-1.amazonaws.com/ amazonlinux:latest", "PortMappings": [ { "Protocol": "tcp", "ContainerPort": 8080, "HostPort": 8080 } ], "LogConfiguration": { "LogDriver": "awslogs", "Options": { "awslogs-region": { "Ref": "AWS::Region" }, "awslogs-group": { "Ref": "LogGroup" }, "awslogs-stream-prefix": "ecs" } } } ], "RequiresCompatibilities": [ "FARGATE" ] } }, "Service": { Create an Amazon ECS cluster 357 Amazon Managed Workflows for Apache Airflow User Guide "Type": "AWS::ECS::Service", "Properties": { "ServiceName": { "Fn::Sub": "${AWS::StackName}-service" }, "Cluster": { "Ref": "Cluster" }, "TaskDefinition": { "Ref": "TaskDefinition" }, "DesiredCount": 1, "LaunchType": "FARGATE", "PlatformVersion": "1.3.0", "NetworkConfiguration": { "AwsvpcConfiguration": { 
"AssignPublicIp": "ENABLED", "Subnets": { "Ref": "SubnetIds" }, "SecurityGroups": { "Ref": "SecurityGroups" } } } } } } } 2. In your command prompt, use the following AWS CLI command to create a new stack. You must replace the values SecurityGroups and SubnetIds with values for your Amazon MWAA environment's security groups and subnets. $ aws cloudformation create-stack \ --stack-name my-ecs-stack --template-body file://ecs-mwaa-cfn.json \ --parameters ParameterKey=SecurityGroups,ParameterValue=your-mwaa-security-group \ ParameterKey=SubnetIds,ParameterValue=your-mwaa-subnet-1\\,your-mwaa-subnet-1 \ --capabilities CAPABILITY_IAM Create an Amazon ECS cluster 358 Amazon Managed Workflows for Apache Airflow User Guide Alternatively, you can use the following shell script. The script retrieves the required values for your environment's security groups, and subnets using the get-environment AWS CLI command, then creates the stack accordingly. To run the script, do the following. a. Copy, and save the script as ecs-stack-helper.sh in the same directory as your AWS CloudFormation template. #!/bin/bash joinByString() { local separator="$1" shift local first="$1" shift printf "%s" "$first" "${@/#/$separator}" } response=$(aws mwaa get-environment --name $1) securityGroupId=$(echo "$response" | jq -r '.Environment.NetworkConfiguration.SecurityGroupIds[]') subnetIds=$(joinByString '\,' $(echo "$response" | jq -r '.Environment.NetworkConfiguration.SubnetIds[]')) aws cloudformation create-stack --stack-name $2 --template-body file://ecs- cfn.json \ --parameters ParameterKey=SecurityGroups,ParameterValue=$securityGroupId \ ParameterKey=SubnetIds,ParameterValue=$subnetIds \ --capabilities CAPABILITY_IAM b. Run the script using the following commands. Replace environment-name and stack- name with your information. $ chmod +x ecs-stack-helper.sh $ ./ecs-stack-helper.bash environment-name stack-name If successful, you'll see the following output displaying your new AWS CloudFormation stack ID. { Create an Amazon ECS cluster 359 Amazon Managed Workflows for Apache Airflow User Guide "StackId": "arn:aws:cloudformation:us-west-2:123456789012:stack/my-ecs- stack/123456e7-8ab9-01cd-b2fb-36cce63786c9" } After your AWS CloudFormation stack is completed and AWS has provisioned your Amazon ECS resources, you're ready to create and upload your DAG. Code sample 1. Open a command prompt, and navigate to the directory where your DAG code is stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as mwaa-ecs-operator.py, then upload your new DAG to Amazon S3. from http import client from airflow import DAG from airflow.providers.amazon.aws.operators.ecs import ECSOperator from airflow.utils.dates import days_ago import boto3 CLUSTER_NAME="mwaa-ecs-test-cluster" #Replace value for CLUSTER_NAME with your information. CONTAINER_NAME="mwaa-ecs-test-container" #Replace value for CONTAINER_NAME with your information. 
amazon-mwaa-user-guide-104 | amazon-mwaa-user-guide.pdf | 104 | stored. For example: cd dags 2. Copy the contents of the following code sample and save locally as mwaa-ecs-operator.py, then upload your new DAG to Amazon S3. from http import client from airflow import DAG from airflow.providers.amazon.aws.operators.ecs import ECSOperator from airflow.utils.dates import days_ago import boto3 CLUSTER_NAME="mwaa-ecs-test-cluster" #Replace value for CLUSTER_NAME with your information. CONTAINER_NAME="mwaa-ecs-test-container" #Replace value for CONTAINER_NAME with your information. LAUNCH_TYPE="FARGATE" with DAG( dag_id = "ecs_fargate_dag", schedule_interval=None, catchup=False, start_date=days_ago(1) ) as dag: client=boto3.client('ecs') services=client.list_services(cluster=CLUSTER_NAME,launchType=LAUNCH_TYPE) service=client.describe_services(cluster=CLUSTER_NAME,services=services['serviceArns']) ecs_operator_task = ECSOperator( Code sample 360 Amazon Managed Workflows for Apache Airflow User Guide task_id = "ecs_operator_task", dag=dag, cluster=CLUSTER_NAME, task_definition=service['services'][0]['taskDefinition'], launch_type=LAUNCH_TYPE, overrides={ "containerOverrides":[ { "name":CONTAINER_NAME, "command":["ls", "-l", "/"], }, ], }, network_configuration=service['services'][0]['networkConfiguration'], awslogs_group="mwaa-ecs-zero", awslogs_stream_prefix=f"ecs/{CONTAINER_NAME}", ) Note In the example DAG, for awslogs_group, you might need to modify the log group with the name for your Amazon ECS task log group. The example assumes a log group named mwaa-ecs-zero. For awslogs_stream_prefix, use the Amazon ECS task log stream prefix. The example assumes a log stream prefix, ecs. 3. Run the following AWS CLI command to copy the DAG to your environment's bucket, then trigger the DAG using the Apache Airflow UI. $ aws s3 cp your-dag.py s3://your-environment-bucket/dags/ 4. If successful, you'll see output similar to the following in the task logs for ecs_operator_task in the ecs_fargate_dag DAG: [2022-01-01, 12:00:00 UTC] {{ecs.py:300}} INFO - Running ECS Task - Task definition: arn:aws:ecs:us-west-2:123456789012:task-definition/mwaa-ecs-test- task:1 - on cluster mwaa-ecs-test-cluster [2022-01-01, 12:00:00 UTC] {{ecs-operator-test.py:302}} INFO - ECSOperator overrides: {'containerOverrides': [{'name': 'mwaa-ecs-test-container', 'command': ['ls', '-l', '/']}]} Code sample 361 Amazon Managed Workflows for Apache Airflow User Guide . . . 
[2022-01-01, 12:00:00 UTC] {{ecs.py:379}} INFO - ECS task ID is: e012340b5e1b43c6a757cf012c635935 [2022-01-01, 12:00:00 UTC] {{ecs.py:313}} INFO - Starting ECS Task Log Fetcher [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] total 52 [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] lrwxrwxrwx 1 root root 7 Jun 13 18:51 bin -> usr/bin [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] dr-xr- xr-x 2 root root 4096 Apr 9 2019 boot [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 5 root root 340 Jul 19 17:54 dev [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 1 root root 4096 Jul 19 17:54 etc [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 2 root root 4096 Apr 9 2019 home [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] lrwxrwxrwx 1 root root 7 Jun 13 18:51 lib -> usr/lib [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] lrwxrwxrwx 1 root root 9 Jun 13 18:51 lib64 -> usr/lib64 [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 2 root root 4096 Jun 13 18:51 local [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 2 root root 4096 Apr 9 2019 media [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 2 root root 4096 Apr 9 2019 mnt [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 2 root root 4096 Apr 9 2019 opt [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] dr-xr- xr-x 103 root root 0 Jul 19 17:54 proc [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] dr-xr- x-\-\- 2 root root 4096 Apr 9 2019 root [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 2 root root 4096 Jun 13 18:52 run [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] lrwxrwxrwx 1 root root 8 Jun 13 18:51 sbin -> usr/sbin [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 2 root root 4096 Apr 9 2019 srv [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] dr-xr- xr-x 13 root root 0 Jul 19 17:54 sys [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxrwxrwt 2 root root 4096 Jun 13 18:51 tmp Code sample 362 Amazon Managed Workflows for Apache Airflow User Guide [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 13 root root 4096 Jun 13 18:51 usr [2022-01-01, 12:00:00 UTC] {{ecs.py:119}} INFO - [2022-07-19, 17:54:03 UTC] drwxr- xr-x 18 root root 4096 Jun 13 18:52 var . . . [2022-01-01, 12:00:00 UTC] {{ecs.py:328}} INFO - ECS Task has been successfully executed Using dbt with Amazon MWAA This topic demonstrates how you can use dbt and Postgres with Amazon MWAA. In the following steps, you'll add the required dependencies to your requirements.txt, and upload a sample dbt project to your environment's Amazon S3 bucket. Then, you'll use a sample DAG to verify that Amazon MWAA has installed the dependencies, and finally use the BashOperator to run the dbt project. Topics • Version • Prerequisites • Dependencies • Upload a dbt project to Amazon S3 • Use a DAG to verify dbt dependency installation • Use a DAG to run a dbt |
amazon-mwaa-user-guide-105 | amazon-mwaa-user-guide.pdf | 105 | executed Using dbt with Amazon MWAA This topic demonstrates how you can use dbt and Postgres with Amazon MWAA. In the following steps, you'll add the required dependencies to your requirements.txt, and upload a sample dbt project to your environment's Amazon S3 bucket. Then, you'll use a sample DAG to verify that Amazon MWAA has installed the dependencies, and finally use the BashOperator to run the dbt project. Topics • Version • Prerequisites • Dependencies • Upload a dbt project to Amazon S3 • Use a DAG to verify dbt dependency installation • Use a DAG to run a dbt project Version • You can use the code example on this page with Apache Airflow v2 in Python 3.10. Prerequisites Before you can complete the following steps, you'll need the following: Using dbt with Amazon MWAA 363 Amazon Managed Workflows for Apache Airflow User Guide • An Amazon MWAA environment using Apache Airflow v2.2.2. This sample was written, and tested with v2.2.2. You might need to modify the sample to use with other Apache Airflow versions. • A sample dbt project. To get started using dbt with Amazon MWAA, you can create a fork and clone the dbt starter project from the dbt-labs GitHub repository. Dependencies To use Amazon MWAA with dbt, add the following startup script to your environment. To learn more, see Using a startup script with Amazon MWAA. #!/bin/bash if [[ "${MWAA_AIRFLOW_COMPONENT}" != "worker" ]] then exit 0 fi echo "------------------------------" echo "Installing virtual Python env" echo "------------------------------" pip3 install --upgrade pip echo "Current Python version:" python3 --version echo "..." sudo pip3 install --user virtualenv sudo mkdir python3-virtualenv cd python3-virtualenv sudo python3 -m venv dbt-env sudo chmod -R 777 * echo "------------------------------" echo "Activating venv in" $DBT_ENV_PATH echo "------------------------------" source dbt-env/bin/activate pip3 list Dependencies 364 Amazon Managed Workflows for Apache Airflow User Guide echo "------------------------------" echo "Installing libraries..." echo "------------------------------" # do not use sudo, as it will install outside the venv pip3 install dbt-redshift==1.6.1 dbt-postgres==1.6.1 echo "------------------------------" echo "Venv libraries..." echo "------------------------------" pip3 list dbt --version echo "------------------------------" echo "Deactivating venv..." echo "------------------------------" deactivate In the following sections, you'll upload your dbt project directory to Amazon S3 and run a DAG that validates whether Amazon MWAA has successfully installed the required dbt dependencies. Upload a dbt project to Amazon S3 To be able to use a dbt project with your Amazon MWAA environment, you can upload the entire project directory to your environment's dags folder. When the environment updates, Amazon MWAA downloads the dbt directory to the local usr/local/airflow/dags/ folder. To upload a dbt project to Amazon S3 1. Navigate to the directory where you cloned the dbt starter project. 2. Run the following Amazon S3 AWS CLI command to recursively copy the content of the project to your environment's dags folder using the --recursive parameter. The command creates a sub-directory called dbt that you can use for all of your dbt projects. If the sub-directory already exists, the project files are copied into the existing directory, and a new directory is not created. 
The command also creates a sub-directory within the dbt directory for this specific starter project. $ aws s3 cp dbt-starter-project s3://mwaa-bucket/dags/dbt/dbt-starter-project --recursive You can use different names for project sub-directories to organize multiple dbt projects within the parent dbt directory. Use a DAG to verify dbt dependency installation The following DAG uses a BashOperator and a bash command to verify whether Amazon MWAA has successfully installed the dbt dependencies specified in requirements.txt. from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago with DAG(dag_id="dbt-installation-test", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: cli_command = BashOperator( task_id="bash_command", bash_command="/usr/local/airflow/python3-virtualenv/dbt-env/bin/dbt --version" ) Do the following to view task logs and verify that dbt and its dependencies have been installed. 1. Navigate to the Amazon MWAA console, then choose Open Airflow UI from the list of available environments. 2. On the Apache Airflow UI, find the dbt-installation-test DAG from the list, then choose the date under the Last Run column to open the last successful task. 3. Using Graph View, choose the bash_command task to open the task instance
amazon-mwaa-user-guide-106 | amazon-mwaa-user-guide.pdf | 106 | details. 4. Choose Log to open the task logs, then verify that the logs successfully list the dbt version we specified in requirements.txt. Use a DAG to run a dbt project The following DAG uses a BashOperator to copy the dbt projects you uploaded to Amazon S3 from the local usr/local/airflow/dags/ directory to the write-accessible /tmp directory, Use a DAG to verify dbt dependency installation 366 Amazon Managed Workflows for Apache Airflow User Guide then runs the dbt project. The bash commands assume a starter dbt project titled dbt-starter- project. Modify the directory name according to the name of your project directory. from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago import os DAG_ID = os.path.basename(__file__).replace(".py", "") # assumes all files are in a subfolder of DAGs called dbt with DAG(dag_id=DAG_ID, schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: cli_command = BashOperator( task_id="bash_command", bash_command="source /usr/local/airflow/python3-virtualenv/dbt-env/bin/ activate;\ cp -R /usr/local/airflow/dags/dbt /tmp;\ echo 'listing project files:';\ ls -R /tmp;\ cd /tmp/dbt/mwaa_dbt_test_project;\ /usr/local/airflow/python3-virtualenv/dbt-env/bin/dbt run --project-dir /tmp/dbt/ mwaa_dbt_test_project --profiles-dir ..;\ cat /tmp/dbt_logs/dbt.log;\ rm -rf /tmp/dbt/mwaa_dbt_test_project" ) AWS blogs and tutorials • Working with Amazon EKS and Amazon MWAA for Apache Airflow v2.x AWS blogs and tutorials 367 Amazon Managed Workflows for Apache Airflow User Guide Best practices for Amazon Managed Workflows for Apache Airflow This guide describes the best practices we recommend when using Amazon Managed Workflows for Apache Airflow. Topics • Performance tuning for Apache Airflow on Amazon MWAA • Managing Python dependencies in requirements.txt Performance tuning for Apache Airflow on Amazon MWAA This topic describes how to tune the performance of an Amazon Managed Workflows for Apache Airflow environment using Using Apache Airflow configuration options on Amazon MWAA. Contents • Adding an Apache Airflow configuration option • Apache Airflow scheduler • Parameters • Limits • DAG folders • Parameters • DAG files • Parameters • Tasks • Parameters Adding an Apache Airflow configuration option The following procedure walks you through the steps of adding an Airflow configuration option to your environment. 1. Open the Environments page on the Amazon MWAA console. Performance tuning for Apache Airflow 368 Amazon Managed Workflows for Apache Airflow 2. Choose an environment. 3. Choose Edit. 4. Choose Next. User Guide 5. Choose Add custom configuration in the Airflow configuration options pane. 6. Choose a configuration from the dropdown list and enter a value, or type a custom configuration and enter a value. 7. Choose Add custom configuration for each configuration you want to add. 8. Choose Save. To learn more, see Using Apache Airflow configuration options on Amazon MWAA. Apache Airflow scheduler The Apache Airflow scheduler is a core component of Apache Airflow. An issue with the scheduler can prevent DAGs from being parsed and tasks from being scheduled. For more information about Apache Airflow scheduler tuning, see Fine-tuning your scheduler performance in the Apache Airflow documentation website. Parameters This section describes the configuration options available for the Apache Airflow scheduler and their use cases. 
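You can also apply the same configuration options programmatically instead of through the console. The following AWS CLI sketch is illustrative only: the environment name and the option value are placeholders, and you should include every custom option you want the environment to keep in the same map.

# Hypothetical environment name and value; replace both with your own.
# Include all custom configuration options you want to keep in this map.
aws mwaa update-environment \
    --name MyAirflowEnvironment \
    --airflow-configuration-options '{"scheduler.min_file_process_interval": "60"}'

The entries that follow describe the scheduler options you might set this way.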
Apache Airflow v2 Version v2 Configuration option Default Description Use case celery.sy nc_parallelism 1 The number of processes the Celery Executor uses to sync task state. You can use this option to prevent queue conflicts by limiting the processes the Celery Executor uses. By default, Apache Airflow scheduler 369 Amazon Managed Workflows for Apache Airflow User Guide Version Configuration option Default Description Use case a value is set to 1 to prevent errors in delivering task logs to CloudWatch Logs. Setting the value to 0 means using the maximum number of processes, but might cause errors when delivering task logs. Apache Airflow scheduler 370 Amazon Managed Workflows for Apache Airflow User Guide Version v2 Configuration option Default Description Use case scheduler .idle_sleep_time 1 The number of seconds to You can use this option to wait between free up CPU consecuti ve DAG file processing in the Scheduler "loop." usage on the Scheduler by increasing the time the Scheduler sleeps after it's finished retrieving DAG parsing results, finding and queuing tasks, and executing queued tasks in the Executor. Increasing this value consumes the number of Scheduler threads run on an environment in scheduler .parsing_ processes for Apache Airflow v2 and scheduler .max_thre ads for Apache Airflow v1. This may Apache Airflow scheduler 371 Amazon Managed Workflows for Apache Airflow User Guide Version Configuration option Default Description Use case v2 10 scheduler .max_dagr uns_to_cr eate_per_loop reduce the capacity of the Schedulers to parse DAGs, and increase the time it takes for DAGs to appear in the Web server. The maximum number of You can use this option to free DAGs to create up resources DagRuns for for schedulin per Scheduler g tasks by "loop." decreasing the maximum number of DagRuns for the Scheduler "loop." Apache Airflow scheduler 372 Amazon Managed Workflows for Apache Airflow User Guide Version v2 Configuration option scheduler .parsing_ processes Default |
amazon-mwaa-user-guide-107 | amazon-mwaa-user-guide.pdf | 107 | Amazon Managed Workflows for Apache Airflow User Guide Version Configuration option Default Description Use case v2 10 scheduler .max_dagr uns_to_cr eate_per_loop reduce the capacity of the Schedulers to parse DAGs, and increase the time it takes for DAGs to appear in the Web server. The maximum number of You can use this option to free DAGs to create up resources DagRuns for for schedulin per Scheduler g tasks by "loop." decreasing the maximum number of DagRuns for the Scheduler "loop." Apache Airflow scheduler 372 Amazon Managed Workflows for Apache Airflow User Guide Version v2 Configuration option scheduler .parsing_ processes Default Description Use case Set using the following The number of threads the You can use this option to free formula: (2 * Scheduler can number of vCPUs) - 1 by default. run in parallel to schedule DAGs. up resources by decreasin g the number of processes the Scheduler runs in parallel to parse DAGs. We recommend keeping this number low if DAG parsing is impacting task scheduling. You must specify a value that's less than the vCPU count on your environment. To learn more, see Limits. Limits This section describes the limits you should consider when adjusting the default parameters for the scheduler. scheduler.parsing_processes, scheduler.max_threads Two threads are allowed per vCPU for an environment class. At least one thread must be reserved for the scheduler for an environment class. If you notice a delay in tasks being scheduled, you may need to increase your environment class. For example, a large environment has a 4 vCPU Fargate container instance for its scheduler. This means that a maximum of 7 total Apache Airflow scheduler 373 Amazon Managed Workflows for Apache Airflow User Guide threads are available to use for other processes. That is, two threads multiplied four vCPUs, minus one for the scheduler itself. The value you specify in scheduler.max_threads and scheduler.parsing_processes must not exceed the number of threads available for an environment class (as shown, below: • mw1.small – Must not exceed 1 thread for other processes. The remaining thread is reserved for the Scheduler. • mw1.medium – Must not exceed 3 threads for other processes. The remaining thread is reserved for the Scheduler. • mw1.large – Must not exceed 7 threads for other processes. The remaining thread is reserved for the Scheduler. DAG folders The Apache Airflow Scheduler continuously scans the DAGs folder on your environment. Any contained plugins.zip files, or Python (.py) files containing “airflow” import statements. Any resulting Python DAG objects are then placed into a DagBag for that file to be processed by the Scheduler to determine what, if any, tasks need to be scheduled. Dag file parsing occurs regardless of whether the files contain any viable DAG objects. Parameters This section describes the configuration options available for the DAGs folder and their use cases. Apache Airflow v2 Version v2 Configuration option scheduler .dag_dir_ list_interval Default Description Use case 300 seconds The number of seconds the DAGs folder should be scanned for new files. You can use this option to free up resources by increasin g the number of seconds to parse the DAGs folder. 
We recommend increasing this value if you're seeing long parsing times in total_parse_time metrics, which may be due to a large number of files in your DAGs folder. scheduler.min_file_process_interval – Default: 30 seconds. Description: the number of seconds after which the scheduler parses a DAG and updates to the DAG are reflected. Use case: you can use this option to free up resources by increasing the number of seconds that the scheduler waits before parsing a DAG. For example, if you specify a value of 30, the DAG file is parsed after every 30 seconds. We recommend keeping this number high to decrease the CPU usage on your environment. DAG files As part of the Apache Airflow scheduler loop, individual DAG files are parsed to extract DAG Python objects. In Apache Airflow v2 and above, the scheduler parses, at most, as many DAG files as there are parsing processes
amazon-mwaa-user-guide-108 | amazon-mwaa-user-guide.pdf | 108 | at the same time. The number of seconds specified in scheduler.min_file_process_interval must pass before the same file is parsed again. DAG files 376 Amazon Managed Workflows for Apache Airflow User Guide Parameters This section describes the configuration options available for Apache Airflow DAG files and their use cases. Apache Airflow v2 Version v2 Configuration option core.dag_ file_proc essor_timeout Default Description Use case 50 seconds The number of seconds before You can use this option to free the DagFilePr ocessor times out processing a DAG file. up resources by increasin g the time it takes before the DagFilePr ocessor times out. We recommend increasing this value if you're seeing timeouts in your DAG processing logs that result in no viable DAGs being loaded. You can use this option to free up resources by increasin g the time it takes before the Scheduler times out while importing a v2 30 seconds core.dagb ag_import _timeout The number of seconds before importing a Python file times out. DAG files 377 Amazon Managed Workflows for Apache Airflow User Guide Version Configuration option Default Description Use case Python file to extract the DAG objects. This option is processed as part of the Scheduler "loop," and must contain a value lower than the value specified in core.dag_ file_proc essor_tim eout . DAG files 378 Amazon Managed Workflows for Apache Airflow User Guide Version v2 Configuration option core.min_ serialize d_dag_upd ate_interval Default Description Use case 30 The minimum number of seconds after which serialize d DAGs in the You can use this option to free up resources by increasing the number of database are seconds after updated. which serialize d DAGs in the database are updated. We recommend increasing this value if you have a large number of DAGs, or complex DAGs. Increasing this value reduces the load on the Scheduler and the database as DAGs are serialized. DAG files 379 Amazon Managed Workflows for Apache Airflow User Guide Version v2 Configuration option core.min_ serialize d_dag_fet ch_interval Default Description Use case 10 The number of seconds a You can use this option to free serialized DAG is re-fetched from the database when already up resources by increasin g the number of seconds a loaded in the serialized DAG is DagBag. re-fetched. The value must be higher than the value specified in core.min_ serialize d_dag_upd ate_inter val to reduce database "write" rates. Increasing this value reduces the load on the Web server and the database as DAGs are serialized. Tasks The Apache Airflow scheduler and workers are both involved in queuing and de-queuing tasks. The scheduler takes parsed tasks ready to schedule from a None status to a Scheduled status. The executor, also running on the scheduler container in Fargate, queues those tasks and sets their status to Queued. When the workers have capacity, it takes the task from the queue and sets the Tasks 380 Amazon Managed Workflows for Apache Airflow User Guide status to Running, which subsequently changes its status to Success or Failed based on whether the task succeeds or fails. Parameters This section describes the configuration options available for Apache Airflow tasks and their use cases. The default configuration options that Amazon MWAA overrides are marked in red. 
Apache Airflow v2 core.parallelism – Default: dynamically set based on (maxWorkers * maxCeleryWorkers) / schedulers * 1.5. Description: the maximum number of task instances that can have a status of "Running." Use case: you can use this option to free up resources by increasing the number of task instances that can run simultaneously. The value specified should be the number of available Workers "times" the Workers task density. We recommend changing this value only when you're seeing a large number of tasks stuck in the "Running" or "Queued" state. core.dag_concurrency – Default: 10000. Description: the number of task instances allowed to run concurrently for each DAG. Use case: you can use this option to free up resources by increasing the number of task instances allowed to run concurrently. For example, if you have one hundred DAGs with ten parallel tasks, and you want all DAGs to run concurren
amazon-mwaa-user-guide-109 | amazon-mwaa-user-guide.pdf | 109 | tly, you can calculate the maximum parallelism as the number of available Workers "times" the Workers task density in celery.wo rker_conc urrency , divided by the number of DAGs (e.g. 100). Tasks 382 Amazon Managed Workflows for Apache Airflow User Guide Default Description Use case Version v2 Configuration option core.exec ute_tasks _new_pyth on_interpreter True Determine s whether Apache Airflow When set to True, Apache Airflow executes tasks recognizes by forking the changes you parent process, make to your or by creating a new Python process. plugins as a new Python process so created to execute tasks. Amazon MWAA overrides the Airflow base Any value specified for this install for this option is option to scale ignored. Workers as part of its autoscali ng component. v2 N/A celery.wo rker_conc urrency Tasks 383 Amazon Managed Workflows for Apache Airflow User Guide Version v2 Configuration option Default Description Use case celery.wo rker_autoscale mw1.micro - 3,0 The task concurrency for You can use this option to free Workers. mw1.small - 5,0 mw1.medium - 10,0 mw1.large - 20,0 mw1.xlarge - 40,0 mw1.2xlarge - 80,0 up resources by reducing the maximum, minimum task concurrency of Workers. Workers accept up to the maximum concurrent tasks configure d, regardless of whether there are sufficien t resources to do so. If tasks are scheduled without sufficient resources , the tasks immediate ly fail. We recommend changing this value for resource- intensive tasks by reducing the values to be less than Tasks 384 Amazon Managed Workflows for Apache Airflow User Guide Version Configuration option Default Description Use case the defaults to allow more capacity per task. Managing Python dependencies in requirements.txt This topic describes how to install and manage Python dependencies in a requirements.txt file for an Amazon Managed Workflows for Apache Airflow environment. Contents • Testing DAGs using the Amazon MWAA CLI utility • Installing Python dependencies using PyPi.org Requirements File Format • Option one: Python dependencies from the Python Package Index • Option two: Python wheels (.whl) • Using the plugins.zip file on an Amazon S3 bucket • Using a WHL file hosted on a URL • Creating a WHL files from a DAG • Option three: Python dependencies hosted on a private PyPi/PEP-503 Compliant Repo • Enabling logs on the Amazon MWAA console • Viewing logs on the CloudWatch Logs console • Viewing errors in the Apache Airflow UI • Logging into Apache Airflow • Example requirements.txt scenarios Testing DAGs using the Amazon MWAA CLI utility • The command line interface (CLI) utility replicates an Amazon Managed Workflows for Apache Airflow environment locally. Managing Python dependencies 385 Amazon Managed Workflows for Apache Airflow User Guide • The CLI builds a Docker container image locally that’s similar to an Amazon MWAA production image. This allows you to run a local Apache Airflow environment to develop and test DAGs, custom plugins, and dependencies before deploying to Amazon MWAA. • To run the CLI, see the aws-mwaa-local-runner on GitHub. Installing Python dependencies using PyPi.org Requirements File Format The following section describes the different ways to install Python dependencies according to the PyPi.org Requirements File Format. 
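Whichever of the following options you choose, it helps to validate a candidate requirements.txt with the local runner described above before deploying it to your environment. A minimal sketch, assuming the aws-mwaa-local-runner repository layout; the helper script commands and folder names can differ between versions of the local runner, so check its README.

# Build the local Amazon MWAA image (one-time setup).
git clone https://github.com/aws/aws-mwaa-local-runner.git
cd aws-mwaa-local-runner
./mwaa-local-env build-image

# Copy your candidate file over the runner's requirements file
# (for example, requirements/requirements.txt), then test the install.
cp /path/to/requirements.txt requirements/requirements.txt
./mwaa-local-env test-requirements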
Option one: Python dependencies from the Python Package Index The following section describes how to specify Python dependencies from the Python Package Index in a requirements.txt file. Apache Airflow v2 1. Test locally. Add additional libraries iteratively to find the right combination of packages and their versions, before creating a requirements.txt file. To run the Amazon MWAA CLI utility, see the aws-mwaa-local-runner on GitHub. 2. Review the Apache Airflow package extras. To view a list of the packages installed for Apache Airflow v2 on Amazon MWAA, see Amazon MWAA local runner requirements.txt on the GitHub website. 3. Add a constraints statement. Add the constraints file for your Apache Airflow v2 environment at the top of your requirements.txt file. Apache Airflow constraints files specify the provider versions available at the time of an Apache Airflow release. Beginning with Apache Airflow v2.7.2, your requirements file must include a --constraint statement. If you do not provide a constraint, Amazon MWAA will specify one for you to ensure the packages listed in your requirements are compatible with the version of Apache Airflow you are using. In the following
amazon-mwaa-user-guide-110 | amazon-mwaa-user-guide.pdf | 110 | example, replace {environment-version} with your environment's version number, and {Python-version} with the version of Python that's compatible with your environment. Installing Python dependencies using PyPi.org Requirements File Format 386 Amazon Managed Workflows for Apache Airflow User Guide For information on the version of Python compatible with your Apache Airflow environment, see Apache Airflow Versions. --constraint "https://raw.githubusercontent.com/apache/airflow/ constraints-{Airflow-version}/constraints-{Python-version}.txt" If the constraints file determines that xyz==1.0 package is not compatible with other packages in your environment, pip3 install will fail in order to prevent incompatible libraries from being installed to your environment. If installation fails for any packages, you can view error logs for each Apache Airflow component (the scheduler, worker, and web server) in the corresponding log stream on CloudWatch Logs. For more information on log types, see the section called “Viewing Airflow logs”. 4. Apache Airflow packages. Add the package extras and the version (==). This helps to prevent packages of the same name, but different version, from being installed on your environment. apache-airflow[package-extra]==2.5.1 5. Python libraries. Add the package name and the version (==) in your requirements.txt file. This helps to prevent a future breaking update from PyPi.org from being automatically applied. library == version Example Boto3 and psycopg2-binary This example is provided for demonstration purposes. The boto and psycopg2-binary libraries are included with the Apache Airflow v2 base install and don't need to be specified in a requirements.txt file. boto3==1.17.54 boto==2.49.0 botocore==1.20.54 psycopg2-binary==2.8.6 Installing Python dependencies using PyPi.org Requirements File Format 387 Amazon Managed Workflows for Apache Airflow User Guide If a package is specified without a version, Amazon MWAA installs the latest version of the package from PyPi.org. This version may conflict with other packages in your requirements.txt. Apache Airflow v1 1. Test locally. Add additional libraries iteratively to find the right combination of packages and their versions, before creating a requirements.txt file. To run the Amazon MWAA CLI utility, see the aws-mwaa-local-runner on GitHub. 2. Review the Airflow package extras. Review the list of packages available for Apache Airflow v1.10.12 at https://raw.githubusercontent.com/apache/airflow/ constraints-1.10.12/constraints-3.7.txt. 3. Add the constraints file. Add the constraints file for Apache Airflow v1.10.12 to the top of your requirements.txt file. If the constraints file determines that xyz==1.0 package is not compatible with other packages on your environment, the pip3 install will fail to prevent incompatible libraries from being installed to your environment. --constraint "https://raw.githubusercontent.com/apache/airflow/ constraints-1.10.12/constraints-3.7.txt" 4. Apache Airflow v1.10.12 packages. Add the Airflow package extras and the Apache Airflow v1.10.12 version (==). This helps to prevent packages of the same name, but different version, from being installed on your environment. apache-airflow[package]==1.10.12 Example Secure Shell (SSH) The following example requirements.txt file installs SSH for Apache Airflow v1.10.12. apache-airflow[ssh]==1.10.12 5. Python libraries. 
Add the package name and the version (==) in your requirements.txt file. This helps to prevent a future breaking update from PyPi.org from being automatically applied. library == version Installing Python dependencies using PyPi.org Requirements File Format 388 Amazon Managed Workflows for Apache Airflow User Guide Example Boto3 The following example requirements.txt file installs the Boto3 library for Apache Airflow v1.10.12. boto3 == 1.17.4 If a package is specified without a version, Amazon MWAA installs the latest version of the package from PyPi.org. This version may conflict with other packages in your requirements.txt. Option two: Python wheels (.whl) A Python wheel is a package format designed to ship libraries with compiled artifacts. There are several benefits to wheel packages as a method to install dependencies in Amazon MWAA: • Faster installation – the WHL files are copied to the container as a single ZIP, and then installed locally, without having to download each one. • Fewer conflicts – You can determine version compatibility for your packages in advance. As a result, there is no need for pip to recursively work out compatible versions. • More resilience – With externally hosted libraries, downstream requirements can change, resulting in version incompatibility between containers on a Amazon MWAA environment. By not depending on an external source for dependencies, every container on has have the same libraries regardless of when the each container is instantiated. We recommend the following methods to install Python dependencies from a Python wheel archive (.whl) in your requirements.txt. Methods • Using the plugins.zip file on an Amazon S3 bucket • Using a WHL file hosted on a URL • Creating a WHL files from a DAG Installing Python dependencies using PyPi.org Requirements File Format 389 Amazon Managed Workflows for Apache Airflow User Guide Using the plugins.zip file on an Amazon S3 bucket The Apache Airflow scheduler, workers, and web server (for Apache Airflow v2.2.2 and later) look for custom plugins during startup on the AWS-managed Fargate container for your environment at /usr/local/airflow/plugins/*. This process begins prior to Amazon MWAA's pip3 install -r requirements.txt for Python dependencies and Apache Airflow service startup. A |
amazon-mwaa-user-guide-111 | amazon-mwaa-user-guide.pdf | 111 | plugins.zip file on an Amazon S3 bucket • Using a WHL file hosted on a URL • Creating a WHL files from a DAG Installing Python dependencies using PyPi.org Requirements File Format 389 Amazon Managed Workflows for Apache Airflow User Guide Using the plugins.zip file on an Amazon S3 bucket The Apache Airflow scheduler, workers, and web server (for Apache Airflow v2.2.2 and later) look for custom plugins during startup on the AWS-managed Fargate container for your environment at /usr/local/airflow/plugins/*. This process begins prior to Amazon MWAA's pip3 install -r requirements.txt for Python dependencies and Apache Airflow service startup. A plugins.zip file be used for any files that you don't want continuously changed during environment execution, or that you may not want to grant access to users that write DAGs. For example, Python library wheel files, certificate PEM files, and configuration YAML files. The following section describes how to install a wheel that's in the plugins.zip file on your Amazon S3 bucket. 1. Download the necessary WHL files You can use pip download with your existing requirements.txt on the Amazon MWAA local-runner or another Amazon Linux 2 container to resolve and download the necessary Python wheel files. $ pip3 download -r "$AIRFLOW_HOME/dags/requirements.txt" -d "$AIRFLOW_HOME/plugins" $ cd "$AIRFLOW_HOME/plugins" $ zip "$AIRFLOW_HOME/plugins.zip" * 2. Specify the path in your requirements.txt. Specify the plugins directory at the top of your requirements.txt using --find-links and instruct pip not to install from other sources using --no-index, as shown in the following --find-links /usr/local/airflow/plugins --no-index Example wheel in requirements.txt The following example assumes you've uploaded the wheel in a plugins.zip file at the root of your Amazon S3 bucket. For example: --find-links /usr/local/airflow/plugins --no-index numpy Installing Python dependencies using PyPi.org Requirements File Format 390 Amazon Managed Workflows for Apache Airflow User Guide Amazon MWAA fetches the numpy-1.20.1-cp37-cp37m-manylinux1_x86_64.whl wheel from the plugins folder and installs it on your environment. Using a WHL file hosted on a URL The following section describes how to install a wheel that's hosted on a URL. The URL must either be publicly accessible, or accessible from within the custom Amazon VPC you specified for your Amazon MWAA environment. • Provide a URL. Provide the URL to a wheel in your requirements.txt. Example wheel archive on a public URL The following example downloads a wheel from a public site. --find-links https://files.pythonhosted.org/packages/ --no-index Amazon MWAA fetches the wheel from the URL you specified and installs them on your environment. Note URLs are not accessible from private web servers installing requirements in Amazon MWAA v2.2.2 and later. 
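As an alternative to --find-links, pip also accepts a direct link to a single .whl file as a requirement line. The host name, package name, and version below are placeholders for illustration; the URL must be reachable from your environment, and the note above about private web servers still applies.

# Hypothetical wheel hosted on a server reachable from your environment's VPC.
https://my-internal-host.example.com/wheels/my_private_package-1.2.3-py3-none-any.whl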
Creating a WHL files from a DAG If you have a private web server using Apache Airflow v2.2.2 or later and you're unable to install requirements because your environment does not have access to external repositories, you can use the following DAG to take your existing Amazon MWAA requirements and package them on Amazon S3: from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago S3_BUCKET = 'my-s3-bucket' Installing Python dependencies using PyPi.org Requirements File Format 391 Amazon Managed Workflows for Apache Airflow User Guide S3_KEY = 'backup/plugins_whl.zip' with DAG(dag_id="create_whl_file", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: cli_command = BashOperator( task_id="bash_command", bash_command=f"mkdir /tmp/whls;pip3 download -r /usr/local/airflow/ requirements/requirements.txt -d /tmp/whls;zip -j /tmp/plugins.zip /tmp/whls/*;aws s3 cp /tmp/plugins.zip s3://{S3_BUCKET}/{S3_KEY}" ) After running the DAG, use this new file as your Amazon MWAA plugins.zip, optionally, packaged with other plugins. Then, update your requirements.txt preceded by --find- links /usr/local/airflow/plugins and --no-index without adding --constraint. This method allows you to use the same libraries offline. Option three: Python dependencies hosted on a private PyPi/PEP-503 Compliant Repo The following section describes how to install an Apache Airflow extra that's hosted on a private URL with authentication. 1. Add your user name and password as Apache Airflow configuration options. For example: • foo.user : YOUR_USER_NAME • foo.pass : YOUR_PASSWORD 2. Create your requirements.txt file. Substitute the placeholders in the following example with your private URL, and the username and password you've added as Apache Airflow configuration options. For example: --index-url https://${AIRFLOW__FOO__USER}:${AIRFLOW__FOO__PASS}@my.privatepypi.com 3. Add any additional libraries to your requirements.txt file. For example: --index-url https://${AIRFLOW__FOO__USER}:${AIRFLOW__FOO__PASS}@my.privatepypi.com my-private-package==1.2.3 Installing Python dependencies using PyPi.org Requirements File Format 392 Amazon Managed Workflows for Apache Airflow User Guide Enabling logs on the Amazon MWAA console The execution role for your Amazon MWAA environment needs permission to send logs to CloudWatch Logs. To update the permissions of an execution role, see Amazon MWAA execution role. You can enable Apache Airflow logs at the INFO, WARNING, ERROR, or CRITICAL level. When you choose a log level, Amazon MWAA sends logs for that level and all higher levels of severity. For example, if you enable logs at the INFO level, Amazon MWAA sends INFO logs and WARNING, ERROR, and CRITICAL log levels to CloudWatch Logs. We recommend enabling Apache Airflow logs at the INFO level |
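You can add the foo.user and foo.pass options shown above on the Amazon MWAA console, or script the change. A minimal AWS CLI sketch; the environment name and credential values are placeholders, and the map you pass should also include any other custom configuration options you want to keep.

# Hypothetical environment name and credentials; replace with your own values.
aws mwaa update-environment \
    --name MyAirflowEnvironment \
    --airflow-configuration-options '{"foo.user": "YOUR_USER_NAME", "foo.pass": "YOUR_PASSWORD"}'

Amazon MWAA exposes each configuration option to the Apache Airflow components as an environment variable, which is why the requirements example above can reference ${AIRFLOW__FOO__USER} and ${AIRFLOW__FOO__PASS} at install time.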
amazon-mwaa-user-guide-112 | amazon-mwaa-user-guide.pdf | 112 | Amazon MWAA console The execution role for your Amazon MWAA environment needs permission to send logs to CloudWatch Logs. To update the permissions of an execution role, see Amazon MWAA execution role. You can enable Apache Airflow logs at the INFO, WARNING, ERROR, or CRITICAL level. When you choose a log level, Amazon MWAA sends logs for that level and all higher levels of severity. For example, if you enable logs at the INFO level, Amazon MWAA sends INFO logs and WARNING, ERROR, and CRITICAL log levels to CloudWatch Logs. We recommend enabling Apache Airflow logs at the INFO level for the Scheduler to view logs received for the requirements.txt. Viewing logs on the CloudWatch Logs console You can view Apache Airflow logs for the Scheduler scheduling your workflows and parsing your dags folder. The following steps describe how to open the log group for the Scheduler on the Amazon MWAA console, and view Apache Airflow logs on the CloudWatch Logs console. To view logs for a requirements.txt 1. Open the Environments page on the Amazon MWAA console. Enabling logs on the Amazon MWAA console 393 Amazon Managed Workflows for Apache Airflow 2. Choose an environment. User Guide 3. Choose the Airflow scheduler log group on the Monitoring pane. 4. Choose the requirements_install_ip log in Log streams. 5. You should see the list of packages that were installed on the environment at /usr/local/ airflow/.local/bin. For example: Collecting appdirs==1.4.4 (from -r /usr/local/airflow/.local/bin (line 1)) Downloading https://files.pythonhosted.org/ packages/3b/00/2344469e2084fb28kjdsfiuyweb47389789vxbmnbjhsdgf5463acd6cf5e3db69324/ appdirs-1.4.4-py2.py3-none-any.whl Collecting astroid==2.4.2 (from -r /usr/local/airflow/.local/bin (line 2)) 6. Review the list of packages and whether any of these encountered an error during installation. If something went wrong, you may see an error similar to the following: 2021-03-05T14:34:42.731-07:00 No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) Viewing errors in the Apache Airflow UI You may also want to check your Apache Airflow UI to identify whether an error may be related to another issue. The most common error you may encounter with Apache Airflow on Amazon MWAA is: Broken DAG: No module named x If you see this error in your Apache Airflow UI, you're likely missing a required dependency in your requirements.txt file. Logging into Apache Airflow You need Apache Airflow UI access policy: AmazonMWAAWebServerAccess permissions for your AWS account in AWS Identity and Access Management (IAM) to view your Apache Airflow UI. Viewing errors in the Apache Airflow UI 394 Amazon Managed Workflows for Apache Airflow To access your Apache Airflow UI User Guide 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Open Airflow UI. Example requirements.txt scenarios You can mix and match different formats in your requirements.txt. The following example uses a combination of the different ways to install extras. Example Extras on PyPi.org and a public URL You need to use the --index-url option when specifying packages from PyPi.org, in addition to packages on a public URL, such as custom PEP 503 compliant repo URLs. 
aws-batch == 0.6 phoenix-letter >= 0.3 --index-url http://dist.repoze.org/zope2/2.10/simple zopelib Example requirements.txt scenarios 395 Amazon Managed Workflows for Apache Airflow User Guide Monitoring and metrics for Amazon Managed Workflows for Apache Airflow Monitoring is an important part of maintaining the reliability, availability, and performance of Amazon Managed Workflows for Apache Airflow and your AWS solution. We recommend collecting monitoring data from all parts of your AWS solution so that you can more easily debug a multi- point failure if one occurs. This topic describes what resources AWS provides for monitoring your Amazon MWAA environment and responding to potential events. Note Apache Airflow metrics and logging are subject to standard Amazon CloudWatch pricing. For more information about monitoring Apache Airflow, see Logging & Monitoring in the Apache Airflow documentation website. Sections • Monitoring overview on Amazon MWAA • Viewing audit logs in AWS CloudTrail • Viewing Airflow logs in Amazon CloudWatch • Monitoring dashboards and alarms on Amazon MWAA • Apache Airflow v2 environment metrics in CloudWatch • Container, queue, and database metrics for Amazon MWAA Monitoring overview on Amazon MWAA This page describes the AWS services used to monitor an Amazon Managed Workflows for Apache Airflow environment. Contents • Amazon CloudWatch overview • AWS CloudTrail overview Overview 396 Amazon Managed Workflows for Apache Airflow User Guide Amazon CloudWatch overview CloudWatch is a metrics repository for AWS services that allows you to retrieve statistics based on the metrics and dimensions published by a service. You can use these metrics to configure alarms, calculate statistics and then present the data in a dashboard that helps you assess the health of your environment in the Amazon CloudWatch console. Apache Airflow is already set-up to send StatsD metrics for an Amazon Managed Workflows for Apache Airflow |
amazon-mwaa-user-guide-113 | amazon-mwaa-user-guide.pdf | 113 | Workflows for Apache Airflow environment. Contents • Amazon CloudWatch overview • AWS CloudTrail overview Overview 396 Amazon Managed Workflows for Apache Airflow User Guide Amazon CloudWatch overview CloudWatch is a metrics repository for AWS services that allows you to retrieve statistics based on the metrics and dimensions published by a service. You can use these metrics to configure alarms, calculate statistics and then present the data in a dashboard that helps you assess the health of your environment in the Amazon CloudWatch console. Apache Airflow is already set-up to send StatsD metrics for an Amazon Managed Workflows for Apache Airflow environment to Amazon CloudWatch. To learn more, see What is Amazon CloudWatch?. AWS CloudTrail overview CloudTrail is an auditing service that provides a record of actions taken by a user, role, or an AWS service in Amazon MWAA. Using the information collected by CloudTrail, you can determine the request that was made to Amazon MWAA, the IP address from which the request was made, who made the request, when it was made, and additional details available in audit logs. To learn more, see What is AWS CloudTrail?. Viewing audit logs in AWS CloudTrail AWS CloudTrail is enabled on your AWS account when you create it. CloudTrail logs the activity taken by an IAM entity or an AWS service, such as Amazon Managed Workflows for Apache Airflow, which is recorded as a CloudTrail event. You can view, search, and download the past 90 days of event history in the CloudTrail console. CloudTrail captures all events on the Amazon MWAA console and all calls to Amazon MWAA APIs. It doesn't capture read-only actions, such as GetEnvironment, or the PublishMetrics action. This page describes how to use CloudTrail to monitor events for Amazon MWAA. Contents • Creating a trail in CloudTrail • Viewing events with CloudTrail Event History • Example trail for CreateEnvironment • What's next? Amazon CloudWatch overview 397 Amazon Managed Workflows for Apache Airflow User Guide Creating a trail in CloudTrail You need to create a trail to view an ongoing record of events in your AWS account, including events for Amazon MWAA. A trail enables CloudTrail to deliver log files to an Amazon S3 bucket. If you do not create a trail, you can still view available event history in the CloudTrail console. For example, using the information collected by CloudTrail, you can determine the request that was made to Amazon MWAA, the IP address from which the request was made, who made the request, when it was made, and additional details. To learn more, see the Creating a trail for your AWS account. Viewing events with CloudTrail Event History You can troubleshoot operational and security incidents over the past 90 days in the CloudTrail console by viewing event history. For example, you can view events related to the creation, modification, or deletion of resources (such as IAM users or other AWS resources) in your AWS account on a per-region basis. To learn more, see the Viewing Events with CloudTrail Event History. 1. Open the CloudTrail console. 2. Choose Event history. 3. Select the events you want to view, and then choose Compare event details. Example trail for CreateEnvironment A trail is a configuration that enables delivery of events as log files to an Amazon S3 bucket that you specify. CloudTrail log files contain one or more log entries. 
An event represents a single request from any source and includes information about the requested action, such as the date and time of the action, or request parameters. CloudTrail log files are not an ordered stack trace of the public API calls, and don't appear in any specific order. The following example is a log entry for the CreateEnvironment action that is denied due to lacking permissions. The values in AirflowConfigurationOptions have been redacted for privacy. { "eventVersion": "1.05", "userIdentity": { "type": "AssumedRole", "principalId": "00123456ABC7DEF8HIJK", Creating a trail in CloudTrail 398 Amazon Managed Workflows for Apache Airflow User Guide "arn": "arn:aws:sts::012345678901:assumed-role/root/myuser", "accountId": "012345678901", "accessKeyId": "", "sessionContext": { "sessionIssuer": { "type": "Role", "principalId": "00123456ABC7DEF8HIJK", "arn": "arn:aws:iam::012345678901:role/user", "accountId": "012345678901", "userName": "user" }, "webIdFederationData": {}, "attributes": { "mfaAuthenticated": "false", "creationDate": "2020-10-07T15:51:52Z" } } }, "eventTime": "2020-10-07T15:52:58Z", "eventSource": "airflow.amazonaws.com", "eventName": "CreateEnvironment", "awsRegion": "us-west-2", "sourceIPAddress": "205.251.233.178", "userAgent": "PostmanRuntime/7.26.5", "errorCode": "AccessDenied", "requestParameters": { "SourceBucketArn": "arn:aws:s3:::my-bucket", "ExecutionRoleArn": "arn:aws:iam::012345678901:role/AirflowTaskRole", "AirflowConfigurationOptions": "***", "DagS3Path": "sample_dag.py", "NetworkConfiguration": { "SecurityGroupIds": [ "sg-01234567890123456" ], "SubnetIds": [ "subnet-01234567890123456", "subnet-65432112345665431" ] }, "Name": "test-cloudtrail" }, "responseElements": { "message": "Access denied." }, Example trail for CreateEnvironment 399 Amazon Managed Workflows for Apache Airflow User Guide "requestID": "RequestID", "eventID": "EventID", "readOnly": false, "eventType": "AwsApiCall", "recipientAccountId": "012345678901" } What's next? • Learn how to configure other AWS services for the event data collected in CloudTrail logs in CloudTrail Supported Services and Integrations. • Learn how to be notified when CloudTrail publishes new log files to an Amazon S3 bucket |
amazon-mwaa-user-guide-114 | amazon-mwaa-user-guide.pdf | 114 | "userAgent": "PostmanRuntime/7.26.5", "errorCode": "AccessDenied", "requestParameters": { "SourceBucketArn": "arn:aws:s3:::my-bucket", "ExecutionRoleArn": "arn:aws:iam::012345678901:role/AirflowTaskRole", "AirflowConfigurationOptions": "***", "DagS3Path": "sample_dag.py", "NetworkConfiguration": { "SecurityGroupIds": [ "sg-01234567890123456" ], "SubnetIds": [ "subnet-01234567890123456", "subnet-65432112345665431" ] }, "Name": "test-cloudtrail" }, "responseElements": { "message": "Access denied." }, Example trail for CreateEnvironment 399 Amazon Managed Workflows for Apache Airflow User Guide "requestID": "RequestID", "eventID": "EventID", "readOnly": false, "eventType": "AwsApiCall", "recipientAccountId": "012345678901" } What's next? • Learn how to configure other AWS services for the event data collected in CloudTrail logs in CloudTrail Supported Services and Integrations. • Learn how to be notified when CloudTrail publishes new log files to an Amazon S3 bucket in Configuring Amazon SNS Notifications for CloudTrail. Viewing Airflow logs in Amazon CloudWatch Amazon MWAA can send Apache Airflow logs to Amazon CloudWatch. You can view logs for multiple environments from a single location to easily identify Apache Airflow task delays or workflow errors without the need for additional third-party tools. Apache Airflow logs need to be enabled on the Amazon Managed Workflows for Apache Airflow console to view Apache Airflow DAG processing, tasks, Web server, Worker logs in CloudWatch. Contents • Pricing • Before you begin • Log types • Enabling Apache Airflow logs • Viewing Apache Airflow logs • Example scheduler logs • What's next? Pricing • Standard CloudWatch Logs charges apply. For more information, see CloudWatch pricing. What's next? 400 Amazon Managed Workflows for Apache Airflow User Guide Before you begin • You must have a role that can view logs in CloudWatch. For more information, see Accessing an Amazon MWAA environment. Log types Amazon MWAA creates a log group for each Airflow logging option you enable, and pushes the logs to the CloudWatch Logs groups associated with an environment. Log groups are named in the following format: YourEnvironmentName-LogType. For example, if your environment's named Airflow-v202-Public, Apache Airflow task logs are sent to Airflow-v202-Public-Task. Log type Description YourEnvironmentName- DAGProces sing The logs of the DAG processor manager (the part of the scheduler that processes DAG files). YourEnvironmentName- Scheduler The logs the Airflow scheduler generates. YourEnvironmentName- Task The task logs a DAG generates. YourEnvironmentName- WebServer The logs the Airflow web interface generates. YourEnvironmentName- Worker The logs generated as part of workflow and DAG execution. Enabling Apache Airflow logs You can enable Apache Airflow logs at the INFO, WARNING, ERROR, or CRITICAL level. When you choose a log level, Amazon MWAA sends logs for that level and all higher levels of severity. For example, if you enable logs at the INFO level, Amazon MWAA sends INFO logs and WARNING, ERROR, and CRITICAL log levels to CloudWatch Logs. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose Edit. 4. Choose Next. Before you begin 401 Amazon Managed Workflows for Apache Airflow User Guide 5. Choose one or more of the following logging options: a. Choose the Airflow scheduler log group on the Monitoring pane. b. 
Choose the Airflow web server log group on the Monitoring pane. c. Choose the Airflow worker log group on the Monitoring pane. d. Choose the Airflow DAG processing log group on the Monitoring pane. e. f. Choose the Airflow task log group on the Monitoring pane. Choose the logging level in Log level. 6. Choose Next. 7. Choose Save. Viewing Apache Airflow logs The following section describes how to view Apache Airflow logs in the CloudWatch console. 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose a log group in the Monitoring pane. 4. Choose a log in Log stream. Example scheduler logs You can view Apache Airflow logs for the Scheduler scheduling your workflows and parsing your dags folder. The following steps describe how to open the log group for the Scheduler on the Amazon MWAA console, and view Apache Airflow logs on the CloudWatch Logs console. To view logs for a requirements.txt 1. Open the Environments page on the Amazon MWAA console. 2. Choose an environment. 3. Choose the Airflow scheduler log group on the Monitoring pane. 4. Choose the requirements_install_ip log in Log streams. 5. You should see the list of packages that were installed on the environment at /usr/local/ airflow/.local/bin. For example: Viewing Apache Airflow logs 402 Amazon Managed Workflows for Apache Airflow User Guide Collecting appdirs==1.4.4 (from -r /usr/local/airflow/.local/bin (line 1)) Downloading https://files.pythonhosted.org/ packages/3b/00/2344469e2084fb28kjdsfiuyweb47389789vxbmnbjhsdgf5463acd6cf5e3db69324/ appdirs-1.4.4-py2.py3-none-any.whl Collecting astroid==2.4.2 (from -r /usr/local/airflow/.local/bin (line 2)) 6. Review the list of packages and whether any of these encountered an error during installation. If something went wrong, you may see an error similar to the following: 2021-03-05T14:34:42.731-07:00 No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) What's next? • Learn how to configure a CloudWatch alarm in Using Amazon CloudWatch alarms. • Learn how to create |
amazon-mwaa-user-guide-115 | amazon-mwaa-user-guide.pdf | 115 | Amazon Managed Workflows for Apache Airflow User Guide Collecting appdirs==1.4.4 (from -r /usr/local/airflow/.local/bin (line 1)) Downloading https://files.pythonhosted.org/ packages/3b/00/2344469e2084fb28kjdsfiuyweb47389789vxbmnbjhsdgf5463acd6cf5e3db69324/ appdirs-1.4.4-py2.py3-none-any.whl Collecting astroid==2.4.2 (from -r /usr/local/airflow/.local/bin (line 2)) 6. Review the list of packages and whether any of these encountered an error during installation. If something went wrong, you may see an error similar to the following: 2021-03-05T14:34:42.731-07:00 No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) No matching distribution found for LibraryName==1.0.0 (from -r /usr/local/ airflow/.local/bin (line 4)) What's next? • Learn how to configure a CloudWatch alarm in Using Amazon CloudWatch alarms. • Learn how to create a CloudWatch dashboard in Using CloudWatch dashboards. Monitoring dashboards and alarms on Amazon MWAA You can create a custom dashboard in Amazon CloudWatch and add alarms for a particular metric to monitor the health status of an Amazon Managed Workflows for Apache Airflow environment. When an alarm is on a dashboard, it turns red when it is in the ALARM state, making it easier for you to monitor the health of an Amazon MWAA environment proactively. Apache Airflow exposes metrics for a number of processes, including the number of DAG processes, DAG bag size, currently running tasks, task failures, and successes. When you create an environment, Airflow is configured to automatically send metrics for an Amazon MWAA environment to CloudWatch. This page describes how to create a health status dashboard for the Airflow metrics in CloudWatch for an Amazon MWAA environment. Contents • Metrics • Alarm states overview What's next? 403 Amazon Managed Workflows for Apache Airflow User Guide • Example custom dashboards and alarms • About these metrics • About the dashboard • Using AWS tutorials • Using AWS CloudFormation • Deleting metrics and dashboards • What's next? Metrics You can create a custom dashboard and alarm for any of the metrics available for your Apache Airflow version. Each metric corresponds to an Apache Airflow key performance indicator (KPI). To view a list of metrics, see: • Apache Airflow v2 environment metrics in CloudWatch Alarm states overview A metric alarm has the following possible states: • OK – The metric or expression is within the defined threshold. • ALARM – The metric or expression is outside of the defined threshold. • INSUFFICIENT_DATA – The alarm has just started, the metric is not available, or not enough data is available for the metric to determine the alarm state. Example custom dashboards and alarms You can build a custom monitoring dashboard that displays charts of selected metrics for your Amazon MWAA environment. About these metrics The following list describes each of the metrics created in the custom dashboard by the tutorial and template definitions in this section. Metrics 404 Amazon Managed Workflows for Apache Airflow User Guide • QueuedTasks - The number of tasks with queued state. Corresponds to the executor.queued_tasks Apache Airflow metric. • TasksPending - The number of tasks pending in executor. Corresponds to the scheduler.tasks.pending Apache Airflow metric. Note Does not apply to Apache Airflow v2.2 and above. • RunningTasks - The number of tasks running in executor. 
Corresponds to the executor.running_tasks Apache Airflow metric. • SchedulerHeartbeat - The number of check-ins Apache Airflow performs on the scheduler job. Corresponds to the scheduler_heartbeat Apache Airflow metrics. • TotalParseTime - The number of seconds taken to scan and import all DAG files once. Corresponds to the dag_processing.total_parse_time Apache Airflow metric. About the dashboard The following image shows the monitoring dashboard created by the tutorial and template definition in this section. Example custom dashboards and alarms 405 Amazon Managed Workflows for Apache Airflow User Guide Using AWS tutorials You can use the following AWS tutorial to automatically create a health status dashboard for any Amazon MWAA environments that are currently deployed. It also creates CloudWatch alarms for unhealthy workers and scheduler heartbeat failures across all Amazon MWAA environments. • CloudWatch Dashboard Automation for Amazon MWAA Example custom dashboards and alarms 406 Amazon Managed Workflows for Apache Airflow User Guide Using AWS CloudFormation You can use the AWS CloudFormation template definition in this section to create a monitoring dashboard in CloudWatch, then add alarms on the CloudWatch console to receive notifications when a metric surpasses a particular threshold. To create the stack using this template definition, see Creating a stack on the AWS CloudFormation console. To add an alarm to the dashboard, see Using alarms. AWSTemplateFormatVersion: "2010-09-09" Description: Creates MWAA Cloudwatch Dashboard Parameters: DashboardName: Description: Enter the name of the CloudWatch Dashboard Type: String EnvironmentName: Description: Enter the name of the MWAA Environment Type: String Resources: BasicDashboard: Type: AWS::CloudWatch::Dashboard Properties: DashboardName: !Ref DashboardName DashboardBody: Fn::Sub: '{ "widgets": [ { "type": "metric", "x": 0, "y": 0, "width": 12, "height": 6, "properties": { "view": "timeSeries", "stacked": true, "metrics": [ [ "AmazonMWAA", "QueuedTasks", "Function", "Executor", "Environment", "${EnvironmentName}" ] Example custom dashboards and alarms 407 |
amazon-mwaa-user-guide-116 | amazon-mwaa-user-guide.pdf | 116 | stack using this template definition, see Creating a stack on the AWS CloudFormation console. To add an alarm to the dashboard, see Using alarms. AWSTemplateFormatVersion: "2010-09-09" Description: Creates MWAA Cloudwatch Dashboard Parameters: DashboardName: Description: Enter the name of the CloudWatch Dashboard Type: String EnvironmentName: Description: Enter the name of the MWAA Environment Type: String Resources: BasicDashboard: Type: AWS::CloudWatch::Dashboard Properties: DashboardName: !Ref DashboardName DashboardBody: Fn::Sub: '{ "widgets": [ { "type": "metric", "x": 0, "y": 0, "width": 12, "height": 6, "properties": { "view": "timeSeries", "stacked": true, "metrics": [ [ "AmazonMWAA", "QueuedTasks", "Function", "Executor", "Environment", "${EnvironmentName}" ] Example custom dashboards and alarms 407 Amazon Managed Workflows for Apache Airflow ], "region": "${AWS::Region}", "title": "QueuedTasks ${EnvironmentName}", "period": 300 User Guide } }, { "type": "metric", "x": 0, "y": 6, "width": 12, "height": 6, "properties": { "view": "timeSeries", "stacked": true, "metrics": [ [ "AmazonMWAA", "RunningTasks", "Function", "Executor", "Environment", "${EnvironmentName}" ] ], "region": "${AWS::Region}", "title": "RunningTasks ${EnvironmentName}", "period": 300 } }, { "type": "metric", "x": 12, "y": 6, "width": 12, "height": 6, "properties": { "view": "timeSeries", "stacked": true, "metrics": [ [ "AmazonMWAA", "SchedulerHeartbeat", "Function", Example custom dashboards and alarms 408 Amazon Managed Workflows for Apache Airflow User Guide "Scheduler", "Environment", "${EnvironmentName}" ] ], "region": "${AWS::Region}", "title": "SchedulerHeartbeat ${EnvironmentName}", "period": 300 } }, { "type": "metric", "x": 12, "y": 0, "width": 12, "height": 6, "properties": { "view": "timeSeries", "stacked": true, "metrics": [ [ "AmazonMWAA", "TasksPending", "Function", "Scheduler", "Environment", "${EnvironmentName}" ] ], "region": "${AWS::Region}", "title": "TasksPending ${EnvironmentName}", "period": 300 } }, { "type": "metric", "x": 0, "y": 12, "width": 24, "height": 6, "properties": { "view": "timeSeries", "stacked": true, "region": "${AWS::Region}", Example custom dashboards and alarms 409 Amazon Managed Workflows for Apache Airflow User Guide "metrics": [ [ "AmazonMWAA", "TotalParseTime", "Function", "DAG Processing", "Environment", "${EnvironmentName}" ] ], "title": "TotalParseTime ${EnvironmentName}", "period": 300 } } ] }' Deleting metrics and dashboards If you delete an Amazon MWAA environment, the corresponding dashboard is also deleted. CloudWatch metrics are stored for fifteen (15) months and can not be deleted. The CloudWatch console limits the search of metrics to two (2) weeks after a metric is last ingested to ensure that the most up to date instances are shown for your Amazon MWAA environment. To learn more, see Amazon CloudWatch FAQs. What's next? • Learn how to create a DAG that queries the Amazon Aurora PostgreSQL metadata database for your environment and publishes custom metrics to CloudWatch in Using a DAG to write custom metrics in CloudWatch. Apache Airflow v2 environment metrics in CloudWatch Apache Airflow v2 is already set-up to collect and send StatsD metrics for an Amazon Managed Workflows for Apache Airflow environment to Amazon CloudWatch. The complete list of metrics Apache Airflow sends is available on the Metrics page in the Apache Airflow reference guide. 
This page describes the Apache Airflow metrics available in CloudWatch, and how to access metrics in the CloudWatch console. Contents Deleting metrics and dashboards 410 Amazon Managed Workflows for Apache Airflow User Guide • Terms • Dimensions • Accessing metrics in the CloudWatch console • Apache Airflow metrics available in CloudWatch • Apache Airflow Counters • Apache Airflow Gauges • Apache Airflow Timers • Choosing which metrics are reported • What's next? Terms Namespace A namespace is a container for the CloudWatch metrics of an AWS service. For Amazon MWAA, the namespace is AmazonMWAA. CloudWatch metrics A CloudWatch metric represents a time-ordered set of data points that are specific to CloudWatch. Apache Airflow metrics The Metrics specific to Apache Airflow. Dimension A dimension is a name/value pair that is part of the identity of a metric. Unit A statistic has a unit of measure. For Amazon MWAA, units include Count, Seconds, and Milliseconds. For Amazon MWAA, units are set based on the units in the original Airflow metrics. Dimensions This section describes the CloudWatch Dimensions grouping for Apache Airflow metrics in CloudWatch. Terms 411 Amazon Managed Workflows for Apache Airflow User Guide Dimension Description DAG DAG Filename Function Job Operator Pool Task HostName Indicates a specific Apache Airflow DAG name. Indicates a specific Apache Airflow DAG file name. This dimension is used to improve the grouping of metrics in CloudWatch. Indicates an Apache Airflow Job run by the Scheduler. Always has a value of Job. Indicates a specific Apache Airflow operator. Indicates a specific Apache Airflow worker pool. Indicates a specific Apache Airflow task. Indicates the hostname for a specific running Apache Airflow process. Accessing metrics in the CloudWatch console This section describes how to access performance metrics in CloudWatch for a specific DAG. Accessing metrics in the CloudWatch console 412 Amazon Managed Workflows for Apache Airflow User Guide To view performance metrics for a dimension 1. Open the Metrics page on the CloudWatch console. 2. Use the AWS Region selector to select your region. 3. Choose the AmazonMWAA namespace. 4. In the All metrics |
amazon-mwaa-user-guide-117 | amazon-mwaa-user-guide.pdf | 117 | Indicates a specific Apache Airflow operator. Indicates a specific Apache Airflow worker pool. Indicates a specific Apache Airflow task. Indicates the hostname for a specific running Apache Airflow process. Accessing metrics in the CloudWatch console This section describes how to access performance metrics in CloudWatch for a specific DAG. Accessing metrics in the CloudWatch console 412 Amazon Managed Workflows for Apache Airflow User Guide To view performance metrics for a dimension 1. Open the Metrics page on the CloudWatch console. 2. Use the AWS Region selector to select your region. 3. Choose the AmazonMWAA namespace. 4. In the All metrics tab, select a dimension. For example, DAG, Environment. 5. Choose a CloudWatch metric for a dimension. For example, TaskInstanceSuccesses or TaskInstanceDuration. Choose Graph all search results. 6. Choose the Graphed metrics tab to view performance statistics for Apache Airflow metrics, such as DAG, Environment, Task. Apache Airflow metrics available in CloudWatch This section describes the Apache Airflow metrics and dimensions sent to CloudWatch. Apache Airflow Counters The Apache Airflow metrics in this section contain data about Apache Airflow Counters. CloudWatch metric Apache Airflow metric Unit Dimension SLAMissed sla_missed Count Function, Scheduler Note Available for Apache Airflow v2.4.3 and above. FailedSLACallback Count Function, Scheduler sla_callb ack_notif ication_f ailure Apache Airflow metrics available in CloudWatch 413 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension Note Available for Apache Airflow v2.4.3 and above. Updates dataset.u pdates Count Function, Scheduler Note Available for Apache Airflow v2.6.3 and above. Orphaned Note Available for Apache Airflow v2.6.3 and above. FailedCeleryTaskExecution Note Available for Apache Airflow v2.4.3 and above. dataset.o rphaned Count Function, Scheduler Count celery.ex ecute_com mand.failure Function, Celery Apache Airflow metrics available in CloudWatch 414 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric FilePathQueueUpdateCount Note Available for Apache Airflow v2.6.3 and above. CriticalSectionBusy Apache Airflow metric dag_proce ssing.fil e_path_qu eue_updat e_count scheduler .critical _section_ busy Unit Dimension Count Function, Scheduler Count Function, Scheduler DagBagSize dagbag_size Count Function, DAG Processing DagCallbackExceptions FailedSLAEmailAttempts TaskInstanceFinished dag.callb ack_excep tions sla_email _notifica tion_failure ti.finish. {dag_id}. {task_id}. {state} Count DAG, All Count Count Function, Scheduler DAG, {dag_id} Task, {task_id} State, {state} Apache Airflow metrics available in CloudWatch 415 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric JobEnd JobHeartbeatFailure JobStart ManagerStalls OperatorFailures OperatorSuccesses OtherCallbackCount Note Available in Apache Airflow v2.6.3 and above. 
Unit Dimension Count Count Count Count Count Count Count Job, {job_name} Job, {job_name} Job, {job_name} Function, DAG Processing Operator, {operator _name} Operator, {operator _name} Function, Scheduler Apache Airflow metric {job_name }_end {job_name }_heartbe at_failure {job_name }_start dag_proce ssing.man ager_stalls operator_ failures_ {operator _name} operator_ successes _{operato r_name} dag_proce ssing.oth er_callba ck_count Apache Airflow metrics available in CloudWatch 416 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Processes SchedulerHeartbeat StartedTaskInstances SlaCallbackCount Unit Dimension Count Count Function, DAG Processing Function, Scheduler Count DAG, All Count Task, All Function, Scheduler Apache Airflow metric dag_proce ssing.pro cesses scheduler _heartbeat ti.start. {dag_id}. {task_id} dag_proce ssing.sla _callback _count Note Available for Apache Airflow v2.6.3 and above. TasksKilledExternally Count Function, Scheduler scheduler .tasks.ki lled_exte rnally Apache Airflow metrics available in CloudWatch 417 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric TaskTimeoutError TaskInstanceCreate dUsingOperator Apache Airflow metric celery.ta sk_timeou t_error task_inst ance_crea ted-{oper ator_name} Unit Dimension Count Count Function, Celery Operator, {operator _name} TaskInstancePreviouslySucce eded previousl y_succeeded Count DAG, All Task, All TaskInstanceFailures ti_failures Count DAG, All Task, All TaskInstanceSuccesses ti_successes Count DAG, All TaskRemovedFromDAG TaskRestoredToDAG Count Count task_remo ved_from_ dag.{dag_id} task_rest ored_to_dag. {dag_id} Task, All DAG, {dag_id} DAG, {dag_id} Apache Airflow metrics available in CloudWatch 418 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric TriggersSucceeded Apache Airflow metric triggers. succeeded Unit Dimension Count Function, Trigger Note Available for Apache Airflow v2.7.2 and above. TriggersFailed triggers. failed Count Function, Trigger Note Available for Apache Airflow v2.7.2 and above. TriggersBlockedMainThread Note Available for Apache Airflow v2.7.2 and above. triggers. blocked_m ain_thread Count Function, Trigger TriggerHeartbeat triggerer _heartbeat Count Function, Triggerer Note Available for Apache Airflow v2.8.1 and above. Apache Airflow metrics available in CloudWatch 419 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric TaskInstanceCreate dUsingOperator Apache Airflow metric airflow.t ask_insta nce_creat Unit Dimension Count Operator, {operator _name} ed_{operator _name} Note Available for Apache Airflow v2.7.2 and above. ZombiesKilled zombies_k illed Count DAG, All Task, All Apache Airflow Gauges The Apache Airflow metrics in this section contain data about Apache Airflow Gauges. CloudWatch metric DAGFileRe freshError ImportErrors Apache Airflow metric Unit Dimension dag_file_refresh_error Count dag_processing.imp ort_errors Count Function, DAG Processing Function, DAG Processing Apache Airflow metrics available in CloudWatch 420 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension Exception Failures smart_sensor_opera tor.exception_failures Count ExecutedTasks smart_sensor_opera tor.executed_tasks Count InfraFailures smart_sensor_opera tor.infra_failures Count LoadedTasks smart_sensor_opera tor.loaded_tasks Count TotalPars eTime Triggered DagRuns dag_processing.tot al_parse_time dataset.triggered_ dagruns Seconds |
amazon-mwaa-user-guide-118 | amazon-mwaa-user-guide.pdf | 118 | Available for Apache Airflow v2.7.2 and above. ZombiesKilled zombies_k illed Count DAG, All Task, All Apache Airflow Gauges The Apache Airflow metrics in this section contain data about Apache Airflow Gauges. CloudWatch metric DAGFileRe freshError ImportErrors Apache Airflow metric Unit Dimension dag_file_refresh_error Count dag_processing.imp ort_errors Count Function, DAG Processing Function, DAG Processing Apache Airflow metrics available in CloudWatch 420 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension Exception Failures smart_sensor_opera tor.exception_failures Count ExecutedTasks smart_sensor_opera tor.executed_tasks Count InfraFailures smart_sensor_opera tor.infra_failures Count LoadedTasks smart_sensor_opera tor.loaded_tasks Count TotalPars eTime Triggered DagRuns dag_processing.tot al_parse_time dataset.triggered_ dagruns Seconds Count Function, Smart Sensor Operator Function, Smart Sensor Operator Function, Smart Sensor Operator Function, Smart Sensor Operator Function, DAG Processing Function, Scheduler Note Available in Apache Airflow v2.6.3 and above. Apache Airflow metrics available in CloudWatch 421 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow metric Unit Dimension triggers. Count running.{hostname} Function, Trigger HostName, {hostname} CloudWatch metric TriggersR unning Note Available in Apache Airflow v2.7.2 and above. PoolDefer redSlots pool.deferred_slot Count s.{pool_name} Pool, {pool_name} Note Available in Apache Airflow v2.7.2 and above. DAGFilePr ocessingL astRunSec ondsAgo dag_processing.las t_run.seconds_ago. {dag_filename} Seconds DAG Filename, {dag_file name} OpenSlots executor.open_slots Count Function, Executor Apache Airflow metrics available in CloudWatch 422 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension OrphanedT asksAdopted scheduler.orphaned _tasks.adopted OrphanedT asksCleared scheduler.orphaned _tasks.cleared PokedExce ptions smart_sensor_opera tor.poked_exception Count Count Count PokedSuccess smart_sensor_opera tor.poked_success Count PokedTasks smart_sensor_opera tor.poked_tasks Count PoolFailures pool.open_slots.{p ool_name} PoolStarv ingTasks pool.starving_tasks. {pool_name} PoolOpenSlots pool.open_slots.{p ool_name} PoolQueue dSlots pool.queued_slots. {pool_name} PoolRunni ngSlots pool.running_slots. {pool_name} Processor Timeouts dag_processing.pro cessor_timeouts Count Count Count Count Count Count Function, Scheduler Function, Scheduler Function, Smart Sensor Operator Function, Smart Sensor Operator Function, Smart Sensor Operator Pool, {pool_name} Pool, {pool_name} Pool, {pool_name} Pool, {pool_name} Pool, {pool_name} Function, DAG Processing Apache Airflow metrics available in CloudWatch 423 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension QueuedTasks executor.queued_tasks Count RunningTasks executor.running_tasks Count TasksExec utable scheduler.tasks.ex ecutable TasksPending scheduler.tasks.pe nding Count Count Function, Executor Function, Executor Function, Scheduler Function, Scheduler Note Does not apply to Apache Airflow v2.2 and above. 
TasksRunning scheduler.tasks.running Count TasksStarving scheduler.tasks.starving Count TasksWith outDagRun scheduler.tasks.wi thout_dagrun Count Function, Scheduler Function, Scheduler Function, Scheduler Apache Airflow metrics available in CloudWatch 424 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension DAGFilePr ocessingL dag_processing.las t_num_of_db_queries. Count astNumOfD {dag_filename} bQueries DAG Filename, {dag_file name} Note Available in Apache Airflow v2.10.1 and above. PoolSched uledSlots pool.scheduled_slots. {pool_name} Count Pool, {pool_name} Note Available in Apache Airflow v2.10.1 and above. Apache Airflow metrics available in CloudWatch 425 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow metric Unit Dimension cpu.usage.{dag_id}. {task_id} Percent DAG, {dag_id} Task, {task_id} CloudWatch metric TaskCpuUsage Note Available in Apache Airflow v2.10.1 and above. TaskMemor yUsage mem.usage.{dag_id}. {task_id} Percent DAG, {dag_id} Task, {task_id} Note Available in Apache Airflow v2.10.1 and above. Apache Airflow Timers The Apache Airflow metrics in this section contain data about Apache Airflow Timers. Apache Airflow metrics available in CloudWatch 426 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit CollectDBDags collect_db_dags Milliseconds Dimension Function, DAG Processing CriticalS ectionDuration scheduler .critical_section_ Milliseconds Function, Scheduler CriticalS ectionQue ryDuration duration scheduler .critical_section_ query_duration Milliseconds Function, Scheduler Note Available for Apache Airflow v2.5.1 and above. DAGDepend encyCheck dagrun.de pendency-check. {dag_id} DAGDurati onFailed DAGDurati onSuccess dagrun.du ration.failed. {dag_id} dagrun.du ration.success. {dag_id} Milliseconds DAG, {dag_id} Milliseconds DAG, {dag_id} Milliseconds DAG, {dag_id} Apache Airflow metrics available in CloudWatch 427 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension DAGFilePr ocessingL astDuration DAGSchedu leDelay FirstTask SchedulingDelay dag_proce ssing.las t_duration. {dag_filename} dagrun.sc hedule_delay. {dag_id} dagrun.{d ag_id}.fi rst_task_ schedulin g_delay Seconds DAG Filename, {dag_filename} Milliseconds DAG, {dag_id} Milliseconds DAG, {dag_id} Scheduler LoopDuration scheduler .schedule Milliseconds Function, Scheduler r_loop_duration Note Available for Apache Airflow v2.5.1 and above. TaskInsta nceDuration dag.{dag_id}. {task_id}.dura tion Milliseconds DAG, {dag_id} Task, {task_id} Apache Airflow metrics available in CloudWatch 428 Amazon Managed Workflows for Apache Airflow User Guide CloudWatch metric Apache Airflow metric Unit Dimension TaskInsta nceQueued Duration dag.{dag_id}.{task_id} Milliseconds DAG, {dag_id} .queued_d uration Task, {task_id} Note Available for Apache Airflow v2.7.2 and above. TaskInsta nceSchedu ledDuration dag.{dag_id}.{task_id} Milliseconds DAG, {dag_id} .schedule d_duration Task, {task_id} Note Available for Apache Airflow v2.7.2 and above. 
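To check one of these timer values outside the CloudWatch console, you can query it with the AWS CLI. The sketch below reads TaskInstanceDuration for a single task; the DAG name my_dag, the task name my_task, and the time range are hypothetical placeholders, and if no data points come back you can confirm the exact dimension set for your environment with aws cloudwatch list-metrics --namespace AmazonMWAA.
# Average and maximum task duration, in milliseconds, over five-minute periods.
$ aws cloudwatch get-metric-statistics \
    --namespace AmazonMWAA \
    --metric-name TaskInstanceDuration \
    --dimensions Name=DAG,Value=my_dag Name=Task,Value=my_task \
    --start-time 2024-01-01T00:00:00Z \
    --end-time 2024-01-02T00:00:00Z \
    --period 300 \
    --statistics Average Maximum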
amazon-mwaa-user-guide-119 | amazon-mwaa-user-guide.pdf | 119 | ledDuration dag.{dag_id}.{task_id} Milliseconds DAG, {dag_id} .schedule d_duration Task, {task_id} Note Available for Apache Airflow v2.7.2 and above. Choosing which metrics are reported You can choose which Apache Airflow metrics are emitted to CloudWatch, or blocked by Apache Airflow, using the following Amazon MWAA configuration options: Choosing which metrics are reported 429 Amazon Managed Workflows for Apache Airflow User Guide • metrics.metrics_allow_list — A list of comma-separated prefixes you can use to select which metrics are emitted to CloudWatch by your environment. Use this option if you want Apache Airflow to not send all available metrics and instead select a subset of elements. For example, scheduler,executor,dagrun. • metrics.metrics_block_list — A list of comma-separated prefixes to filter out metrics that start with the elements of the list. For example, scheduler,executor,dagrun. If you configure both metrics.metrics_allow_list and metrics.metrics_block_list, Apache Airflow ignores metrics.metrics_block_list. If you configure metrics.metrics_block_list but not metrics.metrics_allow_list, Apache Airflow filters out the elements you specify in metrics.metrics_block_list. Note The metrics.metrics_allow_list and metrics.metrics_block_list configuration options only apply to Apache Airflow v2.6.3 and above. For previous version of Apache Airflow use metrics.statsd_allow_list and metrics.statsd_block_list instead. What's next? • Explore the Amazon MWAA API operation used to publish environment health metrics at PublishMetrics. Container, queue, and database metrics for Amazon MWAA In addition to Apache Airflow metrics, you can monitor the underlying components of your Amazon Managed Workflows for Apache Airflow environments using CloudWatch, which collects raw data and processes data into readable, near real-time metrics. With these environment metrics, you will have greater visibility into key performance indicators to help you appropriately size your environments and debug issues with your workflows. These metrics apply to all supported Apache Airflow versions on Amazon MWAA. Amazon MWAA will provide CPU and memory utilization for each Amazon Elastic Container Service (Amazon ECS) container and Amazon Aurora PostgreSQL instance, and Amazon Simple Queue What's next? 430 Amazon Managed Workflows for Apache Airflow User Guide Service (Amazon SQS) metrics for the number of messages and the age of the oldest message, Amazon Relational Database Service (Amazon RDS) metrics for database connections, disk queue depth, write operations, latency, and throughput, and Amazon RDS Proxy metrics. These metrics also include the number of base workers, additional workers, schedulers, and web servers. These statistics are kept for 15 months, so that you can access historical information and gain a better perspective on why a schedule is failing, and troubleshoot underlying issues. You can also set alarms that watch for certain thresholds, and send notifications or take actions when those thresholds are met. For more information, see the Amazon CloudWatch User Guide. Topics • Terms • Dimensions • Accessing metrics in the CloudWatch console • List of metrics Terms Namespace A namespace is a container for the CloudWatch metrics of an AWS service. For Amazon MWAA, the namespace is AWS/MWAA. CloudWatch metrics A CloudWatch metric represents a time-ordered set of data points that are specific to CloudWatch. 
Dimension A dimension is a name/value pair that is part of the identity of a metric. Unit A statistic has a unit of measure. For Amazon MWAA, units include Count. Dimensions This section describes the CloudWatch dimensions grouping for Amazon MWAA metrics in CloudWatch. Terms 431 Amazon Managed Workflows for Apache Airflow User Guide Dimension Cluster Queue Database Description Metrics for the minimum three Amazon ECS container that an Amazon MWAA environme nt uses to run Apache Airflow components: scheduler, worker, and web server. Metrics for the Amazon SQS queues that decouple the scheduler from workers. When workers read the messages, they are considere d in-flight and not available for other workers. Messages become available for other workers to read if they are not deleted before the 12 hours visibility timeout. Metrics the Aurora clusters used by Amazon MWAA. This includes metrics for the primary database instance and a read replica to support the read operations. Amazon MWAA publishes database metrics for both READER and WRITER instances. Accessing metrics in the CloudWatch console This section describes how to access your Amazon MWAA metrics in CloudWatch. To view performance metrics for a dimension 1. Open the Metrics page on the CloudWatch console. 2. Use the AWS Region selector to select your region. 3. Choose the AWS/MWAA namespace. 4. In the All metrics tab, choose a dimension. For example, Cluster. 5. Choose a CloudWatch metric for a dimension. For example, NumSchedulers or CPUUtilization. Then, choose Graph all search results. 6. Choose the Graphed metrics tab to view performance metrics. Accessing metrics 432 Amazon Managed Workflows for Apache Airflow User Guide List of metrics The following tables list the cluster, queue, and database service metrics for Amazon MWAA. To view descriptions for metrics directly emitted from Amazon ECS, Amazon SQS, or Amazon RDS, choose the respective documentation link. Topics • Cluster metrics • Database metrics • Queue metrics |
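If you prefer the AWS CLI to the console steps above, you can first discover which container, queue, and database metrics and dimensions your environment publishes, and then graph or query them. This is only a sketch; the dimensions returned vary by environment, and the SampleCount statistic mentioned in the comments is the CLI equivalent of the Sample Count setting used in the procedures that follow.
# List everything Amazon MWAA publishes in the AWS/MWAA namespace.
$ aws cloudwatch list-metrics --namespace AWS/MWAA
# Narrow the list to a single metric to see its exact dimension names and values,
# which you can then pass to get-metric-statistics (for example with --statistics SampleCount).
$ aws cloudwatch list-metrics --namespace AWS/MWAA --metric-name CPUUtilization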
amazon-mwaa-user-guide-120 | amazon-mwaa-user-guide.pdf | 120 | namespace. 4. In the All metrics tab, choose a dimension. For example, Cluster. 5. Choose a CloudWatch metric for a dimension. For example, NumSchedulers or CPUUtilization. Then, choose Graph all search results. 6. Choose the Graphed metrics tab to view performance metrics. Accessing metrics 432 Amazon Managed Workflows for Apache Airflow User Guide List of metrics The following tables list the cluster, queue, and database service metrics for Amazon MWAA. To view descriptions for metrics directly emitted from Amazon ECS, Amazon SQS, or Amazon RDS, choose the respective documentation link. Topics • Cluster metrics • Database metrics • Queue metrics • Application Load Balancer metrics Cluster metrics The following metrics apply to each scheduler, base worker, additional worker, and web server. For more information and descriptions of each cluster metric, see Available metrics and dimensions in the Amazon ECS Developer Guide. Namespace AWS/MWAA AWS/MWAA Metric CPUUtilization MemoryUtilization Unit Percent Percent Evaluating the number of additional worker and web server containers You can use the component metrics provided under the Cluster dimension, as described in the following procedure, to assess how many additional workers, or web servers, an environment is using at a given point in time. You can do this by graphing either the CPUUtilization or the MemoryUtilization metric and setting the statistic type to Sample Count. The resulting value is the total number of RUNNING tasks for the AdditionalWorker component. Understanding the number of additional worker instances utilized by your environment can help you gauge how your environment scales and allow you to optimize the number of additional workers. List of metrics 433 Amazon Managed Workflows for Apache Airflow User Guide Workers To evaluate the number of additional workers using the AWS Management Console 1. Choose the AWS/MWAA namespace. 2. In the All metrics tab, choose the Cluster dimension. 3. Under the Cluster dimension, for the AdditionalWorker, choose either the CPUUtilization or the MemoryUtilization metric. 4. On the Graphed metrics tab, set Period to 1 Minute and Statistic to Sample Count. Web servers To evaluate the number of additional web servers using the AWS Management Console 1. Choose the AWS/MWAA namespace. 2. In the All metrics tab, choose the Cluster dimension. 3. Under the Cluster dimension, for the AdditionalWebservers, choose either the CPUUtilization or the MemoryUtilization metric. 4. On the Graphed metrics tab, set Period to 1 Minute and Statistic to Sample Count. For more information, see Service RUNNING task count in the Amazon Elastic Container Service Developer Guide. Database metrics The following metrics apply to each database instance associated with the Amazon MWAA environment. Namespace Metric AWS/MWAA AWS/MWAA AWS/MWAA AWS/MWAA List of metrics CPUUtilization DatabaseConnections DiskQueueDepth FreeableMemory Unit Percent Count Count Bytes 434 Amazon Managed Workflows for Apache Airflow User Guide Namespace AWS/MWAA Metric VolumeWriteIOPS AWS/MWAA WriteIOPS Unit Count per five minutes Count per second Seconds AWS/MWAA AWS/MWAA Queue metrics WriteLatency WriteThroughput Bytes per second For more information on units and descriptions for the following queue metrics, see Available CloudWatch metrics for Amazon SQS in the Amazon Simple Queue Service Developer Guide. 
Namespace AWS/MWAA AWS/MWAA AWS/MWAA Metric Unit ApproximateAgeOfOl Seconds destTask RunningTasks QueuedTasks Count Count Application Load Balancer metrics Application Load Balancer metrics apply to the web servers running in your environment. Amazon MWAA uses these metrics to for scaling your web servers based on the amount of traffic. For more information on units and descriptions for the following load balancer metrics, see CloudWatch metrics for your Application Load Balancer in the Application Load Balancers User Guide. Namespace AWS/MWAA List of metrics Metric ActiveConnectionCount Unit Count 435 Amazon Managed Workflows for Apache Airflow User Guide Security in Amazon Managed Workflows for Apache Airflow Cloud security at AWS is the highest priority. As an AWS customer, you benefit from a data center and network architecture that is built to meet the requirements of the most security-sensitive organizations. Security is a shared responsibility between AWS and you (the customer). The shared responsibility model describes this as security of the cloud and security in the cloud: • Security of the cloud – AWS is responsible for protecting the infrastructure that runs AWS services in the AWS Cloud. AWS also provides you with services that you can use securely. Third- party auditors regularly test and verify the effectiveness of our security as part of the AWS Compliance Programs. To learn about the compliance programs that apply to Amazon MWAA, see AWS Services in Scope by Compliance Program. • Security in the cloud – Your responsibility is determined by the AWS service that you use. You are also responsible for other factors including the sensitivity of your data, your company’s requirements, and applicable laws and regulations. This documentation helps you understand how to apply the shared responsibility model when using Amazon Managed Workflows for Apache Airflow. It shows you how to configure Amazon |
amazon-mwaa-user-guide-121 | amazon-mwaa-user-guide.pdf | 121 | regularly test and verify the effectiveness of our security as part of the AWS Compliance Programs. To learn about the compliance programs that apply to Amazon MWAA, see AWS Services in Scope by Compliance Program. • Security in the cloud – Your responsibility is determined by the AWS service that you use. You are also responsible for other factors including the sensitivity of your data, your company’s requirements, and applicable laws and regulations. This documentation helps you understand how to apply the shared responsibility model when using Amazon Managed Workflows for Apache Airflow. It shows you how to configure Amazon MWAA to meet your security and compliance objectives. You also learn how to use other AWS services that help you to monitor and secure your Amazon MWAA resources. In this section: • Data Protection in Amazon Managed Workflows for Apache Airflow • AWS Identity and Access Management • Compliance Validation for Amazon Managed Workflows for Apache Airflow • Resilience in Amazon Managed Workflows for Apache Airflow • Infrastructure Security in Amazon MWAA • Configuration and Vulnerability Analysis in Amazon MWAA • Security best practices on Amazon MWAA 436 Amazon Managed Workflows for Apache Airflow User Guide Data Protection in Amazon Managed Workflows for Apache Airflow The AWS shared responsibility model applies to data protection in Amazon Managed Workflows for Apache Airflow. As described in this model, AWS is responsible for protecting the global infrastructure that runs all of the AWS Cloud. You are responsible for maintaining control over your content that is hosted on this infrastructure. This content includes the security configuration and management tasks for the AWS services that you use. For more information about data privacy, see the Data Privacy FAQ. For information about data protection in Europe, see the AWS Shared Responsibility Model and GDPR blog post on the AWS Security Blog. For data protection purposes, we recommend that you protect AWS account credentials and set up individual user accounts with AWS Identity and Access Management (IAM). That way each user is given only the permissions necessary to fulfill their job duties. We also recommend that you secure your data in the following ways: • Use multi-factor authentication (MFA) with each account. • Use SSL/TLS to communicate with AWS resources. We recommend TLS 1.2 or later. • Set up API and user activity logging with AWS CloudTrail. • Use AWS encryption solutions, along with all default security controls within AWS services. • Use advanced managed security services such as Amazon Macie, which assists in discovering and securing personal data that is stored in Amazon S3. We strongly recommend that you never put confidential or sensitive information, such as your customers' email addresses, into tags or free-form fields such as a Name field. This includes when you work with Amazon MWAA or other AWS services using the console, API, AWS CLI, or AWS SDKs. Any data that you enter into tags or free-form fields used for names may be used for billing or diagnostic logs. If you provide a URL to an external server, we strongly recommend that you do not include credentials information in the URL to validate your request to that server. Encryption on Amazon MWAA The following topics describe how Amazon MWAA protects your data at rest, and in transit. 
Use this information to learn how Amazon MWAA integrates with AWS KMS to encrypt data at rest, and how data is encrypted using the Transport Layer Security (TLS) protocol in transit.
Topics
• Encryption at rest
• Encryption in transit
Encryption at rest
On Amazon MWAA, data at rest is data that the service saves to persistent media. You can use an AWS owned key for data at rest encryption, or optionally provide a Customer managed key for additional encryption when you create an environment. If you choose to use a customer managed KMS key, it must be in the same account as the other AWS resources and services you are using with your environment. To use a customer managed KMS key, you must attach the required policy statement for CloudWatch access to your key policy. When you use a customer managed KMS key for your environment, Amazon MWAA attaches four grants on your behalf. For more information on the grants Amazon MWAA attaches to a customer managed KMS key, see Customer managed keys for data encryption. If you do not specify a customer managed KMS key, by default, Amazon MWAA uses an AWS owned KMS key to encrypt and decrypt your data. We recommend using an AWS owned KMS key to manage data encryption on Amazon MWAA.
Note
You pay for the storage and use of AWS owned or customer managed KMS keys on Amazon MWAA. For more information, see AWS KMS Pricing.
using a customer managed KMS key, Amazon MWAA uses it to read and decrypt the data on your Amazon S3 bucket. CloudWatch Logs – If you are using an AWS owned KMS key, Apache Airflow logs sent to CloudWatch Logs are encrypted using Server-Side Encryption (SSE) with CloudWatch Logs's AWS owned KMS key. If you are using a customer managed KMS key, you must add a key policy to your KMS key to allow CloudWatch Logs to use your key. Amazon SQS – Amazon MWAA creates one Amazon SQS queue for your environment. Amazon MWAA handles encrypting data passed to and from the queue using Server-Side Encryption (SSE) with either an AWS owned KMS key, or a customer managed KMS key that you specify. You must add Amazon SQS permissions to your execution role regardless of whether you are using an AWS owned or customer managed KMS key. Aurora PostgreSQL – Amazon MWAA creates one PostgreSQL cluster for your environment. Aurora PostgreSQL encrypts the content with either an AWS owned or customer managed KMS key using Server-Side Encryption (SSE). If you are using a customer managed KMS key, Amazon RDS adds at least two grants to the key: one for the cluster and one for the database instance. Amazon RDS might create additional grants if you choose to use your customer managed KMS key on multiple environments. For more information, see Data protection in Amazon RDS. Encryption in transit Data in transit is data that may be intercepted as it travels the network. Transport Layer Security (TLS) encrypts the Amazon MWAA objects in transit between your environment's Apache Airflow components and other AWS services that integrate with Amazon MWAA, such as Amazon S3. For more information about Amazon S3 encryption, see Protecting data using encryption. Using customer managed keys for encryption You can optionally provide a Customer managed key for data encryption on your environment. You must create the customer managed KMS key in the same Region as your Amazon MWAA environment instance and your Amazon S3 bucket where you store resources for your workflows.
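As a quick way to tie these pieces together, the following AWS CLI sketch creates an environment that uses a customer managed key and then lists the grants that Amazon MWAA attached to it. The environment name, bucket, role, and key ARN are placeholder values, and required parameters such as --network-configuration are omitted for brevity; substitute values from your own account before running anything like this.

# Create an environment that encrypts data at rest with a customer managed KMS key (abbreviated).
$ aws mwaa create-environment \
    --name MyAirflowEnvironment \
    --airflow-version 2.10.3 \
    --source-bucket-arn arn:aws:s3:::amzn-s3-demo-bucket \
    --dag-s3-path dags \
    --execution-role-arn arn:aws:iam::123456789012:role/service-role/MyExecutionRole \
    --kms-key arn:aws:kms:us-west-2:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab

# Confirm the grants that Amazon MWAA attached to the key on your behalf.
$ aws kms list-grants --key-id arn:aws:kms:us-west-2:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab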
If the customer managed KMS key that you specify is in a different account from the one you use to configure an environment, you must specify the key using its ARN for cross-account access. For more information about creating keys, see Creating Keys in the AWS Key Management Service Developer Guide.
What's supported
AWS KMS feature | Supported
An AWS KMS key ID or ARN | Yes
An AWS KMS key alias | No
An AWS KMS multi-Region key | No
Using Grants for Encryption This topic describes the grants Amazon MWAA attaches to a customer managed KMS key on your behalf to encrypt and decrypt your data. How it works There are two resource-based access control mechanisms supported by AWS KMS for customer managed KMS keys: key policies and grants. A key policy is used when the permission is mostly static and used in synchronous service mode. A grant is used when more dynamic and granular permissions are required, such as when a service needs to define different access permissions for itself or other accounts. Amazon MWAA uses and attaches four grant policies to your customer managed KMS key. This is due to the granular permissions required for an environment to encrypt data at rest from CloudWatch Logs, the Amazon SQS queue, the Aurora PostgreSQL database, Secrets Manager secrets, the Amazon S3 bucket, and DynamoDB tables. When you create an Amazon MWAA
environment and specify a customer managed KMS key, Amazon MWAA attaches the grant policies to your customer managed KMS key. These policies allow Amazon MWAA in airflow.region.amazonaws.com to use your customer managed KMS key to encrypt resources on your behalf that are owned by Amazon MWAA. Amazon MWAA creates and attaches additional grants to a specified KMS key on your behalf. This includes policies to retire a grant if you delete your environment, to use your customer managed KMS key for Client-Side Encryption (CSE), and for the AWS Fargate execution role that needs to access secrets protected by your customer managed key in Secrets Manager. Grant policies Amazon MWAA adds the following resource-based policy grants on your behalf to a customer managed KMS key. These policies allow the grantee and the principal (Amazon MWAA) to perform actions defined in the policy.
Grant 1: used to create data plane resources
{ "Name": "mwaa-grant-for-env-mgmt-role-environment name", "GranteePrincipal": "airflow.region.amazonaws.com", "RetiringPrincipal": "airflow.region.amazonaws.com", "Operations": [ "kms:Encrypt", "kms:Decrypt", "kms:ReEncrypt*", "kms:GenerateDataKey*", "kms:CreateGrant", "kms:DescribeKey", "kms:RetireGrant" ] }
Grant 2: used for ControllerLambdaExecutionRole access
{ "Name": "mwaa-grant-for-lambda-exec-environment name", "GranteePrincipal": "airflow.region.amazonaws.com", "RetiringPrincipal": "airflow.region.amazonaws.com", "Operations": [ "kms:Encrypt", "kms:Decrypt", "kms:ReEncrypt*", "kms:GenerateDataKey*", "kms:DescribeKey", "kms:RetireGrant" ] }
Grant 3: used for CfnManagementLambdaExecutionRole access
{ "Name": "mwaa-grant-for-cfn-mgmt-environment name", "GranteePrincipal": "airflow.region.amazonaws.com", "RetiringPrincipal": "airflow.region.amazonaws.com", "Operations": [ "kms:Encrypt", "kms:Decrypt", "kms:ReEncrypt*", "kms:GenerateDataKey*", "kms:DescribeKey" ] }
Grant 4: used for Fargate execution role to access backend secrets
{ "Name": "mwaa-fargate-access-for-environment name", "GranteePrincipal": "airflow.region.amazonaws.com", "RetiringPrincipal": "airflow.region.amazonaws.com", "Operations": [ "kms:Encrypt", "kms:Decrypt", "kms:ReEncrypt*", "kms:GenerateDataKey*", "kms:DescribeKey", "kms:RetireGrant" ] }
Attaching key policies to a customer managed key If you choose to use your own customer managed KMS key with Amazon MWAA, you must attach the following policy to the key to allow Amazon MWAA to use it to encrypt your data. If the customer managed KMS key you used for your Amazon MWAA environment is not already configured to work with CloudWatch, you must update the key policy to allow for encrypted CloudWatch Logs.
For more information, see Encrypt log data in CloudWatch Logs using AWS Key Management Service. The following example represents a key policy for CloudWatch Logs. Substitute the sample Region values with your own.
{ "Effect": "Allow", "Principal": { "Service": "logs.us-west-2.amazonaws.com" }, "Action": [ "kms:Encrypt*", "kms:Decrypt*", "kms:ReEncrypt*", "kms:GenerateDataKey*", "kms:Describe*" ], "Resource": "*", "Condition": { "ArnLike": { "kms:EncryptionContext:aws:logs:arn": "arn:aws:logs:us-west-2:*:*" } } }
AWS Identity and Access Management AWS Identity and Access Management (IAM) is an AWS service that helps an administrator securely control access to AWS resources. IAM administrators control who can be authenticated (signed in) and authorized (have permissions) to use Amazon Managed Workflows for Apache Airflow resources. IAM is an AWS service that you can use with no additional charge. This topic provides a basic overview of how Amazon MWAA uses AWS Identity and Access Management (IAM). To learn about managing access to Amazon MWAA, see Managing access to an Amazon MWAA environment. Contents • Audience • Authenticating With Identities • Managing Access Using Policies • Allowing users to view their own permissions
amazon-mwaa-user-guide-124 | amazon-mwaa-user-guide.pdf | 124 | job, then your administrator provides you with the credentials and permissions that you need. As you use more Amazon MWAA features to do your work, you might need additional permissions. Understanding how access is managed can help you request the right permissions from your administrator. If you cannot access a feature in Amazon MWAA, see Troubleshooting Amazon Managed Workflows for Apache Airflow identity and access. Service administrator – If you're in charge of Amazon MWAA resources at your company, you probably have full access to Amazon MWAA. It's your job to determine which Amazon MWAA features and resources your service users should access. You must then submit requests to your IAM administrator to change the permissions of your service users. Review the information on this page to understand the basic concepts of IAM. To learn more about how your company can use IAM with Amazon MWAA, see How Amazon MWAA works with IAM. IAM administrator – If you're an IAM administrator, you might want to learn details about how you can write policies to manage access to Amazon MWAA. To view example Amazon MWAA identity- based policies that you can use in IAM, see Amazon MWAA identity-based policy examples. Authenticating With Identities Authentication is how you sign in to AWS using your identity credentials. You must be authenticated (signed in to AWS) as the AWS account root user, as an IAM user, or by assuming an IAM role. You can sign in to AWS as a federated identity by using credentials provided through an identity source. AWS IAM Identity Center (IAM Identity Center) users, your company's single sign-on authentication, and your Google or Facebook credentials are examples of federated identities. When you sign in as a federated identity, your administrator previously set up identity federation using IAM roles. When you access AWS by using federation, you are indirectly assuming a role. Audience 444 Amazon Managed Workflows for Apache Airflow User Guide Depending on the type of user you are, you can sign in to the AWS Management Console or the AWS access portal. For more information about signing in to AWS, see How to sign in to your AWS account in the AWS Sign-In User Guide. If you access AWS programmatically, AWS provides a software development kit (SDK) and a command line interface (CLI) to cryptographically sign your requests by using your credentials. If you don't use AWS tools, you must sign requests yourself. For more information about using the recommended method to sign requests yourself, see AWS Signature Version 4 for API requests in the IAM User Guide. Regardless of the authentication method that you use, you might be required to provide additional security information. For example, AWS recommends that you use multi-factor authentication (MFA) to increase the security of your account. To learn more, see Multi-factor authentication in the AWS IAM Identity Center User Guide and AWS Multi-factor authentication in IAM in the IAM User Guide. AWS account root user When you create an AWS account, you begin with one sign-in identity that has complete access to all AWS services and resources in the account. This identity is called the AWS account root user and is accessed by signing in with the email address and password that you used to create the account. We strongly recommend that you don't use the root user for your everyday tasks. 
Safeguard your root user credentials and use them to perform the tasks that only the root user can perform. For the complete list of tasks that require you to sign in as the root user, see Tasks that require root user credentials in the IAM User Guide. IAM Users and Groups An IAM user is an identity within your AWS account that has specific permissions for a single person or application. Where possible, we recommend relying on temporary credentials instead of creating IAM users who have long-term credentials such as passwords and access keys. However, if you have specific use cases that require long-term credentials with IAM users, we recommend that you rotate access keys. For more information, see Rotate access keys regularly for use cases that require long- term credentials in the IAM User Guide. An IAM group is an identity that specifies a collection of IAM users. You can't sign in as a group. You can use groups to specify permissions for multiple users at a time. Groups make permissions easier to manage for large sets of users. For example, you could have a group named IAMAdmins and give that group permissions to administer IAM resources. Authenticating With Identities 445 Amazon Managed Workflows for Apache Airflow User Guide Users are different from roles. A user is uniquely associated with one person or application, but a role is intended to be assumable by anyone who needs it. Users have permanent |
amazon-mwaa-user-guide-125 | amazon-mwaa-user-guide.pdf | 125 | identity that specifies a collection of IAM users. You can't sign in as a group. You can use groups to specify permissions for multiple users at a time. Groups make permissions easier to manage for large sets of users. For example, you could have a group named IAMAdmins and give that group permissions to administer IAM resources. Authenticating With Identities 445 Amazon Managed Workflows for Apache Airflow User Guide Users are different from roles. A user is uniquely associated with one person or application, but a role is intended to be assumable by anyone who needs it. Users have permanent long-term credentials, but roles provide temporary credentials. To learn more, see Use cases for IAM users in the IAM User Guide. IAM Roles An IAM role is an identity within your AWS account that has specific permissions. It is similar to an IAM user, but is not associated with a specific person. To temporarily assume an IAM role in the AWS Management Console, you can switch from a user to an IAM role (console). You can assume a role by calling an AWS CLI or AWS API operation or by using a custom URL. For more information about methods for using roles, see Methods to assume a role in the IAM User Guide. IAM roles with temporary credentials are useful in the following situations: • Federated user access – To assign permissions to a federated identity, you create a role and define permissions for the role. When a federated identity authenticates, the identity is associated with the role and is granted the permissions that are defined by the role. For information about roles for federation, see Create a role for a third-party identity provider (federation) in the IAM User Guide. If you use IAM Identity Center, you configure a permission set. To control what your identities can access after they authenticate, IAM Identity Center correlates the permission set to a role in IAM. For information about permissions sets, see Permission sets in the AWS IAM Identity Center User Guide. • Temporary IAM user permissions – An IAM user or role can assume an IAM role to temporarily take on different permissions for a specific task. • Cross-account access – You can use an IAM role to allow someone (a trusted principal) in a different account to access resources in your account. Roles are the primary way to grant cross- account access. However, with some AWS services, you can attach a policy directly to a resource (instead of using a role as a proxy). To learn the difference between roles and resource-based policies for cross-account access, see Cross account resource access in IAM in the IAM User Guide. • Cross-service access – Some AWS services use features in other AWS services. For example, when you make a call in a service, it's common for that service to run applications in Amazon EC2 or store objects in Amazon S3. A service might do this using the calling principal's permissions, using a service role, or using a service-linked role. • Forward access sessions (FAS) – When you use an IAM user or role to perform actions in AWS, you are considered a principal. When you use some services, you might perform an action that then initiates another action in a different service. FAS uses the permissions of the Authenticating With Identities 446 Amazon Managed Workflows for Apache Airflow User Guide principal calling an AWS service, combined with the requesting AWS service to make requests to downstream services. 
FAS requests are only made when a service receives a request that requires interactions with other AWS services or resources to complete. In this case, you must have permissions to perform both actions. For policy details when making FAS requests, see Forward access sessions. • Service role – A service role is an IAM role that a service assumes to perform actions on your behalf. An IAM administrator can create, modify, and delete a service role from within IAM. For more information, see Create a role to delegate permissions to an AWS service in the IAM User Guide. • Service-linked role – A service-linked role is a type of service role that is linked to an AWS service. The service can assume the role to perform an action on your behalf. Service-linked roles appear in your AWS account and are owned by the service. An IAM administrator can view, but not edit the permissions for service-linked roles. • Applications running on Amazon EC2 – You can use an IAM role to manage temporary credentials for applications that are running on an EC2 instance and making AWS CLI or AWS API requests. This is preferable to storing access keys within the EC2 instance. To assign an AWS role to an EC2 instance and make it available to |
amazon-mwaa-user-guide-126 | amazon-mwaa-user-guide.pdf | 126 | AWS service. The service can assume the role to perform an action on your behalf. Service-linked roles appear in your AWS account and are owned by the service. An IAM administrator can view, but not edit the permissions for service-linked roles. • Applications running on Amazon EC2 – You can use an IAM role to manage temporary credentials for applications that are running on an EC2 instance and making AWS CLI or AWS API requests. This is preferable to storing access keys within the EC2 instance. To assign an AWS role to an EC2 instance and make it available to all of its applications, you create an instance profile that is attached to the instance. An instance profile contains the role and enables programs that are running on the EC2 instance to get temporary credentials. For more information, see Use an IAM role to grant permissions to applications running on Amazon EC2 instances in the IAM User Guide. Managing Access Using Policies You control access in AWS by creating policies and attaching them to AWS identities or resources. A policy is an object in AWS that, when associated with an identity or resource, defines their permissions. AWS evaluates these policies when a principal (user, root user, or role session) makes a request. Permissions in the policies determine whether the request is allowed or denied. Most policies are stored in AWS as JSON documents. For more information about the structure and contents of JSON policy documents, see Overview of JSON policies in the IAM User Guide. Administrators can use AWS JSON policies to specify who has access to what. That is, which principal can perform actions on what resources, and under what conditions. By default, users and roles have no permissions. To grant users permission to perform actions on the resources that they need, an IAM administrator can create IAM policies. The administrator can then add the IAM policies to roles, and users can assume the roles. Managing Access Using Policies 447 Amazon Managed Workflows for Apache Airflow User Guide IAM policies define permissions for an action regardless of the method that you use to perform the operation. For example, suppose that you have a policy that allows the iam:GetRole action. A user with that policy can get role information from the AWS Management Console, the AWS CLI, or the AWS API. Identity-Based Policies Identity-based policies are JSON permissions policy documents that you can attach to an identity, such as an IAM user, group of users, or role. These policies control what actions users and roles can perform, on which resources, and under what conditions. To learn how to create an identity-based policy, see Define custom IAM permissions with customer managed policies in the IAM User Guide. Identity-based policies can be further categorized as inline policies or managed policies. Inline policies are embedded directly into a single user, group, or role. Managed policies are standalone policies that you can attach to multiple users, groups, and roles in your AWS account. Managed policies include AWS managed policies and customer managed policies. To learn how to choose between a managed policy or an inline policy, see Choose between managed policies and inline policies in the IAM User Guide. Resource-Based Policies Resource-based policies are JSON policy documents that you attach to a resource. Examples of resource-based policies are IAM role trust policies and Amazon S3 bucket policies. 
In services that support resource-based policies, service administrators can use them to control access to a specific resource. For the resource where the policy is attached, the policy defines what actions a specified principal can perform on that resource and under what conditions. You must specify a principal in a resource-based policy. Principals can include accounts, users, roles, federated users, or AWS services. Resource-based policies are inline policies that are located in that service. You can't use AWS managed policies from IAM in a resource-based policy. Access Control Lists (ACLs) Access control lists (ACLs) control which principals (account members, users, or roles) have permissions to access a resource. ACLs are similar to resource-based policies, although they do not use the JSON policy document format. Managing Access Using Policies 448 Amazon Managed Workflows for Apache Airflow User Guide Amazon S3, AWS WAF, and Amazon VPC are examples of services that support ACLs. To learn more about ACLs, see Access control list (ACL) overview in the Amazon Simple Storage Service Developer Guide. Other Policy Types AWS supports additional, less-common policy types. These policy types can set the maximum permissions granted to you by the more common policy types. • Permissions boundaries – A permissions boundary is an advanced feature in which you set the maximum permissions that an identity-based policy can grant to an IAM entity (IAM user or role). You can set a permissions boundary for an entity. |
amazon-mwaa-user-guide-127 | amazon-mwaa-user-guide.pdf | 127 | Amazon S3, AWS WAF, and Amazon VPC are examples of services that support ACLs. To learn more about ACLs, see Access control list (ACL) overview in the Amazon Simple Storage Service Developer Guide. Other Policy Types AWS supports additional, less-common policy types. These policy types can set the maximum permissions granted to you by the more common policy types. • Permissions boundaries – A permissions boundary is an advanced feature in which you set the maximum permissions that an identity-based policy can grant to an IAM entity (IAM user or role). You can set a permissions boundary for an entity. The resulting permissions are the intersection of an entity's identity-based policies and its permissions boundaries. Resource-based policies that specify the user or role in the Principal field are not limited by the permissions boundary. An explicit deny in any of these policies overrides the allow. For more information about permissions boundaries, see Permissions boundaries for IAM entities in the IAM User Guide. • Service control policies (SCPs) – SCPs are JSON policies that specify the maximum permissions for an organization or organizational unit (OU) in AWS Organizations. AWS Organizations is a service for grouping and centrally managing multiple AWS accounts that your business owns. If you enable all features in an organization, then you can apply service control policies (SCPs) to any or all of your accounts. The SCP limits permissions for entities in member accounts, including each AWS account root user. For more information about Organizations and SCPs, see Service control policies in the AWS Organizations User Guide. • Resource control policies (RCPs) – RCPs are JSON policies that you can use to set the maximum available permissions for resources in your accounts without updating the IAM policies attached to each resource that you own. The RCP limits permissions for resources in member accounts and can impact the effective permissions for identities, including the AWS account root user, regardless of whether they belong to your organization. For more information about Organizations and RCPs, including a list of AWS services that support RCPs, see Resource control policies (RCPs) in the AWS Organizations User Guide. • Session policies – Session policies are advanced policies that you pass as a parameter when you programmatically create a temporary session for a role or federated user. The resulting session's permissions are the intersection of the user or role's identity-based policies and the session policies. Permissions can also come from a resource-based policy. An explicit deny in any of these policies overrides the allow. For more information, see Session policies in the IAM User Guide. Managing Access Using Policies 449 Amazon Managed Workflows for Apache Airflow User Guide Multiple Policy Types When multiple types of policies apply to a request, the resulting permissions are more complicated to understand. To learn how AWS determines whether to allow a request when multiple policy types are involved, see Policy evaluation logic in the IAM User Guide. Allowing users to view their own permissions This example shows how you might create a policy that allows IAM users to view the inline and managed policies that are attached to their user identity. This policy includes permissions to complete this action on the console or programmatically using the AWS CLI or AWS API. 
{ "Version": "2012-10-17", "Statement": [ { "Sid": "ViewOwnUserInfo", "Effect": "Allow", "Action": [ "iam:GetUserPolicy", "iam:ListGroupsForUser", "iam:ListAttachedUserPolicies", "iam:ListUserPolicies", "iam:GetUser" ], "Resource": ["arn:aws:iam::*:user/${aws:username}"] }, { "Sid": "NavigateInConsole", "Effect": "Allow", "Action": [ "iam:GetGroupPolicy", "iam:GetPolicyVersion", "iam:GetPolicy", "iam:ListAttachedGroupPolicies", "iam:ListGroupPolicies", "iam:ListPolicyVersions", "iam:ListPolicies", "iam:ListUsers" ], "Resource": "*" } Allowing users to view their own permissions 450 Amazon Managed Workflows for Apache Airflow User Guide ] } Troubleshooting Amazon Managed Workflows for Apache Airflow identity and access Use the following information to help you diagnose and fix common issues that you might encounter when working with Amazon MWAA and IAM. I am not authorized to perform an action in Amazon MWAA If the AWS Management Console tells you that you're not authorized to perform an action, then you must contact your administrator for assistance. Your administrator is the person that provided you with your user name and password. I am not authorized to perform iam:PassRole If you receive an error that you're not authorized to perform the iam:PassRole action, your policies must be updated to allow you to pass a role to Amazon MWAA. Some AWS services allow you to pass an existing role to that service instead of creating a new service role or service-linked role. To do this, you must have permissions to pass the role to the service. The following example error occurs when an IAM user named marymajor tries to use the console to perform an action in Amazon MWAA. However, the action requires the service to have permissions that are granted by a service role. Mary does not have permissions to pass |
I want to allow people outside of my AWS account to access my Amazon MWAA resources You can create a role that users in other accounts or people outside of your organization can use to access your resources. You can specify who is trusted to assume the role. For services that support resource-based policies or access control lists (ACLs), you can use those policies to grant people access to your resources. To learn more, consult the following: • To learn whether Amazon MWAA supports these features, see How Amazon MWAA works with IAM. • To learn how to provide access to your resources across AWS accounts that you own, see Providing access to an IAM user in another AWS account that you own in the IAM User Guide. • To learn how to provide access to your resources to third-party AWS accounts, see Providing access to AWS accounts owned by third parties in the IAM User Guide. • To learn how to provide access through identity federation, see Providing access to externally authenticated users (identity federation) in the IAM User Guide. • To learn the difference between using roles and resource-based policies for cross-account access, see Cross account resource access in IAM in the IAM User Guide. How Amazon MWAA works with IAM Amazon MWAA uses IAM identity-based policies to grant permissions to Amazon MWAA actions and resources. For recommended examples of custom IAM policies you can use to control access to your Amazon MWAA resources, see the section called “Accessing an Amazon MWAA environment”. To get a high-level view of how Amazon MWAA and other AWS services work with IAM, see AWS Services That Work with IAM in the IAM User Guide. Amazon MWAA identity-based policies With IAM identity-based policies, you can specify allowed or denied actions and resources, as well as the conditions under which actions are allowed or denied. Amazon MWAA supports specific actions, resources, and condition keys. The following steps show how you can create a new JSON policy using the IAM console. This policy provides read-only access to your Amazon MWAA resources.
If this is your first time choosing Policies, the Welcome to Managed Policies page appears. Choose Get Started. 3. At the top of the page, choose Create policy. 4. 5. In the Policy editor section, choose the JSON option. Enter the following JSON policy document: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "airflow:ListEnvironments", "airflow:GetEnvironment", "airflow:ListTagsForResource" ], "Resource": "*" } ] } 6. Choose Next. Note You can switch between the Visual and JSON editor options anytime. However, if you make changes or choose Next in the Visual editor, IAM might restructure your policy to optimize it for the visual editor. For more information, see Policy restructuring in the IAM User Guide. 7. On the Review and create page, enter a Policy name and a Description (optional) for the policy that you are creating. Review Permissions defined in this policy to see the permissions that are granted by your policy. How Amazon MWAA works with IAM 453 Amazon Managed Workflows for Apache Airflow User Guide 8. Choose Create policy to save your new policy. To learn about all of the elements that you use in a JSON policy, see IAM JSON Policy Elements Reference in the IAM User Guide. Actions Administrators can use AWS JSON policies to specify who has access to what. That is, which principal can perform actions on what resources, and under what conditions. The Action element of a |
8. Choose Create policy to save your new policy. To learn about all of the elements that you use in a JSON policy, see IAM JSON Policy Elements Reference in the IAM User Guide. Actions Administrators can use AWS JSON policies to specify who has access to what. That is, which principal can perform actions on what resources, and under what conditions. The Action element of a JSON policy describes the actions that you can use to allow or deny access in a policy. Policy actions usually have the same name as the associated AWS API operation. There are some exceptions, such as permission-only actions that don't have a matching API operation. There are also some operations that require multiple actions in a policy. These additional actions are called dependent actions. Include actions in a policy to grant permissions to perform the associated operation. Policy statements must include either an Action element or a NotAction element. The Action element lists the actions allowed by the policy. The NotAction element lists the actions that are not allowed. The actions defined for Amazon MWAA reflect tasks that you can perform using Amazon MWAA. Policy actions in Amazon MWAA use the following prefix: airflow:. You can also use wildcards (*) to specify multiple actions. For example, instead of listing actions separately, you can grant access to all actions that end with a particular word, such as environment. To see a list of Amazon MWAA actions, see Actions Defined by Amazon Managed Workflows for Apache Airflow in the IAM User Guide. Amazon MWAA identity-based policy examples To view the Amazon MWAA policies, see Managing access to an Amazon MWAA environment. By default, IAM users and roles don't have permission to create or modify Amazon MWAA resources. They also can't perform tasks using the AWS Management Console, AWS CLI, or AWS API. An IAM administrator must create IAM policies that grant users and roles permission to perform specific API operations on the specified resources they need. The administrator then attaches those policies to the IAM users or groups that require those permissions. Important We recommend using IAM roles and temporary credentials to provide access to your Amazon MWAA resources. Avoid attaching permission policies directly to your IAM users. To learn how to create an IAM identity-based policy using these example JSON policy documents, see Creating Policies on the JSON Tab in the IAM User Guide. Topics • Policy best practices • Using the Amazon MWAA console • Allowing users to view their own permissions Policy best practices Identity-based policies determine whether someone can create, access, or delete Amazon MWAA resources in your account. These actions can incur costs for your AWS account. When you create or edit identity-based policies, follow these guidelines and recommendations: • Get started with AWS managed policies and move toward least-privilege permissions – To get started granting permissions to your users and workloads, use the AWS managed policies that grant permissions for many common use cases. They are available in your AWS account.
We recommend that you reduce permissions further by defining AWS customer managed policies that are specific to your use cases. For more information, see AWS managed policies or AWS managed policies for job functions in the IAM User Guide. • Apply least-privilege permissions – When you set permissions with IAM policies, grant only the permissions required to perform a task. You do this by defining the actions that can be taken on specific resources under specific conditions, also known as least-privilege permissions. For more information about using IAM to apply permissions, see Policies and permissions in IAM in the IAM User Guide. • Use conditions in IAM policies to further restrict access – You can add a condition to your policies to limit access to actions and resources. For example, you can write a policy condition to specify that all requests must be sent using SSL. You can also use conditions to grant access to service actions if they are used through a specific AWS service, such as AWS CloudFormation. For more information, see IAM JSON policy elements: Condition in the IAM User Guide. How Amazon MWAA works with IAM 455 Amazon Managed Workflows for Apache Airflow User Guide • Use IAM Access Analyzer to validate your IAM policies to ensure secure and functional permissions – IAM Access Analyzer validates new and existing policies so that the policies adhere to the IAM policy language (JSON) and IAM best practices. IAM Access Analyzer provides more than 100 policy checks and actionable recommendations to help |
amazon-mwaa-user-guide-130 | amazon-mwaa-user-guide.pdf | 130 | access to service actions if they are used through a specific AWS service, such as AWS CloudFormation. For more information, see IAM JSON policy elements: Condition in the IAM User Guide. How Amazon MWAA works with IAM 455 Amazon Managed Workflows for Apache Airflow User Guide • Use IAM Access Analyzer to validate your IAM policies to ensure secure and functional permissions – IAM Access Analyzer validates new and existing policies so that the policies adhere to the IAM policy language (JSON) and IAM best practices. IAM Access Analyzer provides more than 100 policy checks and actionable recommendations to help you author secure and functional policies. For more information, see Validate policies with IAM Access Analyzer in the IAM User Guide. • Require multi-factor authentication (MFA) – If you have a scenario that requires IAM users or a root user in your AWS account, turn on MFA for additional security. To require MFA when API operations are called, add MFA conditions to your policies. For more information, see Secure API access with MFA in the IAM User Guide. For more information about best practices in IAM, see Security best practices in IAM in the IAM User Guide. Using the Amazon MWAA console To use the Amazon MWAA console, the user or role must have access to the relevant actions, which match corresponding actions in the API. To view the Amazon MWAA policies, see Managing access to an Amazon MWAA environment. Allowing users to view their own permissions This example shows how you might create a policy that allows IAM users to view the inline and managed policies that are attached to their user identity. This policy includes permissions to complete this action on the console or programmatically using the AWS CLI or AWS API. { "Version": "2012-10-17", "Statement": [ { "Sid": "ViewOwnUserInfo", "Effect": "Allow", "Action": [ "iam:GetUserPolicy", "iam:ListGroupsForUser", "iam:ListAttachedUserPolicies", "iam:ListUserPolicies", "iam:GetUser" ], How Amazon MWAA works with IAM 456 Amazon Managed Workflows for Apache Airflow User Guide "Resource": ["arn:aws:iam::*:user/${aws:username}"] }, { "Sid": "NavigateInConsole", "Effect": "Allow", "Action": [ "iam:GetGroupPolicy", "iam:GetPolicyVersion", "iam:GetPolicy", "iam:ListAttachedGroupPolicies", "iam:ListGroupPolicies", "iam:ListPolicyVersions", "iam:ListPolicies", "iam:ListUsers" ], "Resource": "*" } ] } Compliance Validation for Amazon Managed Workflows for Apache Airflow To learn whether an AWS service is within the scope of specific compliance programs, see AWS services in Scope by Compliance Program and choose the compliance program that you are interested in. For general information, see AWS Compliance Programs. You can download third-party audit reports using AWS Artifact. For more information, see Downloading Reports in AWS Artifact. Your compliance responsibility when using AWS services is determined by the sensitivity of your data, your company's compliance objectives, and applicable laws and regulations. AWS provides the following resources to help with compliance: • Security Compliance & Governance – These solution implementation guides discuss architectural considerations and provide steps for deploying security and compliance features. • HIPAA Eligible Services Reference – Lists HIPAA eligible services. Not all AWS services are HIPAA eligible. • AWS Compliance Resources – This collection of workbooks and guides might apply to your industry and location. 
Compliance Validation 457 Amazon Managed Workflows for Apache Airflow User Guide • AWS Customer Compliance Guides – Understand the shared responsibility model through the lens of compliance. The guides summarize the best practices for securing AWS services and map the guidance to security controls across multiple frameworks (including National Institute of Standards and Technology (NIST), Payment Card Industry Security Standards Council (PCI), and International Organization for Standardization (ISO)). • Evaluating Resources with Rules in the AWS Config Developer Guide – The AWS Config service assesses how well your resource configurations comply with internal practices, industry guidelines, and regulations. • AWS Security Hub – This AWS service provides a comprehensive view of your security state within AWS. Security Hub uses security controls to evaluate your AWS resources and to check your compliance against security industry standards and best practices. For a list of supported services and controls, see Security Hub controls reference. • Amazon GuardDuty – This AWS service detects potential threats to your AWS accounts, workloads, containers, and data by monitoring your environment for suspicious and malicious activities. GuardDuty can help you address various compliance requirements, like PCI DSS, by meeting intrusion detection requirements mandated by certain compliance frameworks. • AWS Audit Manager – This AWS service helps you continuously audit your AWS usage to simplify how you manage risk and compliance with regulations and industry standards. Resilience in Amazon Managed Workflows for Apache Airflow The AWS global infrastructure is built around AWS Regions and Availability Zones. Regions provide multiple physically separated and isolated Availability Zones, which are connected through low-latency, high-throughput, and highly redundant networking. With Availability Zones, you can design and operate applications and databases that automatically fail over between zones without interruption. Availability Zones are more highly available, fault tolerant, |
amazon-mwaa-user-guide-131 | amazon-mwaa-user-guide.pdf | 131 | requirements mandated by certain compliance frameworks. • AWS Audit Manager – This AWS service helps you continuously audit your AWS usage to simplify how you manage risk and compliance with regulations and industry standards. Resilience in Amazon Managed Workflows for Apache Airflow The AWS global infrastructure is built around AWS Regions and Availability Zones. Regions provide multiple physically separated and isolated Availability Zones, which are connected through low-latency, high-throughput, and highly redundant networking. With Availability Zones, you can design and operate applications and databases that automatically fail over between zones without interruption. Availability Zones are more highly available, fault tolerant, and scalable than traditional single or multiple data center infrastructures. For more information about AWS Regions and Availability Zones, see AWS Global Infrastructure. Infrastructure Security in Amazon MWAA As a managed service, Amazon Managed Workflows for Apache Airflow is protected by AWS global network security. For information about AWS security services and how AWS protects infrastructure, see AWS Cloud Security. To design your AWS environment using the best practices Resilience 458 Amazon Managed Workflows for Apache Airflow User Guide for infrastructure security, see Infrastructure Protection in Security Pillar AWS Well‐Architected Framework. You use AWS published API calls to access Amazon MWAA through the network. Clients must support the following: • Transport Layer Security (TLS). We require TLS 1.2 and recommend TLS 1.3. • Cipher suites with perfect forward secrecy (PFS) such as DHE (Ephemeral Diffie-Hellman) or ECDHE (Elliptic Curve Ephemeral Diffie-Hellman). Most modern systems such as Java 7 and later support these modes. Additionally, requests must be signed by using an access key ID and a secret access key that is associated with an IAM principal. Or you can use the AWS Security Token Service (AWS STS) to generate temporary security credentials to sign requests. Configuration and Vulnerability Analysis in Amazon MWAA Configuration and IT controls are a shared responsibility between AWS and you, our customer. Amazon Managed Workflows for Apache Airflow periodically patches and upgrades Apache Airflow on your environments. You should ensure that the appropriate access policies are used for your VPCs. For more details, see the following resources: • Compliance Validation for Amazon Managed Workflows for Apache Airflow • Shared Responsibility Model • Amazon Web Services: Overview of Security Processes • Infrastructure Security in Amazon MWAA • Security best practices on Amazon MWAA Security best practices on Amazon MWAA Amazon MWAA provides a number of security features to consider as you develop and implement your own security policies. The following best practices are general guidelines and don’t represent a complete security solution. Because these best practices might not be appropriate or sufficient for your environment, treat them as helpful considerations rather than prescriptions. Configuration and Vulnerability Analysis 459 Amazon Managed Workflows for Apache Airflow User Guide • Use least-permissive permission policies. Grant permissions to only the resources or actions that users need to perform tasks. • Use AWS CloudTrail to monitor user activity in your account. 
• Ensure that the Amazon S3 bucket policy and object ACLs grant permissions to the users from the associated Amazon MWAA environment to put objects into the bucket. This ensures that users with permissions to add workflows to the bucket also have permissions to run the workflows in Airflow. • Use the Amazon S3 buckets associated with Amazon MWAA environments. Your Amazon S3 bucket can be any name. Do not store other objects in the bucket, or use the bucket with another service. Security best practices in Apache Airflow Apache Airflow is not multi-tenant. While there are access control measures to limit some features to specific users, which Amazon MWAA implements, DAG creators do have the ability to write DAGs that can change Apache Airflow user privileges and interact with the underlying metadatabase. We recommend the following steps when working with Apache Airflow on Amazon MWAA to ensure your environment's metadatabase and DAGs are secure. • Use separate environments for separate teams with DAG writing access, or the ability to add files to your Amazon S3 /dags folder, assuming anything accessible by the Amazon MWAA Execution Role or Apache Airflow connections will also be accessible to users who can write to the environment. • Do not provide direct Amazon S3 DAGs folder access. Instead, use CI/CD tools to write DAGs to Amazon S3, with a validation step ensuring that the DAG code meets your team's security guidelines. • Prevent user access to your environment's Amazon S3 bucket. Instead, use a DAG factory that generates DAGs based on a YAML, JSON, or other definition file stored in a separate location from your Amazon MWAA Amazon S3 bucket where you store DAGs. • Store secrets in Secrets Manager. While this will not prevent users who can write DAGs from reading secrets, it will prevent them from modifying the secrets that your environment uses. |
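One concrete way to follow the last recommendation is to point Apache Airflow at Secrets Manager through your environment's configuration options. The following AWS CLI sketch assumes a placeholder environment name and the commonly used airflow/connections and airflow/variables prefixes; your execution role must also be allowed to read those secrets, and the exact option values should be checked against the documentation on configuring Secrets Manager as an Apache Airflow backend before you rely on them.

$ aws mwaa update-environment \
    --name MyAirflowEnvironment \
    --airflow-configuration-options '{
        "secrets.backend": "airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend",
        "secrets.backend_kwargs": "{\"connections_prefix\": \"airflow/connections\", \"variables_prefix\": \"airflow/variables\"}"
    }'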
Detecting changes to Apache Airflow user privileges You can use CloudWatch Logs Insights to detect occurrences of DAGs changing Apache Airflow user privileges. To do so, you can use an EventBridge scheduled rule, a Lambda function, and CloudWatch Logs Insights to deliver notifications to CloudWatch metrics whenever one of your DAGs changes Apache Airflow user privileges. Prerequisites To complete the following steps, you will need the following: • An Amazon MWAA environment with all Apache Airflow log types enabled at the INFO log level. For more information, see the section called “Viewing Airflow logs”. To configure notifications for changes to Apache Airflow user privileges 1. Create a Lambda function that runs the following CloudWatch Logs Insights query string against the five Amazon MWAA environment log groups (DAGProcessing, Scheduler, Task, WebServer, and Worker). A sketch of such a function follows this procedure. fields @log, @timestamp, @message | filter @message like "add-role" | stats count() by @log 2. Create an EventBridge rule that runs on a schedule, with the Lambda function you created in the previous step as the rule's target. Configure your schedule using a cron or rate expression to run at regular intervals.
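The following Python sketch shows one way the Lambda function from step 1 might run that query with boto3 and publish the result as a CloudWatch metric. The environment name, log group names, metric namespace, and look-back window are illustrative assumptions (Amazon MWAA log groups typically follow an airflow-YourEnvironmentName-LogType naming pattern), so confirm the names in your account, and give the function's role permission to call logs:StartQuery, logs:GetQueryResults, and cloudwatch:PutMetricData.

import time

import boto3

# Assumed log group names; verify them in your account before deploying.
LOG_GROUPS = [
    "airflow-MyAirflowEnvironment-DAGProcessing",
    "airflow-MyAirflowEnvironment-Scheduler",
    "airflow-MyAirflowEnvironment-Task",
    "airflow-MyAirflowEnvironment-WebServer",
    "airflow-MyAirflowEnvironment-Worker",
]

QUERY = 'fields @log, @timestamp, @message | filter @message like "add-role" | stats count() by @log'

logs = boto3.client("logs")
cloudwatch = boto3.client("cloudwatch")


def lambda_handler(event, context):
    end_time = int(time.time())
    start_time = end_time - 3600  # Look back one hour; align this with your EventBridge schedule.

    query_id = logs.start_query(
        logGroupNames=LOG_GROUPS,
        startTime=start_time,
        endTime=end_time,
        queryString=QUERY,
    )["queryId"]

    # Poll until the Logs Insights query finishes.
    response = logs.get_query_results(queryId=query_id)
    while response["status"] in ("Scheduled", "Running"):
        time.sleep(1)
        response = logs.get_query_results(queryId=query_id)

    # One result row per log group that contained "add-role" activity in the window.
    matches = len(response.get("results", []))

    cloudwatch.put_metric_data(
        Namespace="MWAA/SecurityChecks",
        MetricData=[{"MetricName": "AddRoleOccurrences", "Value": matches, "Unit": "Count"}],
    )
    return {"matches": matches}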
Apache Airflow versions on Amazon Managed Workflows for Apache Airflow This topic describes the Apache Airflow versions Amazon Managed Workflows for Apache Airflow supports, and best practices for upgrading to the latest version. Topics • About Amazon MWAA versions • Latest version • Apache Airflow versions • Apache Airflow components • Upgrading the Apache Airflow version • Apache Airflow deprecated versions • Apache Airflow version support and FAQ About Amazon MWAA versions Amazon MWAA builds container images that bundle Apache Airflow releases with other common binaries and Python libraries. The image uses the Apache Airflow base install for the version you specify. When you create an environment, you specify an image version to use. Once an environment is created, it keeps using the specified image version until you upgrade it to a later version. Latest version Amazon MWAA supports more than one Apache Airflow version. If you do not specify an image version when you create an environment, Amazon MWAA creates an environment using the latest supported version of Apache Airflow. Apache Airflow versions The following Apache Airflow versions are supported on Amazon Managed Workflows for Apache Airflow. Note • Beginning with Apache Airflow v2.2.2, Amazon MWAA supports installing Python requirements, provider packages, and custom plugins directly on the Apache Airflow web server. • Beginning with Apache Airflow v2.7.2, your requirements file must include a --constraint statement. If you do not provide a constraint, Amazon MWAA will specify one for you to ensure the packages listed in your requirements are compatible with the version of Apache Airflow you are using. For more information on setting up constraints in your requirements file, see Installing Python dependencies.
Apache Airflow version | Apache Airflow guide | Apache Airflow constraints | Python version
v2.10.3 | Apache Airflow v2.10.3 reference guide | Apache Airflow v2.10.3 constraints file | Python 3.11
v2.10.1 | Apache Airflow v2.10.1 reference guide | Apache Airflow v2.10.1 constraints file | Python 3.11
v2.9.2 | Apache Airflow v2.9.2 reference guide | Apache Airflow v2.9.2 constraints file | Python 3.11
v2.8.1 | Apache Airflow v2.8.1 reference guide | Apache Airflow v2.8.1 constraints file | Python 3.11
v2.7.2 | Apache Airflow v2.7.2 reference guide | Apache Airflow v2.7.2 constraints file | Python 3.11
v2.6.3 | Apache Airflow v2.6.3 reference guide | Apache Airflow v2.6.3 constraints file | Python 3.10
v2.5.1 | Apache Airflow v2.5.1 reference guide | Apache Airflow v2.5.1 constraints file | Python 3.10
v2.4.3 | Apache Airflow v2.4.3 reference guide | Apache Airflow v2.4.3 constraints file | Python 3.10
For more information about migrating your self-managed Apache Airflow deployments, or migrating an existing Amazon MWAA environment, including instructions for backing up your metadata database, see the Amazon MWAA Migration Guide. Apache Airflow components This section describes the number of Apache Airflow schedulers and workers available for each Apache Airflow version on Amazon MWAA, and provides a list of key
Apache Airflow features, indicating the version that supports each feature.

Schedulers
Apache Airflow version | Scheduler (default) | Scheduler (min) | Scheduler (max)
Apache Airflow v2 and above | 2 | 2 | 5

Workers
Airflow version | Workers (min) | Workers (max) | Workers (default)
Apache Airflow v2 | 1 | 25 | 10

Upgrading the Apache Airflow version
Amazon MWAA supports minor version upgrades. This means you can upgrade your environment from version x.1.z to x.2.z, but not to a new major version, for example, from 1.y.z to 2.y.z.
Note You cannot downgrade the Apache Airflow version for your environment.
For more information, and for detailed instructions on updating your workflow resources and upgrading the environment to a new version, see the section called “Upgrading the version”.

Apache Airflow deprecated versions
The following table lists the deprecated versions of Apache Airflow in Amazon MWAA, along with the initial release and end of support dates for each version. For more information about migrating to a newer version, see the Amazon MWAA Migration Guide.
Apache Airflow version | Apache Airflow release date | Amazon MWAA availability date | Amazon MWAA limited support date | Amazon MWAA end of support date
v1.10.12 | August 25, 2020 | November 24, 2020 | August 21, 2023 | November 23, 2023
v2.0.2 | April 19, 2021 | May 25, 2021 | January 25, 2024 | April 29, 2024
v2.2.2 | November 15, 2021 | January 27, 2022 | February 21, 2024 | June 27, 2024

Apache Airflow version support and FAQ
In accordance with the Apache Airflow community release process and version policy, Amazon MWAA is committed to supporting at least three minor versions of Apache Airflow at any given time. We will announce the end of support date of a given Apache Airflow minor version at least 90 days before that date.

Frequently asked questions
Q: How long does Amazon MWAA support an Apache Airflow version?
A: Amazon MWAA supports an Apache Airflow minor version for a minimum of 12 months after it first becomes available.
Q: Am I notified when support is ending for an Apache Airflow version on Amazon MWAA?
A: Yes. If any Amazon MWAA environments in your account run a version nearing the end of support, Amazon MWAA sends out a notice through the AWS Health Dashboard with the end of support date.
Q: What happens on the limited support date?
A: On the limited support date, you can no longer create new Amazon MWAA environments with the associated version. Your existing environments will continue to be available until the end of support date.
Q: What happens on the end of support date?
A: On the end of support date, you will continue to be able to access your existing Amazon MWAA environments that run the associated, deprecated version of Apache Airflow at your own risk. For instructions on upgrading to a newer version of Apache Airflow on Amazon MWAA, see the Amazon MWAA Migration Guide. Important You are responsible for keeping your Amazon MWAA versions current. AWS urges all customers to upgrade their Amazon MWAA environments to the latest version in order to benefit from the most current security, privacy, and availability safeguards. If you operate your environment on an unsupported version or software past the deprecation date, referred to as the legacy version, you face a greater likelihood of security, privacy, and operational risks, including downtime events. By operating your Amazon MWAA environment on a legacy version, you confirm that you understand and knowingly assume these risks, and you agree to complete your upgrade to the latest version as soon as possible. Continued operation of your environment on a legacy version is subject to the agreement governing your use of the AWS services. Frequently asked questions 466 Amazon Managed Workflows for Apache Airflow User Guide Legacy versions are not considered generally available, and AWS no longer provides support for the legacy version. As a result, AWS may place limits on the access to or use of any legacy version at any time, if AWS determines that the legacy version poses a security or liability risk, or a risk of harm, to the services, AWS, its Affiliates, or any other third |
amazon-mwaa-user-guide-134 | amazon-mwaa-user-guide.pdf | 134 | as possible. Continued operation of your environment on a legacy version is subject to the agreement governing your use of the AWS services. Frequently asked questions 466 Amazon Managed Workflows for Apache Airflow User Guide Legacy versions are not considered generally available, and AWS no longer provides support for the legacy version. As a result, AWS may place limits on the access to or use of any legacy version at any time, if AWS determines that the legacy version poses a security or liability risk, or a risk of harm, to the services, AWS, its Affiliates, or any other third party. Your decision to continue running Your workloads on a legacy version might result in Your content becoming unavailable, corrupted, or unrecoverable. Environments running on a legacy version are subject to Service Level Agreement (SLA) exceptions. Environments, and related software, running on a legacy version might contain bugs, errors, defects, and harmful components. Accordingly, and notwithstanding any information to the contrary in the agreement, or the terms of service, AWS provides the legacy version as is. For more information about AWS's shared responsibility model, see Shared responsibility in the AWS Well-Architected Framework. Frequently asked questions 467 Amazon Managed Workflows for Apache Airflow User Guide Amazon Managed Workflows for Apache Airflow service endpoints and quotas Amazon Managed Workflows for Apache Airflow has the following service quotas and endpoints. Service quotas, also referred to as limits, are the maximum number of service resources or operations for your AWS account. Contents • Service endpoints • Service quotas • Increasing quotas Service endpoints To view a list of endpoints for Amazon MWAA, see Amazon Managed Workflows for Apache Airflow endpoints and quotas. Service quotas Quota name Description Default quota Adjustable Environments Workers per environment The maximum number of Amazon MWAA environme nts per account per Region. The maximum number of workers per Amazon MWAA environment. Web servers per environment The maximum number of web 10 25 5 Yes Yes Yes Service endpoints 468 Amazon Managed Workflows for Apache Airflow User Guide Quota name Description Default quota Adjustable servers per Amazon MWAA environment. Increasing quotas You can request an increase to an adjustable quota by submitting a quota increase request. Increasing quotas 469 Amazon Managed Workflows for Apache Airflow User Guide Amazon MWAA frequently asked questions This page describes common questions you may encounter when using Amazon Managed Workflows for Apache Airflow. Contents • Supported versions • What does Amazon MWAA support for Apache Airflow v2? • Why are older versions of Apache Airflow not supported? • What Python version should I use? • Use cases • When should I use AWS Step Functions vs. Amazon MWAA? • Environment specifications • How much task storage is available to each environment? • What is the default operating system used for Amazon MWAA environments? • Can I use a custom image for my Amazon MWAA environment? • Is Amazon MWAA HIPAA compliant? • Does Amazon MWAA support Spot Instances? • Does Amazon MWAA support a custom domain? • Can I SSH into my environment? • Why is a self-referencing rule required on the VPC security group? • Can I hide environments from different groups in IAM? • Can I store temporary data on the Apache Airflow Worker? • Can I specify more than 25 Apache Airflow Workers? 
• Does Amazon MWAA support shared Amazon VPCs or shared subnets? • Can I create or integrate custom Amazon SQS queues to manage task execution and workflow orchestration in Apache Airflow? • Metrics • What metrics are used to determine whether to scale Workers? • Can I create custom metrics in CloudWatch? • DAGs, Operators, Connections, and other questions 470 Amazon Managed Workflows for Apache Airflow User Guide • Can I use the PythonVirtualenvOperator? • How long does it take Amazon MWAA to recognize a new DAG file? • Why is my DAG file not picked up by Apache Airflow? • Can I remove a plugins.zip or requirements.txt from an environment? • Why don't I see my plugins in the Apache Airflow v2.0.2 Admin Plugins menu? • Can I use AWS Database Migration Service (DMS) Operators? • When I access the Airflow REST API using the AWS credentials, can I increase the throttling limit to more than 10 transactions per second (TPS)? Supported versions What does Amazon MWAA support for Apache Airflow v2? To learn what Amazon MWAA supports, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow. Why are older versions of Apache Airflow not supported? We are only supporting the latest (as of launch) Apache Airflow version Apache Airflow v1.10.12 due to security concerns with older versions. What Python version should I use? The following Apache Airflow versions are supported on Amazon Managed Workflows for Apache Airflow. Note • Beginning with Apache Airflow v2.2.2, Amazon MWAA |
supports installing Python requirements, provider packages, and custom plugins directly on the Apache Airflow web server.
• Beginning with Apache Airflow v2.7.2, your requirements file must include a --constraint statement. If you do not provide a constraint, Amazon MWAA will specify one for you to ensure the packages listed in your requirements are compatible with the version of Apache Airflow you are using.
For more information on setting up constraints in your requirements file, see Installing Python dependencies.

Apache Airflow version | Apache Airflow guide | Apache Airflow constraints | Python version
v2.10.3 | Apache Airflow v2.10.3 reference guide | Apache Airflow v2.10.3 constraints file | Python 3.11
v2.10.1 | Apache Airflow v2.10.1 reference guide | Apache Airflow v2.10.1 constraints file | Python 3.11
v2.9.2 | Apache Airflow v2.9.2 reference guide | Apache Airflow v2.9.2 constraints file | Python 3.11
v2.8.1 | Apache Airflow v2.8.1 reference guide | Apache Airflow v2.8.1 constraints file | Python 3.11
v2.7.2 | Apache Airflow v2.7.2 reference guide | Apache Airflow v2.7.2 constraints file | Python 3.11
v2.6.3 | Apache Airflow v2.6.3 reference guide | Apache Airflow v2.6.3 constraints file | Python 3.10
v2.5.1 | Apache Airflow v2.5.1 reference guide | Apache Airflow v2.5.1 constraints file | Python 3.10
v2.4.3 | Apache Airflow v2.4.3 reference guide | Apache Airflow v2.4.3 constraints file | Python 3.10

For more information about migrating your self-managed Apache Airflow deployments, or migrating an existing Amazon MWAA environment, including instructions for backing up your metadata database, see the Amazon MWAA Migration Guide.

Use cases
When should I use AWS Step Functions vs. Amazon MWAA?
1. You can use Step Functions to process individual customer orders, since Step Functions can scale to meet demand for one order or one million orders.
2. If you're running an overnight workflow that processes the previous day's orders, you can use Step Functions or Amazon MWAA. Amazon MWAA gives you an open-source option to abstract the workflow from the AWS resources you're using.

Environment specifications
How much task storage is available to each environment?
The task storage is limited to 20 GB, as specified by the Amazon ECS Fargate 1.4 platform version. The amount of RAM is determined by the environment class you specify. For more information about environment classes, see Configuring the Amazon MWAA environment class.
What is the default operating system used for Amazon MWAA environments?
Amazon MWAA environments are created on instances running Amazon Linux 2 for versions 2.6 and older, and on instances running Amazon Linux 2023 for versions 2.7 and newer.
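If you want to confirm which operating system and how much scratch storage your environment's workers actually have, one option is to run a short diagnostic DAG and read its output in the task logs. The following is a minimal sketch, assuming an Apache Airflow v2 environment; the DAG ID and task ID are arbitrary names chosen for this example, and this is not an official Amazon MWAA sample.

from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

# Prints the worker operating system release and the free space under /tmp.
# The output appears in the CloudWatch task logs for this task.
with DAG(
    dag_id="worker_diagnostics",   # hypothetical DAG name
    schedule_interval=None,        # trigger manually
    catchup=False,
    start_date=datetime(2024, 1, 1),
) as dag:
    print_worker_info = BashOperator(
        task_id="print_worker_info",
        bash_command="cat /etc/os-release && df -h /tmp",
    )

Because the command runs on a worker's Fargate container, the reported storage reflects that container only; subsequent tasks might run on a different container.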
Can I use a custom image for my Amazon MWAA environment?
Custom images are not supported. Amazon MWAA uses images that are built on the Amazon Linux AMI. Amazon MWAA installs additional requirements by running pip3 install -r requirements.txt for the requirements file you add to the Amazon S3 bucket for the environment.
Is Amazon MWAA HIPAA compliant?
Amazon MWAA is Health Insurance Portability and Accountability Act (HIPAA) eligible. If you have a HIPAA Business Associate Addendum (BAA) in place with AWS, you can use Amazon MWAA for workflows handling Protected Health Information (PHI) on environments created on, or after, November 14th, 2022.
Does Amazon MWAA support Spot Instances?
Amazon MWAA does not currently support Amazon EC2 Spot Instance types for Apache Airflow. However, an Amazon MWAA environment can trigger Spot Instances on, for example, Amazon EMR and Amazon EC2.
Does Amazon MWAA support a custom domain?
To be able to use a custom domain for your Amazon MWAA hostname, do one of the following:
• For Amazon MWAA deployments
amazon-mwaa-user-guide-136 | amazon-mwaa-user-guide.pdf | 136 | with public web server access, you can use Amazon CloudFront with Lambda@Edge to direct traffic to your environment, and map a custom domain name to CloudFront. For more information and an example of setting up a custom domain for a public environment, see the Amazon MWAA custom domain for public web server sample in the Amazon MWAA examples GitHub repository. • For Amazon MWAA deployments with private web server access, see the section called “Setting up a custom domain”. Can I SSH into my environment? While SSH is not supported on a Amazon MWAA environment, it's possible to use a DAG to run bash commands using the BashOperator. For example: from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago with DAG(dag_id="any_bash_command_dag", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: cli_command = BashOperator( task_id="bash_command", bash_command="{{ dag_run.conf['command'] }}" ) To trigger the DAG in the Apache Airflow UI, use: { "command" : "your bash command"} Does Amazon MWAA support Spot Instances? 474 Amazon Managed Workflows for Apache Airflow User Guide Why is a self-referencing rule required on the VPC security group? By creating a self-referencing rule, you're restricting the source to the same security group in the VPC, and it's not open to all networks. To learn more, see the section called “Security in your VPC”. Can I hide environments from different groups in IAM? You can limit access by specifying an environment name in AWS Identity and Access Management, however, visibility filtering isn't available in the AWS console—if a user can see one environment, they can see all environments. Can I store temporary data on the Apache Airflow Worker? Your Apache Airflow Operators can store temporary data on the Workers. Apache Airflow Workers can access temporary files in the /tmp on the Fargate containers for your environment. Note Total task storage is limited to 20 GB, according to Amazon ECS Fargate 1.4. There's no guarantee that subsequent tasks will run on the same Fargate container instance, which might use a different /tmp folder. Can I specify more than 25 Apache Airflow Workers? Yes. Although you can specify up to 25 Apache Airflow workers on the Amazon MWAA console, you can configure up to 50 on an environment by requesting a quota increase. For more information, see Requesting a quota increase. Does Amazon MWAA support shared Amazon VPCs or shared subnets? Amazon MWAA does not support shared Amazon VPCs or shared subnets. The Amazon VPC you select when you create an environment should be owned by the account that is attempting to create the environment. However, you can route traffic from an Amazon VPC in the Amazon MWAA account to a shared VPC. For more information, and to see an example of routing traffic to a shared Amazon VPC, see Centralized outbound routing to the internet in the Amazon VPC Transit Gateways Guide. Self-referencing rule 475 Amazon Managed Workflows for Apache Airflow User Guide Can I create or integrate custom Amazon SQS queues to manage task execution and workflow orchestration in Apache Airflow? No, you cannot create, modify, or use custom Amazon SQS queues within Amazon MWAA. This is because Amazon MWAA automatically provisions and manages its own Amazon SQS queue for each Amazon MWAA environment. Metrics What metrics are used to determine whether to scale Workers? 
Amazon MWAA monitors the QueuedTasks and RunningTasks metrics in CloudWatch to determine whether to scale Apache Airflow Workers on your environment. To learn more, see Monitoring and metrics.
Can I create custom metrics in CloudWatch?
Not on the CloudWatch console. However, you can create a DAG that writes custom metrics in CloudWatch. For more information, see the section called “Using a DAG to write custom metrics”. A minimal sketch of this approach follows.
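The sketch below publishes a single custom metric from a task by calling CloudWatch directly with boto3. It is illustrative only and is not the documented sample: the namespace, metric name, and DAG ID are hypothetical, and it assumes your environment's execution role allows the cloudwatch:PutMetricData action.

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime

def publish_metric():
    # Publish one data point to a custom CloudWatch namespace.
    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_data(
        Namespace="MyMWAAEnvironment",          # hypothetical namespace
        MetricData=[
            {
                "MetricName": "RowsProcessed",  # hypothetical metric name
                "Value": 123,                   # replace with a value your task computes
                "Unit": "Count",
            }
        ],
    )

with DAG(
    dag_id="custom_metric_example",             # hypothetical DAG name
    schedule_interval=None,
    catchup=False,
    start_date=datetime(2024, 1, 1),
) as dag:
    publish = PythonOperator(
        task_id="publish_metric",
        python_callable=publish_metric,
    )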
DAGs, Operators, Connections, and other questions
Can I use the PythonVirtualenvOperator?
The PythonVirtualenvOperator is not explicitly supported on Amazon MWAA, but you can create a custom plugin that uses the PythonVirtualenvOperator. For sample code, see the section called “Custom plugin to patch PythonVirtualenvOperator”.
How long does it take Amazon MWAA to recognize a new DAG file?
DAGs are periodically synchronized from the Amazon S3 bucket to your environment. If you add a new DAG file, it takes about 300 seconds for Amazon MWAA to start using the new file. If you update an existing DAG, it takes Amazon MWAA about 30 seconds to recognize your updates. These values, 300 seconds for new DAGs and 30 seconds for updates to existing DAGs, correspond to the Apache Airflow configuration options dag_dir_list_interval and min_file_process_interval, respectively.
Why is my DAG file not picked up by Apache Airflow?
The following are possible solutions for this issue:
1. Check that your execution role has sufficient permissions to your Amazon S3 bucket. To learn more, see Amazon MWAA execution role.
2. Check that the Amazon S3 bucket has Block Public Access configured, and Versioning enabled. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
3. Verify the DAG file itself. For example, be sure that each DAG has a unique DAG ID.
Can I remove a plugins.zip or requirements.txt from an environment?
Currently, there is no way to remove a plugins.zip or requirements.txt from an environment once they've been added, but we're working on the issue. In the interim, a workaround is to point to an empty text or zip file, respectively. To learn more, see Deleting files on Amazon S3.
Why don't I see my plugins in the Apache Airflow v2.0.2 Admin Plugins menu?
For security reasons, the Apache Airflow web server on Amazon MWAA has limited network egress, and does not install plugins or Python dependencies directly on the web server for version 2.0.2 environments. The plugin that's shown allows Amazon MWAA to authenticate your Apache Airflow users in AWS Identity and Access Management (IAM). To be able to install plugins and Python dependencies directly on the web server, we recommend creating a new environment with Apache Airflow v2.2 and above. Amazon MWAA installs Python dependencies and custom plugins directly on the web server for Apache Airflow v2.2 and above.
Can I use AWS Database Migration Service (DMS) Operators?
Amazon MWAA supports DMS Operators. However, these operators cannot be used to perform actions on the Amazon Aurora PostgreSQL metadata database associated with an Amazon MWAA environment.
When I access the Airflow REST API using the AWS credentials, can I increase the throttling limit to more than 10 transactions per second (TPS)?
Yes, you can. To increase the throttling limit, contact AWS Customer Support.

Troubleshooting Amazon Managed Workflows for Apache Airflow
This chapter describes common issues and errors you may encounter when using Apache Airflow on Amazon Managed Workflows for Apache Airflow and recommended steps to resolve these errors.
Contents
• Troubleshooting: DAGs, Operators, Connections, and other issues in Apache Airflow v2
• Connections
• I can't connect to Secrets Manager
• How do I configure secretsmanager:ResourceTag/<tag-key> secrets manager conditions or a resource restriction in my execution role policy?
• I can't connect to Snowflake • I can't see my connection in the Airflow UI • Web server • I see a 5xx error accessing the web server • I see a 'The scheduler does not appear to be running' error • Tasks • I see my tasks stuck or not completing • CLI • I see a '503' error when triggering a DAG in the CLI • Why does the dags backfill Apache Airflow CLI command fail? Is there a workaround? • Operators • I received a PermissionError: [Errno 13] Permission denied error using the S3Transform operator • Troubleshooting: DAGs, Operators, Connections, and other issues in Apache Airflow v1 • Updating requirements.txt • Adding apache-airflow-providers-amazon causes my environment to fail • Broken DAG • I received a 'Broken DAG' error when using Amazon DynamoDB operators 479 Amazon Managed Workflows for Apache Airflow User Guide • I received 'Broken DAG: No module named psycopg2' error • I received a 'Broken DAG' error when using the Slack operators • I received various errors installing Google/GCP/BigQuery • I received 'Broken DAG: No module named Cython' error • Operators • I received an error using the BigQuery operator • Connections • I can't connect to Snowflake • I can't connect to Secrets Manager • I can't connect to my MySQL server on '<DB-identifier-name>.cluster- id.<region>.rds.amazonaws.com' • Web server • I'm using the BigQueryOperator and it's causing my web server to crash • I see a 5xx error accessing the web server • I see a 'The scheduler does not appear to be running' error • Tasks • I see my tasks stuck or not completing • CLI • I see a '503' error when triggering a DAG |
amazon-mwaa-user-guide-138 | amazon-mwaa-user-guide.pdf | 138 | error • Operators • I received an error using the BigQuery operator • Connections • I can't connect to Snowflake • I can't connect to Secrets Manager • I can't connect to my MySQL server on '<DB-identifier-name>.cluster- id.<region>.rds.amazonaws.com' • Web server • I'm using the BigQueryOperator and it's causing my web server to crash • I see a 5xx error accessing the web server • I see a 'The scheduler does not appear to be running' error • Tasks • I see my tasks stuck or not completing • CLI • I see a '503' error when triggering a DAG in the CLI • Troubleshooting: Creating and updating an Amazon MWAA environment • Updating requirements.txt • I specified a new version of my requirements.txt and it's taking more than 20 minutes to update my environment • Plugins • Does Amazon MWAA support implementing custom UI? • I am able to implement custom UI changes on the Amazon MWAA local runner via plugins, yet when I try to do the same on Amazon MWAA, I do not see my changes nor any errors. Why is this happening? • Create bucket • I can't select the option for S3 Block Public Access settings 480 • Create environment Amazon Managed Workflows for Apache Airflow User Guide • I tried to create an environment and it's stuck in the "Creating" state • I tried to create an environment but it shows the status as "Create failed" • I tried to select a VPC and received a "Network Failure" error • I tried to create an environment and received a service, partition, or resource "must be passed" error • I tried to create an environment and it shows the status as "Available" but when I try to access the Airflow UI an "Empty Reply from Server" or "502 Bad Gateway" error is shown • I tried to create an environment and my user name is a bunch of random character names • Update environment • I tried changing the environment class but the update failed • Access environment • I can't access the Apache Airflow UI • Troubleshooting: CloudWatch Logs and CloudTrail errors • Logs • I can't see my task logs, or I received a 'Reading remote log from Cloudwatch log_group' error • Tasks are failing without any logs • I see a 'ResourceAlreadyExistsException' error in CloudTrail • I see an 'Invalid request' error in CloudTrail • I see a 'Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory' in Apache Airflow logs • I see psycopg2 'server closed the connection unexpectedly' in my Scheduler logs • I see 'Executor reports task instance %s finished (%s) although the task says its %s' in my DAG processing logs • I see 'Could not read remote logs from log_group: airflow-*{*environmentName}-Task log_stream:* {*DAG_ID}/*{*TASK_ID}/*{*time}/*{*n}.log.' in my task logs 481 Amazon Managed Workflows for Apache Airflow User Guide Troubleshooting: DAGs, Operators, Connections, and other issues in Apache Airflow v2 The topics on this page describe resolutions to Apache Airflow v2 Python dependencies, custom plugins, DAGs, Operators, Connections, tasks, and Web server issues you may encounter on an Amazon Managed Workflows for Apache Airflow environment. Contents • Connections • I can't connect to Secrets Manager • How do I configure secretsmanager:ResourceTag/<tag-key> secrets manager conditions or a resource restriction in my execution role policy? 
• I can't connect to Snowflake • I can't see my connection in the Airflow UI • Web server • I see a 5xx error accessing the web server • I see a 'The scheduler does not appear to be running' error • Tasks • I see my tasks stuck or not completing • CLI • I see a '503' error when triggering a DAG in the CLI • Why does the dags backfill Apache Airflow CLI command fail? Is there a workaround? • Operators • I received a PermissionError: [Errno 13] Permission denied error using the S3Transform operator Connections The following topic describes the errors you may receive when using an Apache Airflow connection, or using another AWS database. I can't connect to Secrets Manager We recommend the following steps: Apache Airflow v2 482 Amazon Managed Workflows for Apache Airflow User Guide 1. Learn how to create secret keys for your Apache Airflow connection and variables in the section called “Configuring Secrets Manager”. 2. 3. Learn how to use the secret key for an Apache Airflow variable (test-variable) in Using a secret key in AWS Secrets Manager for an Apache Airflow variable. Learn how to use the secret key for an Apache Airflow connection (myconn) in Using a secret key in AWS Secrets Manager for an Apache Airflow connection. How do I configure secretsmanager:ResourceTag/<tag-key> secrets manager conditions or a resource restriction in my execution role policy? Note |
amazon-mwaa-user-guide-139 | amazon-mwaa-user-guide.pdf | 139 | for Apache Airflow User Guide 1. Learn how to create secret keys for your Apache Airflow connection and variables in the section called “Configuring Secrets Manager”. 2. 3. Learn how to use the secret key for an Apache Airflow variable (test-variable) in Using a secret key in AWS Secrets Manager for an Apache Airflow variable. Learn how to use the secret key for an Apache Airflow connection (myconn) in Using a secret key in AWS Secrets Manager for an Apache Airflow connection. How do I configure secretsmanager:ResourceTag/<tag-key> secrets manager conditions or a resource restriction in my execution role policy? Note Applies to Apache Airflow version 2.0 and earlier. Currently, you cannot limit access to Secrets Manager secrets by using condition keys or other resource restrictions in your environment's execution role, due to a known issue in Apache Airflow. I can't connect to Snowflake We recommend the following steps: 1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub. 2. Add the following entries to the requirements.txt for your environment. apache-airflow-providers-snowflake==1.3.0 3. Add the following imports to your DAG: from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator Ensure the Apache Airflow connection object includes the following key-value pairs: 1. Conn Id: snowflake_conn 2. Conn Type: Snowflake Connections 483 Amazon Managed Workflows for Apache Airflow User Guide 3. Host: <my account>.<my region if not us-west-2>.snowflakecomputing.com 4. Schema: <my schema> 5. Login: <my user name> 6. Password: ******** 7. Port: <port, if any> 8. Extra: { "account": "<my account>", "warehouse": "<my warehouse>", "database": "<my database>", "region": "<my region if not using us-west-2 otherwise omit this line>" } For example: >>> import json >>> from airflow.models.connection import Connection >>> myconn = Connection( ... conn_id='snowflake_conn', ... conn_type='Snowflake', ... host='YOUR_ACCOUNT.YOUR_REGION.snowflakecomputing.com', ... schema='YOUR_SCHEMA' ... login='YOUR_USERNAME', ... password='YOUR_PASSWORD', ... port='YOUR_PORT' ... extra=json.dumps(dict(account='YOUR_ACCOUNT', warehouse='YOUR_WAREHOUSE', database='YOUR_DB_OPTION', region='YOUR_REGION')), ... ) I can't see my connection in the Airflow UI Apache Airflow provides connection templates in the Apache Airflow UI. It uses this to generate the connection URI string, regardless of the connection type. If a connection template is not available in the Apache Airflow UI, an alternate connection template can be used to generate a connection URI string, such as using the HTTP connection template. We recommend the following steps: Connections 484 Amazon Managed Workflows for Apache Airflow User Guide 1. View the connection types Amazon MWAA's providing in the Apache Airflow UI at Apache Airflow provider packages installed on Amazon MWAA environments. 2. View the commands to create an Apache Airflow connection in the CLI at Apache Airflow CLI command reference. 3. Learn how to use connection templates in the Apache Airflow UI interchangeably for connection types that aren't available in the Apache Airflow UI on Amazon MWAA at Overview of connection types. Web server The following topic describes the errors you may receive for your Apache Airflow Web server on Amazon MWAA. I see a 5xx error accessing the web server We recommend the following steps: 1. 
Check Apache Airflow configuration options. Verify that the key-value pairs you specified as an Apache Airflow configuration option, such as AWS Secrets Manager, were configured correctly. To learn more, see the section called “I can't connect to Secrets Manager”.
2. Check the requirements.txt. Verify the Airflow "extras" package and other libraries listed in your requirements.txt are compatible with your Apache Airflow version.
3. Explore ways to specify Python dependencies in a requirements.txt file, see Managing Python dependencies in requirements.txt.
I see a 'The scheduler does not appear to be running' error
If the scheduler doesn't appear to be running, or the last "heart beat" was received several hours ago, your DAGs may not appear in Apache Airflow, and new tasks will not be scheduled. We recommend the following steps:
1. Confirm that your VPC security group allows inbound access to port 5432. This port is needed to connect to the Amazon Aurora PostgreSQL metadata database for your environment. After this rule is added, give Amazon MWAA a few minutes, and the error should disappear. To learn more, see the section called “Security in your VPC”.
Note
• The Aurora PostgreSQL metadatabase is part of the Amazon MWAA service architecture
amazon-mwaa-user-guide-140 | amazon-mwaa-user-guide.pdf | 140 | and is not visible in your AWS account. • Database-related errors are usually a symptom of scheduler failure and not the root cause. 2. If the scheduler is not running, it might be due to a number of factors such as dependency installation failures, or an overloaded scheduler. Confirm that your DAGs, plugins, and requirements are working correctly by viewing the corresponding log groups in CloudWatch Logs. To learn more, see Monitoring and metrics. Tasks The following topic describes the errors you may receive for Apache Airflow tasks in an environment. I see my tasks stuck or not completing If your Apache Airflow tasks are "stuck" or not completing, we recommend the following steps: 1. There may be a large number of DAGs defined. Reduce the number of DAGs and perform an update of the environment (such as changing a log level) to force a reset. a. Airflow parses DAGs whether they are enabled or not. If you're using greater than 50% of your environment's capacity you may start overwhelming the Apache Airflow Scheduler. This leads to large Total Parse Time in CloudWatch Metrics or long DAG processing times in CloudWatch Logs. There are other ways to optimize Apache Airflow configurations which are outside the scope of this guide. b. To learn more about the best practices we recommend to tune the performance of your environment, see the section called “Performance tuning for Apache Airflow”. 2. There may be a large number of tasks in the queue. This often appears as a large—and growing—number of tasks in the "None" state, or as a large number in Queued Tasks and/or Tasks Pending in CloudWatch. This can occur for the following reasons: Tasks 486 Amazon Managed Workflows for Apache Airflow User Guide a. If there are more tasks to run than the environment has the capacity to run, and/or a large number of tasks that were queued before autoscaling has time to detect the tasks and deploy additional Workers. b. c. If there are more tasks to run than an environment has the capacity to run, we recommend reducing the number of tasks that your DAGs run concurrently, and/or increasing the minimum Apache Airflow Workers. If there are a large number of tasks that were queued before autoscaling has had time to detect and deploy additional workers, we recommend staggering task deployment and/or increasing the minimum Apache Airflow Workers. d. You can use the update-environment command in the AWS Command Line Interface (AWS CLI) to change the minimum or maximum number of Workers that run on your environment. aws mwaa update-environment --name MyEnvironmentName --min-workers 2 --max- workers 10 e. To learn more about the best practices we recommend to tune the performance of your environment, see the section called “Performance tuning for Apache Airflow”. 3. If your tasks are stuck in the "running" state, you can also clear the tasks or mark them as succeeded or failed. This allows the autoscaling component for your environment to scale down the number of workers running on your environment. The following image shows an example of a stranded task. • Choose the circle for the stranded task, and then select Clear (as shown). This allows Amazon MWAA to scale down workers; otherwise, Amazon MWAA can't determine which DAGs are enabled or disabled, and can't scale down, if there are still queued tasks. Tasks 487 Amazon Managed Workflows for Apache Airflow User Guide 4. 
Learn more about the Apache Airflow task lifecycle at Concepts in the Apache Airflow reference guide. CLI The following topic describes the errors you may receive when running Airflow CLI commands in the AWS Command Line Interface. I see a '503' error when triggering a DAG in the CLI The Airflow CLI runs on the Apache Airflow Web server, which has limited concurrency. Typically a maximum of 4 CLI commands can run simultaneously. CLI 488 Amazon Managed Workflows for Apache Airflow User Guide Why does the dags backfill Apache Airflow CLI command fail? Is there a workaround? Note The following applies only to Apache Airflow v2.0.2 environments. The backfill command, like other Apache Airflow CLI commands, parses all DAGs locally before any DAGs are processed, regardless of which DAG the CLI operation applies to. In Amazon MWAA environments using Apache Airflow v2.0.2, because plugins and requirements are not yet installed on the web server by the time the CLI command runs, the parsing operation fails, and the backfill operation is not invoked. If you did not have any requirements nor plugins in your environment, the backfill operation would succeed. In order to be able to run the backfill CLI command, we recommend invoking it in a bash operator. In a bash operator, backfill is initiated from the worker, allowing the DAGs to parse successfully as all necessary requirements and plguins are available |
amazon-mwaa-user-guide-141 | amazon-mwaa-user-guide.pdf | 141 | applies to. In Amazon MWAA environments using Apache Airflow v2.0.2, because plugins and requirements are not yet installed on the web server by the time the CLI command runs, the parsing operation fails, and the backfill operation is not invoked. If you did not have any requirements nor plugins in your environment, the backfill operation would succeed. In order to be able to run the backfill CLI command, we recommend invoking it in a bash operator. In a bash operator, backfill is initiated from the worker, allowing the DAGs to parse successfully as all necessary requirements and plguins are available and installed. The following example shows how you can create a DAG with a BashOperator to run backfill. from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago with DAG(dag_id="backfill_dag", schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: cli_command = BashOperator( task_id="bash_command", bash_command="airflow dags backfill my_dag_id" ) Operators The following topic describes the errors you may receive when using Operators. I received a PermissionError: [Errno 13] Permission denied error using the S3Transform operator We recommend the following steps if you're trying to run a shell script with the S3Transform operator and you're receiving a PermissionError: [Errno 13] Permission denied Operators 489 Amazon Managed Workflows for Apache Airflow User Guide error. The following steps assume you have an existing plugins.zip file. If you're creating a new plugins.zip, see Installing custom plugins. 1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub. 2. Create your "transform" script. #!/bin/bash cp $1 $2 3. (optional) macOS and Linux users may need to run the following command to ensure the script is executable. chmod 777 transform_test.sh 4. Add the script to your plugins.zip. zip plugins.zip transform_test.sh 5. 6. Follow the steps in Upload the plugins.zip to Amazon S3. Follow the steps in Specifying the plugins.zip version on the Amazon MWAA console. 7. Create the following DAG. from airflow import DAG from airflow.providers.amazon.aws.operators.s3_file_transform import S3FileTransformOperator from airflow.utils.dates import days_ago import os DAG_ID = os.path.basename(__file__).replace(".py", "") with DAG (dag_id=DAG_ID, schedule_interval=None, catchup=False, start_date=days_ago(1)) as dag: file_transform = S3FileTransformOperator( task_id='file_transform', transform_script='/usr/local/airflow/plugins/transform_test.sh', source_s3_key='s3://YOUR_S3_BUCKET/files/input.txt', dest_s3_key='s3://YOUR_S3_BUCKET/files/output.txt' ) 8. Follow the steps in Uploading DAG code to Amazon S3. Operators 490 Amazon Managed Workflows for Apache Airflow User Guide Troubleshooting: DAGs, Operators, Connections, and other issues in Apache Airflow v1 The topics on this page contains resolutions to Apache Airflow v1.10.12 Python dependencies, custom plugins, DAGs, Operators, Connections, tasks, and Web server issues you may encounter on an Amazon Managed Workflows for Apache Airflow environment. 
Contents • Updating requirements.txt • Adding apache-airflow-providers-amazon causes my environment to fail • Broken DAG • I received a 'Broken DAG' error when using Amazon DynamoDB operators • I received 'Broken DAG: No module named psycopg2' error • I received a 'Broken DAG' error when using the Slack operators • I received various errors installing Google/GCP/BigQuery • I received 'Broken DAG: No module named Cython' error • Operators • I received an error using the BigQuery operator • Connections • I can't connect to Snowflake • I can't connect to Secrets Manager • I can't connect to my MySQL server on '<DB-identifier-name>.cluster- id.<region>.rds.amazonaws.com' • Web server • I'm using the BigQueryOperator and it's causing my web server to crash • I see a 5xx error accessing the web server • I see a 'The scheduler does not appear to be running' error • Tasks • I see my tasks stuck or not completing • CLI Apache Airflow v1 • I see a '503' error when triggering a DAG in the CLI 491 Amazon Managed Workflows for Apache Airflow User Guide Updating requirements.txt The following topic describes the errors you may receive when updating your requirements.txt. Adding apache-airflow-providers-amazon causes my environment to fail apache-airflow-providers-xyz is only compatible with Apache Airflow v2. apache- airflow-backport-providers-xyz is compatible with Apache Airflow 1.10.12. Broken DAG The following topic describes the errors you may receive when running DAGs. I received a 'Broken DAG' error when using Amazon DynamoDB operators We recommend the following steps: 1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub. 2. Add the following package to your requirements.txt. boto 3. Explore ways to specify Python dependencies in a requirements.txt file, see Managing Python dependencies in requirements.txt. I received 'Broken DAG: No module named psycopg2' error We recommend the following steps: 1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub. 2. Add the following to your requirements.txt with your Apache Airflow version. For example: apache-airflow[postgres]==1.10.12 3. Explore ways to specify Python dependencies in a requirements.txt file, see Managing Python dependencies in requirements.txt. Updating requirements.txt 492 Amazon Managed Workflows for Apache Airflow User Guide I received a 'Broken DAG' error when using the Slack operators We recommend the following steps: 1. Test your DAGs, |
custom plugins, and Python dependencies locally using the aws-mwaa-local-runner on GitHub.
2. Add the following package to your requirements.txt and specify your Apache Airflow version. For example:
apache-airflow[slack]==1.10.12
3. Explore ways to specify Python dependencies in a requirements.txt file, see Managing Python dependencies in requirements.txt.
I received various errors installing Google/GCP/BigQuery
Amazon MWAA uses Amazon Linux, which requires specific versions of the Cython and cryptography libraries. We recommend the following steps:
1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local-runner on GitHub.
2. Add the following packages to your requirements.txt.
grpcio==1.27.2
cython==0.29.21
pandas-gbq==0.13.3
cryptography==3.3.2
apache-airflow-backport-providers-amazon[google]
3. If you're not using backport providers, you can use:
grpcio==1.27.2
cython==0.29.21
pandas-gbq==0.13.3
cryptography==3.3.2
apache-airflow[gcp]==1.10.12
4. Explore ways to specify Python dependencies in a requirements.txt file, see Managing Python dependencies in requirements.txt.
I received 'Broken DAG: No module named Cython' error
Amazon MWAA uses Amazon Linux, which requires a specific version of Cython. We recommend the following steps:
1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local-runner on GitHub.
2. Add the following package to your requirements.txt.
cython==0.29.21
3. Cython libraries have various required pip dependency versions. For example, using awswrangler==2.4.0 requires pyarrow<3.1.0,>=2.0.0, so pip3 tries to install pyarrow==3.0.0, which causes a Broken DAG error. We recommend specifying the oldest acceptable version explicitly. For example, if you specify the minimum value pyarrow==2.0.0 before awswrangler==2.4.0, then the error goes away and the requirements.txt installs correctly. The final requirements should look like this:
cython==0.29.21
pyarrow==2.0.0
awswrangler==2.4.0
4. Explore ways to specify Python dependencies in a requirements.txt file, see Managing Python dependencies in requirements.txt.
Operators
The following topic describes the errors you may receive when using Operators.
I received an error using the BigQuery operator
Amazon MWAA does not support operators with UI extensions. We recommend the following steps:
1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local-runner on GitHub.
2. A workaround is to override the extension by adding a line in the DAG to set <operator name>.operator_extra_links = None after importing the problem operators.
For example: Operators 494 Amazon Managed Workflows for Apache Airflow User Guide from airflow.contrib.operators.bigquery_operator import BigQueryOperator BigQueryOperator.operator_extra_links = None 3. You can use this approach for all DAGs by adding the above to a plugin. For an example, see the section called “Custom plugin to patch PythonVirtualenvOperator ”. Connections The following topic describes the errors you may receive when using an Apache Airflow connection, or using another AWS database. I can't connect to Snowflake We recommend the following steps: 1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub. 2. Add the following entries to the requirements.txt for your environment. asn1crypto == 0.24.0 snowflake-connector-python == 1.7.2 3. Add the following imports to your DAG: from airflow.contrib.hooks.snowflake_hook import SnowflakeHook from airflow.contrib.operators.snowflake_operator import SnowflakeOperator Ensure the Apache Airflow connection object includes the following key-value pairs: 1. Conn Id: snowflake_conn 2. Conn Type: Snowflake 3. Host: <my account>.<my region if not us-west-2>.snowflakecomputing.com 4. Schema: <my schema> 5. Login: <my user name> 6. Password: ******** 7. Port: <port, if any> Connections 495 Amazon Managed Workflows for Apache Airflow User Guide 8. Extra: { "account": "<my account>", "warehouse": "<my warehouse>", "database": "<my database>", "region": "<my region if not using us-west-2 otherwise omit this line>" } For example: >>> import json >>> from airflow.models.connection import Connection >>> myconn = Connection( ... conn_id='snowflake_conn', ... conn_type='Snowflake', ... host='YOUR_ACCOUNT.YOUR_REGION.snowflakecomputing.com', ... schema='YOUR_SCHEMA' ... login='YOUR_USERNAME', ... password='YOUR_PASSWORD', ... port='YOUR_PORT' ... extra=json.dumps(dict(account='YOUR_ACCOUNT', warehouse='YOUR_WAREHOUSE', database='YOUR_DB_OPTION', region='YOUR_REGION')), ... ) I can't connect to Secrets Manager We recommend the following steps: 1. Learn how to create secret keys for your Apache Airflow connection and variables in the section called “Configuring Secrets Manager”. 2. 3. Learn how to use the secret key for an Apache Airflow variable (test-variable) in Using a secret key in AWS Secrets Manager for an Apache Airflow variable. Learn how to use the secret key for an Apache Airflow connection (myconn) in Using a secret key in AWS Secrets Manager for an Apache Airflow connection. Connections 496 Amazon Managed Workflows for Apache Airflow User Guide I can't connect to my MySQL |
amazon-mwaa-user-guide-143 | amazon-mwaa-user-guide.pdf | 143 | Secrets Manager We recommend the following steps: 1. Learn how to create secret keys for your Apache Airflow connection and variables in the section called “Configuring Secrets Manager”. 2. 3. Learn how to use the secret key for an Apache Airflow variable (test-variable) in Using a secret key in AWS Secrets Manager for an Apache Airflow variable. Learn how to use the secret key for an Apache Airflow connection (myconn) in Using a secret key in AWS Secrets Manager for an Apache Airflow connection. Connections 496 Amazon Managed Workflows for Apache Airflow User Guide I can't connect to my MySQL server on '<DB-identifier-name>.cluster- id.<region>.rds.amazonaws.com' Amazon MWAA's security group and the RDS security group need an ingress rule to allow traffic to and from one another. We recommend the following steps: 1. Modify the RDS security group to allow all traffic from Amazon MWAA's VPC security group. 2. Modify Amazon MWAA's VPC security group to allow all traffic from the RDS security group. 3. Rerun your tasks again and verify whether the SQL query succeeded by checking Apache Airflow logs in CloudWatch Logs. Web server The following topic describes the errors you may receive for your Apache Airflow Web server on Amazon MWAA. I'm using the BigQueryOperator and it's causing my web server to crash We recommend the following steps: 1. Apache Airflow operators such as the BigQueryOperator and QuboleOperator that contain operator_extra_links could cause your Apache Airflow web server to crash. These operators attempt to load code to your web server, which is not permitted for security reasons. We recommend patching the operators in your DAG by adding the following code after your import statements: BigQueryOperator.operator_extra_links = None 2. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub. I see a 5xx error accessing the web server We recommend the following steps: 1. Check Apache Airflow configuration options. Verify that the key-value pairs you specified as an Apache Airflow configuration option, such as AWS Secrets Manager, were configured correctly. To learn more, see the section called “I can't connect to Secrets Manager”. Web server 497 Amazon Managed Workflows for Apache Airflow User Guide 2. Check the requirements.txt. Verify the Airflow "extras" package and other libraries listed in your requirements.txt are compatible with your Apache Airflow version. 3. Explore ways to specify Python dependencies in a requirements.txt file, see Managing Python dependencies in requirements.txt. I see a 'The scheduler does not appear to be running' error If the scheduler doesn't appear to be running, or the last "heart beat" was received several hours ago, your DAGs may not appear in Apache Airflow, and new tasks will not be scheduled. We recommend the following steps: 1. Confirm that your VPC security group allows inbound access to port 5432. This port is needed to connect to the Amazon Aurora PostgreSQL metadata database for your environment. After this rule is added, give Amazon MWAA a few minutes, and the error should disappear. To learn more, see the section called “Security in your VPC”. Note • The Aurora PostgreSQL metadatabase is part of the Amazon MWAA service architecture and is not visible in your AWS account. • Database-related errors are usually a symptom of scheduler failure and not the root cause. 2. 
If the scheduler is not running, it might be due to a number of factors such as dependency installation failures, or an overloaded scheduler. Confirm that your DAGs, plugins, and requirements are working correctly by viewing the corresponding log groups in CloudWatch Logs. To learn more, see Monitoring and metrics. Tasks The following topic describes the errors you may receive for Apache Airflow tasks in an environment. I see my tasks stuck or not completing If your Apache Airflow tasks are "stuck" or not completing, we recommend the following steps: Tasks 498 Amazon Managed Workflows for Apache Airflow User Guide 1. There may be a large number of DAGs defined. Reduce the number of DAGs and perform an update of the environment (such as changing a log level) to force a reset. a. Airflow parses DAGs whether they are enabled or not. If you're using greater than 50% of your environment's capacity you may start overwhelming the Apache Airflow Scheduler. This leads to large Total Parse Time in CloudWatch Metrics or long DAG processing times in CloudWatch Logs. There are other ways to optimize Apache Airflow configurations which are outside the scope of this guide. b. To learn more about the best practices we recommend to tune the performance of your environment, see the section called “Performance tuning for Apache Airflow”. 2. There may be a large number of tasks in the queue. This often appears as a large—and growing—number of tasks in the "None" state, or as a large number in Queued Tasks and/or Tasks Pending |
amazon-mwaa-user-guide-144 | amazon-mwaa-user-guide.pdf | 144 | Apache Airflow Scheduler. This leads to large Total Parse Time in CloudWatch Metrics or long DAG processing times in CloudWatch Logs. There are other ways to optimize Apache Airflow configurations which are outside the scope of this guide. b. To learn more about the best practices we recommend to tune the performance of your environment, see the section called “Performance tuning for Apache Airflow”. 2. There may be a large number of tasks in the queue. This often appears as a large—and growing—number of tasks in the "None" state, or as a large number in Queued Tasks and/or Tasks Pending in CloudWatch. This can occur for the following reasons: a. If there are more tasks to run than the environment has the capacity to run, and/or a large number of tasks that were queued before autoscaling has time to detect the tasks and deploy additional Workers. b. c. If there are more tasks to run than an environment has the capacity to run, we recommend reducing the number of tasks that your DAGs run concurrently, and/or increasing the minimum Apache Airflow Workers. If there are a large number of tasks that were queued before autoscaling has had time to detect and deploy additional workers, we recommend staggering task deployment and/or increasing the minimum Apache Airflow Workers. d. You can use the update-environment command in the AWS Command Line Interface (AWS CLI) to change the minimum or maximum number of Workers that run on your environment. aws mwaa update-environment --name MyEnvironmentName --min-workers 2 --max- workers 10 e. To learn more about the best practices we recommend to tune the performance of your environment, see the section called “Performance tuning for Apache Airflow”. 3. If your tasks are stuck in the "running" state, you can also clear the tasks or mark them as succeeded or failed. This allows the autoscaling component for your environment to scale down the number of workers running on your environment. The following image shows an example of a stranded task. Tasks 499 Amazon Managed Workflows for Apache Airflow User Guide • Choose the circle for the stranded task, and then select Clear (as shown). This allows Amazon MWAA to scale down workers; otherwise, Amazon MWAA can't determine which DAGs are enabled or disabled, and can't scale down, if there are still queued tasks. 4. Learn more about the Apache Airflow task lifecycle at Concepts in the Apache Airflow reference guide. CLI The following topic describes the errors you may receive when running Airflow CLI commands in the AWS Command Line Interface. CLI 500 Amazon Managed Workflows for Apache Airflow User Guide I see a '503' error when triggering a DAG in the CLI The Airflow CLI runs on the Apache Airflow Web server, which has limited concurrency. Typically a maximum of 4 CLI commands can run simultaneously. Troubleshooting: Creating and updating an Amazon MWAA environment The topics on this page contains errors you may encounter when creating and updating an Amazon Managed Workflows for Apache Airflow environment and how to resolve these errors. Contents • Updating requirements.txt • I specified a new version of my requirements.txt and it's taking more than 20 minutes to update my environment • Plugins • Does Amazon MWAA support implementing custom UI? • I am able to implement custom UI changes on the Amazon MWAA local runner via plugins, yet when I try to do the same on Amazon MWAA, I do not see my changes nor any errors. Why is this happening? 
• Create bucket • I can't select the option for S3 Block Public Access settings • Create environment • I tried to create an environment and it's stuck in the "Creating" state • I tried to create an environment but it shows the status as "Create failed" • I tried to select a VPC and received a "Network Failure" error • I tried to create an environment and received a service, partition, or resource "must be passed" error • I tried to create an environment and it shows the status as "Available" but when I try to access the Airflow UI an "Empty Reply from Server" or "502 Bad Gateway" error is shown • I tried to create an environment and my user name is a bunch of random character names • Update environment • I tried changing the environment class but the update failed Amazon MWAA Create/Update 501 Amazon Managed Workflows for Apache Airflow User Guide • Access environment • I can't access the Apache Airflow UI Updating requirements.txt The following topic describes the errors you may receive when updating your requirements.txt. I specified a new version of my requirements.txt and it's taking more than 20 minutes to update my environment If it takes more than twenty minutes for your environment to install a new version of a |
requirements.txt file, the environment update failed and Amazon MWAA is rolling back to the last stable version of the container image. 1. Check package versions. We recommend always specifying either a specific version (==) or a maximum version (<=) for the Python dependencies in your requirements.txt. 2. Check Apache Airflow logs. If you enabled Apache Airflow logs, verify your log groups were created successfully on the Logs groups page on the CloudWatch console. If you see blank logs, the most common reason is due to missing permissions in your execution role for CloudWatch or Amazon S3 where logs are written. To learn more, see Execution role. 3. Check Apache Airflow configuration options. If you're using Secrets Manager, verify that the key-value pairs you specified as an Apache Airflow configuration option were configured correctly. To learn more, see the section called “Configuring Secrets Manager”. 4. Check VPC network configuration. To learn more, see the section called “Environment stuck”. 5. Check execution role permissions. An execution role is an AWS Identity and Access Management (IAM) role with a permissions policy that grants Amazon MWAA permission to invoke the resources of other AWS services (such as Amazon S3, CloudWatch, Amazon SQS, Amazon ECR) on your behalf. Your Customer managed key or AWS owned key also needs to be permitted access. To learn more, see Execution role. 6. To run a troubleshooting script that checks the Amazon VPC network setup and configuration for your Amazon MWAA environment, see the Verify Environment script in AWS Support Tools on GitHub. Plugins The following topic describes issues you may encounter when configuring or updating Apache Airflow plugins. Does Amazon MWAA support implementing custom UI? Starting with Apache Airflow v2.2.2, Amazon MWAA supports installing plugins on the Apache Airflow web server, and implementing custom UI. If your Amazon MWAA environment is running Apache Airflow v2.0.2 or older, you will not be able to implement custom UI. For more information about version management, and upgrading your existing environments, see Versions. I am able to implement custom UI changes on the Amazon MWAA local runner via plugins, yet when I try to do the same on Amazon MWAA, I do not see my changes nor any errors. Why is this happening? The Amazon MWAA local runner has all the Apache Airflow components bundled into one image, allowing you to apply custom UI plugin changes. On Amazon MWAA, custom UI plugin changes only take effect on Apache Airflow versions that support installing plugins on the web server, as described in the previous answer. Create bucket The following topic describes the errors you may receive when creating an Amazon S3 bucket.
I can't select the option for S3 Block Public Access settings The execution role for your Amazon MWAA environment needs permission to the GetBucketPublicAccessBlock action on the Amazon S3 bucket to verify the bucket blocked public access. We recommend the following steps: 1. Follow the steps to Attach a JSON policy to your execution role. 2. Attach the following JSON policy: { "Effect":"Allow", "Action":[ "s3:GetObject*", "s3:GetBucket*", "s3:List*" ], Plugins 503 Amazon Managed Workflows for Apache Airflow User Guide "Resource":[ "arn:aws:s3:::YOUR_S3_BUCKET_NAME", "arn:aws:s3:::YOUR_S3_BUCKET_NAME/*" ] } Substitute the sample placeholders in YOUR_S3_BUCKET_NAME with your Amazon S3 bucket name, such as my-mwaa-unique-s3-bucket-name. 3. To run a troubleshooting script that checks the Amazon VPC network setup and configuration for your Amazon MWAA environment, see the Verify Environment script in AWS Support Tools on GitHub. Create environment The following topic describes the errors you may receive when creating an environment. I tried to create an environment and it's stuck in the "Creating" state We recommend the following steps: 1. Check VPC network with public routing. If you're using an Amazon VPC with Internet access, verify the following: • That your Amazon VPC is configured to allow network traffic between the different AWS resources used by your Amazon MWAA environment, as defined in the section called “About networking”. For example, your VPC security group must either allow all traffic in a self-referencing rule, or optionally specify the port range for HTTPS port range 443 and a TCP port range 5432. 2. Check VPC network with private routing. If you're using an Amazon VPC without Internet access, verify the following: • That your Amazon VPC is configured to allow network traffic between the different AWS resources |
for your Amazon MWAA environment, as defined in the section called “About networking”. For example, your two private subnets must not have a route table to a NAT gateway (or NAT instance), nor an Internet gateway. 3. To run a troubleshooting script that checks the Amazon VPC network setup and configuration for your Amazon MWAA environment, see the Verify Environment script in AWS Support Tools on GitHub. I tried to create an environment but it shows the status as "Create failed" We recommend the following steps: 1. Check VPC network configuration. To learn more, see the section called “Environment stuck”. 2. Check user permissions. Amazon MWAA performs a dry run against a user's credentials before creating an environment. Your AWS account may not have permission in AWS Identity and Access Management (IAM) to create some of the resources for an environment. For example, if you chose the Private network Apache Airflow access mode, your AWS account must have been granted access by your administrator to the AmazonMWAAFullConsoleAccess access control policy for your environment, which allows your account to create VPC endpoints. 3. Check execution role permissions. An execution role is an AWS Identity and Access Management (IAM) role with a permissions policy that grants Amazon MWAA permission to invoke the resources of other AWS services (such as Amazon S3, CloudWatch, Amazon SQS, Amazon ECR) on your behalf. Your Customer managed key or AWS owned key also needs to be permitted access. To learn more, see Execution role. 4. Check Apache Airflow logs. If you enabled Apache Airflow logs, verify your log groups were created successfully on the Logs groups page on the CloudWatch console. If you see blank logs, the most common reason is due to missing permissions in your execution role for CloudWatch or Amazon S3 where logs are written. To learn more, see Execution role. 5. To run a troubleshooting script that checks the Amazon VPC network setup and configuration for your Amazon MWAA environment, see the Verify Environment script in AWS Support Tools on GitHub. 6. If you are using an Amazon VPC without internet access, ensure that you've created an Amazon S3 gateway endpoint, and granted the minimum required permissions to Amazon ECR to access Amazon S3 (see the example command below). To learn more about creating an Amazon S3 gateway endpoint, see the following: • Creating an Amazon VPC network without internet access • Create the Amazon S3 gateway endpoint in the Amazon Elastic Container Registry User Guide I tried to select a VPC and received a "Network Failure" error We recommend the following steps: • If you see a "Network Failure" error when you try to select an Amazon VPC when creating your environment, turn off any in-browser proxies that are running, and then try again.
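For step 6 above, you can create the Amazon S3 gateway endpoint with the AWS CLI. The following is a minimal sketch rather than a definitive command for your setup: it assumes the us-east-1 Region and placeholder VPC and route table IDs (vpc-0123abcd, rtb-0123abcd) that you must replace with the values for your environment's Amazon VPC.
# Create a gateway endpoint for Amazon S3 and attach it to the route table
# used by the environment's private subnets.
$ aws ec2 create-vpc-endpoint \
    --vpc-id vpc-0123abcd \
    --vpc-endpoint-type Gateway \
    --service-name com.amazonaws.us-east-1.s3 \
    --route-table-ids rtb-0123abcd
After the endpoint is created, verify that the route tables for your private subnets are associated with it, and review the endpoint policy against the guidance in the topics listed in step 6.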
Create environment 505 Amazon Managed Workflows for Apache Airflow User Guide I tried to create an environment and received a service, partition, or resource "must be passed" error We recommend the following steps: • You may be receiving this error because the URI you specified for your Amazon S3 bucket includes a '/' at the end of the URI. We recommend removing the '/' in the path. The value should be in the following format: s3://your-bucket-name I tried to create an environment and it shows the status as "Available" but when I try to access the Airflow UI an "Empty Reply from Server" or "502 Bad Gateway" error is shown We recommend the following steps: 1. Check VPC security group configuration. To learn more, see the section called “Environment stuck”. 2. Confirm that any Apache Airflow packages you listed in the requirements.txt correspond to the Apache Airflow version you're running on Amazon MWAA. To learn more, see Installing Python dependencies. 3. To run a troubleshooting script that checks the Amazon VPC network setup and configuration for your Amazon MWAA environment, see the Verify Environment script in AWS Support Tools on GitHub. I tried to create an environment and my user name is a bunch of random character names • Apache Airflow has a maximum of 64 characters for user names. If your AWS Identity and Access Management (IAM) role exceeds this length, |
a hash algorithm is used to reduce it, while remaining unique. Update environment The following topic describes the errors you may receive when updating an environment. I tried changing the environment class but the update failed If you update your environment to a different environment class (such as changing an mw1.medium to an mw1.small), and the request to update your environment failed, the environment status goes into an UPDATE_FAILED state and the environment is rolled back to, and is billed according to, the previous stable version of an environment. We recommend the following steps: 1. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local-runner on GitHub. 2. To run a troubleshooting script that checks the Amazon VPC network setup and configuration for your Amazon MWAA environment, see the Verify Environment script in AWS Support Tools on GitHub. Access environment The following topic describes the errors you may receive when accessing an environment. I can't access the Apache Airflow UI We recommend the following steps: 1. Check user permissions. You may not have been granted access to a permissions policy that allows you to view the Apache Airflow UI. To learn more, see the section called “Accessing an Amazon MWAA environment”. 2. Check network access. This may be because you selected the Private network access mode. If the URL of your Apache Airflow UI is in the following format 387fbcn-8dh4-9hfj-0dnd-834jhdfb-vpce.c10.us-west-2.airflow.amazonaws.com, it means that you're using private routing for your Apache Airflow Web server. You can either update the Apache Airflow access mode to the Public network access mode, or create a mechanism to access the VPC endpoint for your Apache Airflow Web server. To learn more, see the section called “Managing access to VPC endpoints”. Troubleshooting: CloudWatch Logs and CloudTrail errors The topics on this page contain resolutions to Amazon CloudWatch Logs and AWS CloudTrail errors you may encounter on an Amazon Managed Workflows for Apache Airflow environment.
Contents • Logs • I can't see my task logs, or I received a 'Reading remote log from Cloudwatch log_group' error • Tasks are failing without any logs • I see a 'ResourceAlreadyExistsException' error in CloudTrail • I see an 'Invalid request' error in CloudTrail • I see a 'Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory' in Apache Airflow logs • I see psycopg2 'server closed the connection unexpectedly' in my Scheduler logs • I see 'Executor reports task instance %s finished (%s) although the task says its %s' in my DAG processing logs • I see 'Could not read remote logs from log_group: airflow-*{*environmentName}-Task log_stream:* {*DAG_ID}/*{*TASK_ID}/*{*time}/*{*n}.log.' in my task logs Logs The following topic describes the errors you may receive when viewing Apache Airflow logs. I can't see my task logs, or I received a 'Reading remote log from Cloudwatch log_group' error Amazon MWAA has configured Apache Airflow to read and write logs directly from and to Amazon CloudWatch Logs. If a worker fails to start a task, or fails to write any logs, you will see the error: *** Reading remote log from Cloudwatch log_group: airflow-environmentName-Task log_stream: DAG_ID/TASK_ID/timestamp/n.log.Could not read remote logs from log_group: airflow-environmentName-Task log_stream: DAG_ID/TASK_ID/time/n.log. • We recommend the following steps: a. Verify that you have enabled task logs at the INFO level for your environment. For more information, see Viewing Airflow logs in Amazon CloudWatch. b. Verify that the environment execution role has the correct permission policies. c. Verify that your operator or task is working correctly, has sufficient resources to parse the DAG, and has the appropriate Python libraries to load. To verify whether you have the correct dependencies, try eliminating imports until you find the one that is causing the issue. We recommend testing your Python dependencies using the Amazon MWAA local-runner tool. Tasks are failing without any logs If tasks are failing in a workflow and you can't locate any logs for the failed tasks, check if you are setting the queue
parameter in your default arguments, as shown in the following. from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago # Setting queue argument to default. default_args = { "start_date": days_ago(1), "queue": "default" } with DAG(dag_id="any_command_dag", schedule_interval=None, catchup=False, default_args=default_args) as dag: cli_command = BashOperator( task_id="bash_command", bash_command="{{ dag_run.conf['command'] }}" ) To resolve the issue, remove queue from your code, and invoke the DAG again. I see a 'ResourceAlreadyExistsException' error in CloudTrail "errorCode": "ResourceAlreadyExistsException", "errorMessage": "The specified log stream already exists", "requestParameters": { "logGroupName": "airflow-MyAirflowEnvironment-DAGProcessing", "logStreamName": "scheduler_cross-account-eks.py.log" } Certain Python requirements such as apache-airflow-backport-providers-amazon roll back the watchtower library that Amazon MWAA uses to communicate with CloudWatch to an older version. We recommend the following steps: • Add the following library to your requirements.txt watchtower==1.0.6 I see an 'Invalid request' error in CloudTrail Invalid request provided: Provided role does not have sufficient permissions for s3 location airflow-xxx-xxx/dags If you're creating an Amazon MWAA environment and an Amazon S3 bucket using the same AWS CloudFormation template, you need to add a DependsOn section within your AWS CloudFormation template. The two resources (MWAA Environment and MWAA Execution Policy) have a dependency in AWS CloudFormation. We recommend the following steps: • Add the following DependsOn statement to your AWS CloudFormation template. ... MaxWorkers: 5 NetworkConfiguration: SecurityGroupIds: - !GetAtt SecurityGroup.GroupId SubnetIds: !Ref subnetIds WebserverAccessMode: PUBLIC_ONLY DependsOn: MwaaExecutionPolicy MwaaExecutionPolicy: Type: AWS::IAM::ManagedPolicy Properties: Roles: - !Ref MwaaExecutionRole PolicyDocument: Version: 2012-10-17 Statement: - Effect: Allow Action: airflow:PublishMetrics Resource: ... For an example, see Quick start tutorial for Amazon Managed Workflows for Apache Airflow. I see a 'Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory' in Apache Airflow logs • We recommend the following steps: • If you're using Apache Airflow v2, add core.lazy_load_plugins : False as an Apache Airflow configuration option. To learn more, see Using configuration options to load plugins in 2.
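If you prefer to set this option without the console, the same Apache Airflow configuration option can be applied with the AWS CLI. The following is a minimal sketch, assuming an existing environment named MyAirflowEnvironment; replace the name with your own, and note that configuration values are passed as strings. If you already use other configuration options, include them in the map as well, since the map you pass may replace any options you set previously.
# Set core.lazy_load_plugins to False so plugins load when each Airflow process starts.
$ aws mwaa update-environment \
    --name MyAirflowEnvironment \
    --airflow-configuration-options '{"core.lazy_load_plugins": "False"}'
The update can take some time to apply. You can also set the same key-value pair in the Airflow configuration options section of the environment page in the Amazon MWAA console.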
I see psycopg2 'server closed the connection unexpectedly' in my Scheduler logs If you see an error similar to the following, your Apache Airflow Scheduler may have run out of resources. 2021-06-14T10:20:24.581-05:00 sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) server closed the connection unexpectedly 2021-06-14T10:20:24.633-05:00 This probably means the server terminated abnormally 2021-06-14T10:20:24.686-05:00 before or while processing the request. We recommend the following steps: • Consider upgrading to Apache Airflow v2.0.2, which allows you to specify up to 5 Schedulers. I see 'Executor reports task instance %s finished (%s) although the task says its %s' in my DAG processing logs If you see an error similar to the following, your long-running tasks may have reached the task time limit on Amazon MWAA. Amazon MWAA has a limit of 12 hours for any one Airflow task, to prevent tasks from getting stuck in the queue and blocking activities like autoscaling. Executor reports task instance %s finished (%s) although the task says its %s. (Info: %s) Was the task killed externally We recommend the following steps: • Consider breaking up the task into multiple, shorter running tasks. Airflow typically has a model whereby operators are asynchronous. An operator invokes an activity on an external system, and an Apache Airflow Sensor polls to see when it's complete. If a Sensor fails, it can be safely retried without impacting the Operator's functionality. I see 'Could not read remote logs from log_group: airflow-*{*environmentName}-Task log_stream:* {*DAG_ID}/*{*TASK_ID}/*{*time}/*{*n}.log.' in my task logs If you see an error similar to the following, the execution role for your environment may not contain
amazon-mwaa-user-guide-149 | amazon-mwaa-user-guide.pdf | 149 | a permissions policy to create log streams for task logs. Could not read remote logs from log_group: airflow-*{*environmentName}-Task log_stream:* {*DAG_ID}/*{*TASK_ID}/*{*time}/*{*n}.log. We recommend the following steps: • Modify the execution role for your environment using one of the sample policies at the section called “Execution role”. You may have also specified a provider package in your requirements.txt file that is incompatible with your Apache Airflow version. For example, if you're using Apache Airflow v2.0.2, you may have specified a package, such as the apache-airflow-providers-databricks package, which is only compatible with Airflow 2.1+. We recommend the following steps: 1. If you're using Apache Airflow v2.0.2, modify the requirements.txt file and add apache- airflow[databricks]. This installs the correct version of the Databricks package that is compatible with Apache Airflow v2.0.2. 2. Test your DAGs, custom plugins, and Python dependencies locally using the aws-mwaa-local- runner on GitHub. Logs 512 Amazon Managed Workflows for Apache Airflow User Guide Amazon MWAA Document History The following table describes important additions to the Amazon MWAA service documentation, beginning in November 2020. To receive notifications about updates to this documentation, subscribe to the RSS feed. Change Description Date Added a new environment class: mw1.micro Amazon MWAA now provides a new environment class: November 19, 2024 mw1.micro. • the section called “Configur ing the environment class” • the section called “Performance tuning for Apache Airflow” Support for simpler method to access Apache Airflow Amazon MWAA now provides a simplified approach for October 23, 2024 REST API New Apache Airflow version interacting with the Apache Airflow REST API using AWS credentials. • the section called “Using the Apache Airflow REST API” • the section called “Apache Airflow Rest API access” Amazon MWAA now supports Apache Airflow v2.10.1. This update includes informati on on updated provider packages, and details about using Apache Airflow v2.10.1 on Amazon MWAA. September 26, 2024 513 Amazon Managed Workflows for Apache Airflow User Guide • Versions • the section called “Provider packages for Apache Airflow v2.10.1 connectio ns” New Apache Airflow version Amazon MWAA now supports Apache Airflow v2.9.2. This July 9, 2024 update includes informati on on updated provider packages, and details about using Apache Airflow v2.9.2 on Amazon MWAA. • Versions • the section called “Provider packages for Apache Airflow v2.9.2 connections” Amazon MWAA supports configuring a custom web Amazon MWAA supports configuring a custom web server domain names server domain names for June 18, 2024 private environments with no internet access. This update includes the following new topic that describes setting up a new custom domain. • the section called “Setting up a custom domain” 514 Amazon Managed Workflows for Apache Airflow User Guide Amazon MWAA supports web server automatic scaling and Amazon MWAA now supports automatic scaling of web May 16, 2024 the Apache Airflow REST API servers as well as the ability to access and use the Apache Airflow REST API. 
• the section called “Configur ing web server auto scaling” • the section called “Using the Apache Airflow REST API” Improved description of automatic scaling behavior Updated the following topic to reflect the new Amazon May 10, 2024 MWAA automatic scaling behavior when workers pick up new tasks as Fargate workers downscale. • the section called “Configur ing worker auto scaling” Support for larger instance sizes Amazon MWAA now supports two larger instance size April 16, 2024 options for larger workloads : mw1.xlarge , and mw1.2xlarge • the section called “Environment capabilities” 515 Amazon Managed Workflows for Apache Airflow User Guide New Apache Airflow version February 22, 2024 Amazon MWAA now supports Apache Airflow v2.8.1. This update includes informati on on updated provider packages, and details about using Apache Airflow v2.8.1 on Amazon MWAA. • Versions • the section called “Provider packages for Apache Airflow v2.8.1 connections” Support for shared Amazon VPC Amazon MWAA supports cross-account environment November 15, 2023 creation for organizations using Amazon OpenSearch Service to manage Amazon MWAA resources using a central shared Amazon VPC in an owner account. As part of this launch, Amazon MWAA lets you choose to create, and manage, your own Amazon VPC endpoints. • the section called “Managing your own Amazon VPC endpoints” 516 Amazon Managed Workflows for Apache Airflow User Guide New Apache Airflow version New Apache Airflow version November 6, 2023 August 9, 2023 Amazon MWAA now supports Apache Airflow v2.7.2. This update includes informati on on updated provider packages, and details about using Apache Airflow v2.7.2 on Amazon MWAA. • Versions • the section called “Provider packages for Apache Airflow v2.7.2 connections” Amazon MWAA now supports Apache Airflow v2.6.3. This update includes informati on on updated provider packages, and details about using Apache Airflow v2.6.3 on Amazon MWAA, • Versions • the section called “Provider packages for Apache Airflow v2.6.3 connections” Version deprecation informati on Updated topic on version deprecation |
amazon-mwaa-user-guide-150 | amazon-mwaa-user-guide.pdf | 150 | Apache Airflow version New Apache Airflow version November 6, 2023 August 9, 2023 Amazon MWAA now supports Apache Airflow v2.7.2. This update includes informati on on updated provider packages, and details about using Apache Airflow v2.7.2 on Amazon MWAA. • Versions • the section called “Provider packages for Apache Airflow v2.7.2 connections” Amazon MWAA now supports Apache Airflow v2.6.3. This update includes informati on on updated provider packages, and details about using Apache Airflow v2.6.3 on Amazon MWAA, • Versions • the section called “Provider packages for Apache Airflow v2.6.3 connections” Version deprecation informati on Updated topic on version deprecation to include July 31, 2023 deprecation notices and timelines for Apache Airflow v2.0.2 and Apache Airflow v2.2.2. • the section called “Apache Airflow deprecated versions” 517 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases Updated topic June 5, 2023 April 12, 2023 Amazon MWAA supports minor version upgrades. This updates includes the following new topic that describes how to upgrade the environment and make sure your workflow resources are compatible with the version of Apache Airflow you are upgrading to: • the section called “Upgrading the version” Updated customer managed IAM policies that grant a user full console and API access to Amazon MWAA. The update describes why you must provide permission for iam:PassRole in order to allow a user to pass roles to Amazon MWAA. Amazon MWAA uses these permissions to perform actions on a user's behalf. • the section called “Accessin g an Amazon MWAA environment” 518 Amazon Managed Workflows for Apache Airflow User Guide New guidance New Apache Airflow version April 12, 2023 April 11, 2023 Updated topic on configuri ng AWS Secrets Manager as a backend for Amazon MWAA to provide guidance on using lookup patterns. Using lookup patterns narrow the secrets that Apache Airflow searches for and reduce the number of API calls Amazon MWAA makes to Secrets Manager to retrieve connectio ns and variables. This reduces the costs associated with using Secrets Manager as a backend. • Create the Secrets Manager backend as an Apache Airflow configuration option Amazon MWAA now supports Apache Airflow v2.5.1. This update includes informati on on updated provider packages, and details about using Apache Airflow v2.5.1 on Amazon MWAA, • Versions • the section called “Provider packages for Apache Airflow v2.5.1 connections” 519 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases Added a new topic on using a startup script with an Amazon April 3, 2023 MWAA environment. This topic descibes configuring a startup script for an existing environment, using it to install Linux runtimes, and setting environment variables . • the section called “Using a startup script” Updated section on private web server access Updated the following topic on private web server access. February 24, 2023 The update clarifies that, in environments with private web server access, you must use a Python wheel archive (.whl) to package, and install, dependencies. • Private web server access mode 520 Amazon Managed Workflows for Apache Airflow User Guide Added information on deprecated Apache Airflow Updated the Versions topic with new information on how February 17, 2023 versions Amazon MWAA managed deprecating Apache Airflow versions. 
Removed a section about upgrading to newer version of Apache Airflow, and a section that described changes between Apache Airflow v1 and Apache Airflow v2. For more informati on about migrating to a newversion of Apache Airflow, see the Amazon MWAA Migration Guide. • the section called “Apache Airflow deprecated versions” • the section called “Apache Airflow version support and FAQ” 521 Amazon Managed Workflows for Apache Airflow User Guide Fixes in Amazon MWAA container metrics January 20, 2023 Updated the container metrics topic, and removed a set of erroneous metrics that did not exist under the Cluster dimension. Added an additional section that describes how you can evaluate the number of additional workers that an environment is utilizing at a given time by graphing the CPUUtilization or the MemoryUtilization metric for the Additiona lWorker component, and setting the statistics type to Sample Count. • the section called “Evaluati ng the number of additiona l worker and web server containers” 522 Amazon Managed Workflows for Apache Airflow User Guide New Apache Airflow version Amazon MWAA now supports Apache Airflow v2.4.3. This January 5, 2023 update includes informati on on updated provider packages, details about using Apache Airflow v2.4.3 on Amazon MWAA, and consolidated information about which features are supported in each Apache Airflow version on Amazon MWAA. • Versions • the section called “Provider packages for Apache Airflow v2.4.3 connections” Updated topic on service-l inked role Updated information about the service-linked role that November 18, 2022 Amazon MWAA uses to create and manage AWS resources on your behalf, including information about how you can delete the service-linked |
amazon-mwaa-user-guide-151 | amazon-mwaa-user-guide.pdf | 151 | User Guide New Apache Airflow version Amazon MWAA now supports Apache Airflow v2.4.3. This January 5, 2023 update includes informati on on updated provider packages, details about using Apache Airflow v2.4.3 on Amazon MWAA, and consolidated information about which features are supported in each Apache Airflow version on Amazon MWAA. • Versions • the section called “Provider packages for Apache Airflow v2.4.3 connections” Updated topic on service-l inked role Updated information about the service-linked role that November 18, 2022 Amazon MWAA uses to create and manage AWS resources on your behalf, including information about how you can delete the service-linked role when you no longer need it. This includes an updated service-linked role permissio n policy that allows Amazon MWAA to publishe additiona l CloudWatch metrics under the AWS/MWAA namespace. • the section called “Service- linked role” 523 Amazon Managed Workflows for Apache Airflow User Guide New topic on service metrics New topic Updated FAQ entry November 18, 2022 November 18, 2022 November 15, 2022 Added new topic that describes service metrics emitted by Amazon MWAA under the AWS/MWAA namespace. These include Amazon ECS cluster metrics schedulers, workers, and web servers, Amazon SQS metrics for the queues that allow Amazon MWAA to decouple schedulers and workers, as well as Amazon RDS metrics for the metadata database. • the section called “Containe r, queue, and database metrics” Added new guidance on modifying a constraints file to specify new versions of provider packages to use with your Amazon MWAA environment. • the section called “Specifyi ng newer provider packages” Updated information related to Amazon MWAA's HIPAA eligibility. • the section called “HIPAA compliance” 524 Amazon Managed Workflows for Apache Airflow User Guide New topic Added new topic on using October 21, 2022 New sample code New sample code aws:SourceArn and aws:SourceAccount global condition context keys in an Amazon MWAA execution role trust policy, in order to prevent cross-service confused deputy. • the section called “Cross- service confused deputy prevention” Added updated instructi ons and DAG code example that writes custom OS-level metrics to CloudWatch. • the section called “Using a DAG to write custom metrics” Added updated instructi ons and a new AWS Lambda Python code example that retrieves an Apache Airflow CLI token, then invokes a DAG in a specified Amazon MWAA environment. • the section called “Invoking DAGs with Lambda” September 13, 2022 September 12, 2022 525 Amazon Managed Workflows for Apache Airflow User Guide New architectural diagrams New sample code New sample code September 12, 2022 August 16, 2022 August 12, 2022 Added new architectural diagrams that demonstrate an Amazon MWAA environme nt with a public and private web server. • the section called “Apache Airflow access modes” Added updated instructi ons and a new DAG code example that retrieves an Apache Airflow CLI token, then invokes another DAG in a different Amazon MWAA environment. • the section called “Invoking DAGs in different environments” Added updated instructions and new DAG that queries an environment's Aurora PostgreSQL for metadata information, writes the result to CSV files and stores the files in Amazon S3. 
• the section called “Exportin g environment metadata to Amazon S3” 526 Amazon Managed Workflows for Apache Airflow User Guide New sample code New sample code New sample code New sample code August 3, 2022 July 26, 2022 July 15, 2022 June 17, 2022 Added updated instructions and new DAG that refreshes an AWS CodeArtifact token at runtime and stores the result in Amazon S3. • the section called “Refreshi ng an AWS CodeArtifact token at runtime” Added updated instructions and DAG code sample for using the ECSOperator in Amazon MWAA. • the section called “Using the ECSOperator ” Added updated instructions and DAG code sample for using the SSHOperator in Amazon MWAA. • the section called “Using the SSHOperator ” Added new instructions and DAG code sample for using dbt Postgres with Amazon MWAA. • the section called “Using dbt with Amazon MWAA” 527 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases New topics and use cases New guides May 13, 2022 April 19, 2022 March 7, 2022 Added new instructions and DAG code sample for installin g dependencies using Python wheel files for Amazon MWAA environments with public and private access. • Managing dependencies using Python wheels Added new guidance on choosing which Apache Airflow metrics Amazon MWAA sends to CloudWatch. • Choosing which Apache Airflow metrics are reported Amazon MWAA offers a migration guide for migrating Apache Airflow workflows from self-mana ged deployments, as well as existing Amazon MWAA environments. • Amazon MWAA Migration Guide 528 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases New sample code February 18, 2022 February 11, 2022 Added new security best practice for working with Apache Airflow, including |
amazon-mwaa-user-guide-152 | amazon-mwaa-user-guide.pdf | 152 | wheel files for Amazon MWAA environments with public and private access. • Managing dependencies using Python wheels Added new guidance on choosing which Apache Airflow metrics Amazon MWAA sends to CloudWatch. • Choosing which Apache Airflow metrics are reported Amazon MWAA offers a migration guide for migrating Apache Airflow workflows from self-mana ged deployments, as well as existing Amazon MWAA environments. • Amazon MWAA Migration Guide 528 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases New sample code February 18, 2022 February 11, 2022 Added new security best practice for working with Apache Airflow, including a solution for detecting changes to the Apache Airflow user privileges. • the section called “Security best practices in Apache Airflow” Added new code sample for creating timezone-aware DAGs using Pendulum, and clarified how to use a custom plugin to change the timezone in which Apache Airflow logs are created. • the section called “Changing a DAG's timezone” 529 Amazon Managed Workflows for Apache Airflow User Guide Apache Airflow v2.2.2 launch January 27, 2022 Amazon Managed Workflows for Apache Airflow now supports Apache Airflow v2.2.2. Beginning with v2.2, Amazon MWAA will install Python packages and custom plugins directly on the Apache Airflow web server allowing you greater flexibility to manage your environments. For more information, see the following. • Apache Airflow versions on Amazon Managed Workflows for Apache Airflow. • the section called “Provider packages for Apache Airflow v2.2.2 connections”. • Apache Airflow v2.2.2 changelog on the Apache Airflow documentation website. 530 Amazon Managed Workflows for Apache Airflow User Guide New tutorials Fixes December 8, 2021 November 22, 2021 Added a new tutorial that demonstrates creating a new custom Apache Airflow role, and assigning the role to an Apache Airflow user mapped from IAM in order to limit the user's access to a subset of specified DAGs. • the section called “Tutorial: Restricting users to a subset of DAGs” Fixed a best practices recommendation for setting the value of scheduler .min_file_process_ interval in order to optimize CPU usage. Added an IAM policy example granting access to Secrets Manager resources in the execution role. Added troubleshooting topic on using Secrets Manager condition keys. • Performance tuning how the scheduler parses DAGs • Provide Amazon MWAA with permission to access Secrets Manager secret keys • Configuring condition keys in the Amazon MWAA execution role for Secrets Manager 531 Amazon Managed Workflows for Apache Airflow User Guide New sample code Fixes November 1, 2021 October 26, 2021 Added the following new code sample for modifying the time zone in which DAGs are processed using a custom plugin, and new troublesh ooting topic for invoking the dags backfill Apache Airflow CLI command from within a bash operator. • the section called “Changing a DAG's timezone” • Backfill CLI command using a bash operator Fixed issues in the Amazon ECS operator code sample, and clarified the additional permissions required in the Amazon MWAA execution role to allow the environment to access Amazon ECS task log group in CloudWatch Logs. • Amazon ECS operator permissions. 
532 Amazon Managed Workflows for Apache Airflow User Guide New sample code Fixes Now supported October 1, 2021 October 1, 2021 September 24, 2021 Added new code sample that queries the Aurora PostgreSQ L database for information relevant to DAG runs and writes the results to CSV file stored on Amazon S3. • the section called “Exportin g environment metadata to Amazon S3”. Corrected information about how Amazon MWAA automatically syncs new and changed objects from your target Amazon S3 bucket to your schedulers and workers. • How the DAG folder works. Amazon MWAA now supports additional provider packages for Apache Airflow 2.0+. To learn more about supported packages, see the following: • the section called “Provider packages for Apache Airflow v2.0.2 connections”. 533 Amazon Managed Workflows for Apache Airflow User Guide New commands and procedures September 24, 2021 Added additional guidance and AWS CLI command examples for creating an Amazon S3 gateway endpoint when using an Amazon VPC without internet access: • Creating an Amazon VPC network without Internet access. New topics and use cases Added the following changes: September 19, 2021 • Added a new code sample that uses an Amazon Elastic Container Service operator in the section called “Using the ECSOperator ”. • Added new troublesh ooting topics for issues in configuring Apache Airflow plugins in the section called “Plugins”. 534 Amazon Managed Workflows for Apache Airflow User Guide New supported region August 31, 2021 Amazon MWAA is now available in the following regions: • Asia Pacific (Mumbai) - ap- south-1 • Asia Pacific (Seoul) - ap- northeast-2 • Europe (London) - eu- west-2 • Europe (Paris) - eu-west-3 • Canada (Central) - ca-centra l-1 • South America (São Paulo) - sa-east-1 For more information about region availability |
amazon-mwaa-user-guide-153 | amazon-mwaa-user-guide.pdf | 153 | an Amazon Elastic Container Service operator in the section called “Using the ECSOperator ”. • Added new troublesh ooting topics for issues in configuring Apache Airflow plugins in the section called “Plugins”. 534 Amazon Managed Workflows for Apache Airflow User Guide New supported region August 31, 2021 Amazon MWAA is now available in the following regions: • Asia Pacific (Mumbai) - ap- south-1 • Asia Pacific (Seoul) - ap- northeast-2 • Europe (London) - eu- west-2 • Europe (Paris) - eu-west-3 • Canada (Central) - ca-centra l-1 • South America (São Paulo) - sa-east-1 For more information about region availability and service endpoints, see the following: • Amazon MWAA endpoints and quotas in the AWS General Reference. New topics and use cases Added the following changes: August 27, 2021 • Updated the sample policies to allow Amazon MWAA to fetch account-l evel Amazon S3 settings (s3:GetAccountPubli ) in cAccessBlock Amazon MWAA execution role. 535 Amazon Managed Workflows for Apache Airflow User Guide Fixes Added the following changes: August 27, 2021 • Fixed the AWS CloudForm ation template to use a self-referencing inbound rule for the security group in Create the VPC network. • Fixed the AWS CloudForm ation template to use a self-referencing inbound rule for the security group in Quick start tutorial for Amazon Managed Workflows for Apache Airflow. New topics and use cases Added the following changes: August 20, 2021 • Added DAG decorator to the list of what's supported for Apache Airflow v2.0.2 Apache Airflow versions on Amazon Managed Workflows for Apache Airflow. 536 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases Added the following changes: August 13, 2021 • Added celery.sy nc_parallelism use case to Performance tuning for Apache Airflow on Amazon MWAA. • Added service endpoints to quotas page and changed name to Amazon Managed Workflows for Apache Airflow service endpoints and quotas. • Clarified networking prerequisites based on user feedback at Get started with Amazon Managed Workflows for Apache Airflow. • Moved dags list-runs and dags next-exec ution to unsupported Airflow CLI commands in Apache Airflow CLI command reference. 537 Amazon Managed Workflows for Apache Airflow User Guide New sample code Added the following changes: August 13, 2021 • Added bash example to set, get or delete an Apache Airflow v2.0.2 variable in Apache Airflow CLI command reference. • Added Apache Airflow v2.0.2 dependencies and Airflow connection example to Using Amazon MWAA with Amazon RDS for Microsoft SQL Server. Fixes Added the following changes: August 13, 2021 • Fixed the Python code sample based on user feedback at Creating an SSH connection using the SSHOperator . 538 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases Added the following changes: August 6, 2021 • Moved variables set to supported Airflow CLI commands in Apache Airflow CLI command reference. • Added the summary of What's changed in v2.0.2 from the Airflow versions page to Installing Python dependencies based on user feedback. • Added the summary of What's changed in v2.0.2 from the Airflow versions page to Apache Airflow CLI command reference based on user feedback. • Added the summary of What's changed in v2.0.2 from the Airflow versions page to Overview of connection types based on user feedback. 
• Added the summary of What's changed in v2.0.2 from the Airflow versions page to Installing custom plugins based on user feedback. • Added the summary of What's changed in v2.0.2 from the Airflow versions page to Adding or 539 Amazon Managed Workflows for Apache Airflow User Guide updating DAGs based on user feedback. New sample code Added the following changes: August 6, 2021 • Added Apache Airflow v2.0.2 sample code to Using a DAG to import variables in the CLI. • Added Apache Airflow v2.0.2 sample code to Invoking DAGs with a Lambda function. New topics and use cases Added the following changes: July 29, 2021 • Added troubleshooting topic for 'I can't see my connection in the Airflow UI' at Troublesh ooting Amazon Managed Workflows for Apache Airflow. • Added a list of Amazon VPCs Amazon MWAA supports to About networking on Amazon MWAA. 540 Amazon Managed Workflows for Apache Airflow User Guide Fixes Added the following changes: July 29, 2021 • Fixed the Python code sample based on user feedback to print the web login token at Create a Apache Airflow web server access token. • Fixed the Snowflake connection topic based on user feedback to use a single quote for the warehouse parameter at Troubleshooting Amazon Managed Workflows for Apache Airflow. Removed or moved topics Added the following changes: July 23, 2021 • Restructed the existing page to include all monitoring and metrics documentation pages in Monitoring and metrics for Amazon Managed Workflows for Apache Airflow. • Moved Apache Airflow v2 environment metrics in |
amazon-mwaa-user-guide-154 | amazon-mwaa-user-guide.pdf | 154 | following changes: July 29, 2021 • Fixed the Python code sample based on user feedback to print the web login token at Create a Apache Airflow web server access token. • Fixed the Snowflake connection topic based on user feedback to use a single quote for the warehouse parameter at Troubleshooting Amazon Managed Workflows for Apache Airflow. Removed or moved topics Added the following changes: July 23, 2021 • Restructed the existing page to include all monitoring and metrics documentation pages in Monitoring and metrics for Amazon Managed Workflows for Apache Airflow. • Moved Apache Airflow v2 environment metrics in CloudWatch to the monitoring and metrics navigation menu. 541 Amazon Managed Workflows for Apache Airflow User Guide New guides Added the following changes: July 23, 2021 • Created Apache Airflow provider packages installed on Amazon MWAA environments. • Created Monitoring overview on Amazon MWAA. • Created Viewing audit logs in AWS CloudTrail. • Created Viewing Airflow logs in Amazon CloudWatc h. Fixes Added the following changes: July 23, 2021 • Fixed the Python code sample based on user feedback to generate an Airflow connection string in the correct sequence and added the port parameter in Configuring an Apache Airflow connection using a AWS Secrets Manager secret. • Added a step to install an unzip package locally based on user feedback in Creating a custom plugin with Oracle. 542 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases Added the following changes: July 16, 2021 • Added topic for AWS DMS Operators at Amazon MWAA frequently asked questions. • Added troubleshooting topic for a remote logs error to Troublesh ooting Amazon Managed Workflows for Apache Airflow. • Moved variables set to unsupported Airflow CLI commands in Apache Airflow CLI command reference. 543 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases Added the following changes: July 9, 2021 • Added sequential steps to create a requireme nts.txt file based on user feedback at Installing Python dependencies. • Added sequential steps to create a plugins.zip file based on user feedback at Installing custom plugins. • Added cross-reference links throughout the user guide to the API reference guide at Amazon Managed Workflows for Apache Airflow API Reference guide. • Added topic for why plugins aren't shown in the Airflow 2.0 Admin > Plugins menu at Amazon MWAA frequentl y asked questions. New guides Added the following changes: July 9, 2021 • Created Deleting files on Amazon S3. 544 Amazon Managed Workflows for Apache Airflow User Guide New topics and use cases Added the following changes: July 2, 2021 • Added a list of supported values at Using customer managed keys for encryptio n. • Updated and clarified the example for a private repo URL based on user feedback in Managing Python dependencies in requirements.txt. New sample code Added the following changes: July 2, 2021 • Added Apache Airflow v1.10.12 sample code to use a private key in AWS Secrets Manager for an SSH connection at Creating an SSH connection using the SSHOperator . New topics and use cases Added the following changes: June 25, 2021 • Added StartedTaskInstanc es and FinishedTaskInstan ces metrics to Apache Airflow v2 environment metrics in CloudWatch. New sample code Added the following changes: June 25, 2021 • Added Apache Airflow v2.0.2 sample code at Using Amazon MWAA with Amazon EKS. 
New guides (June 25, 2021)
• Created Performance tuning for Apache Airflow on Amazon MWAA.

New topics and use cases (June 18, 2021)
• Added connections add and connections delete to the supported Apache Airflow v2.0.2 CLI commands at Apache Airflow CLI command reference.
• Added that the latest version available in AWS CloudFormation is Apache Airflow v2.0.2 at Quick start tutorial for Amazon Managed Workflows for Apache Airflow.
• Added question for storing temporary data on Apache Airflow Workers to Amazon MWAA frequently asked questions.
• Added topic for the 'Executor reports task instance %s finished' error to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added topic for the 'server closed the connection unexpectedly' log to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added example to run CLI commands on an SSH tunnel to a bastion host to Creating an Apache Airflow CLI token.
• Added topic for randomly-generated user names to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added topic for a 503 error when running a DAG in the CLI to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added topic for custom plugins in Apache Airflow v2.0.2, which need the Airflow configuration option core.lazy_load_plugins : False to load plugins at the start of each Airflow process and override the version's default setting, to Using Apache Airflow configuration options on Amazon MWAA.
• Added Airflow configuration options step for Apache Airflow v2.0.2 plugins sample code at Creating a custom plugin with Apache Hive and Hadoop.
• Added Airflow configuration options step for Apache Airflow v2.0.2 plugins sample code at Creating a custom plugin that generates runtime environment variables.
• Added Airflow configuration options step for Apache Airflow v2.0.2 plugins sample code at Creating a custom plugin for Apache Airflow PythonVirtualenvOperator.
• Added Airflow configuration options step for Apache Airflow v2.0.2 plugins sample code at Creating a custom plugin with Oracle.

New sample code (June 18, 2021)
• Added sample code for an Apache Airflow Snowflake connection at Using a secret key in AWS Secrets Manager for an Apache Airflow Snowflake connection.

New topics and use cases (June 2, 2021)
• Added server-side encryption guidance to Create an Amazon S3 bucket for Amazon MWAA.
• Added the secrets backend for Apache Airflow v2.0.2 to Configuring an Apache Airflow connection using an AWS Secrets Manager secret.
• Added question for Apache Airflow Workers quota increase requests to Amazon MWAA frequently asked questions.
• Added question for which metrics are used to determine whether to scale Apache Airflow Workers to Amazon MWAA frequently asked questions.
• Added question for creating custom metrics in CloudWatch to Amazon MWAA frequently asked questions.
• Added steps to enable private IP addresses for an Amazon S3 VPC interface endpoint for a VPC with private routing in Creating the required VPC service endpoints in an Amazon VPC with private routing.
• Added an option to set up an SSH tunnel using local port forwarding in Tutorial: Configuring private network access using a Linux Bastion Host.

New sample code (June 2, 2021)
• Added sample code for a DAG that queries the Amazon Aurora PostgreSQL metadata database and publishes custom metrics to Amazon CloudWatch at Using a DAG to write custom metrics in CloudWatch.

New guides (June 2, 2021)
• Created a guide on how to use connection templates interchangeably in the Apache Airflow UI in Overview of connection types.

Fixes (June 2, 2021)
• Added Apache Airflow VPC endpoints to the AWS CloudFormation template in Option three: Creating a VPC network without Internet access to Create the VPC network.
Apache Airflow v2.0.2 launch (May 26, 2021)
General availability launch of Apache Airflow v2.0.2.
• Created Apache Airflow versions on Amazon Managed Workflows for Apache Airflow.
• Created Apache Airflow v2 environment metrics in CloudWatch.
• Added version-specific links for Apache Airflow v2.0.2 to Using Apache Airflow configuration options on Amazon MWAA.
• Added Apache Airflow v2.0.2 version-specific guidance to Installing Python dependencies.
• Added Apache Airflow v2.0.2 version-specific guidance to Managing Python dependencies in requirements.txt.
• Added Apache Airflow v2.0.2 sample plugins to Installing custom plugins.
• Added Apache Airflow v2.0.2 sample code to Aurora PostgreSQL database cleanup on an Amazon MWAA environment.
• Added Apache Airflow v2.0.2 sample code to Using a secret key in AWS Secrets Manager for an Apache Airflow connection.
• Added Apache Airflow v2.0.2 sample code to Creating a custom plugin for Apache Airflow PythonVirtualenvOperator.
• Added Apache Airflow v2.0.2 commands to Apache Airflow CLI command reference.
• Added Apache Airflow v2.0.2 scripts to Creating an Apache Airflow CLI token.
• Added a note that Amazon MWAA uses the latest Apache Airflow version by default to Create an Amazon MWAA environment.

New topics and use cases (May 14, 2021)
• Added guidance on troubleshooting Airflow tasks that are stuck or not running to Troubleshooting Amazon Managed Workflows for Apache Airflow.

Fixes (May 12, 2021)
• We've updated the sample plugins code to use the latest Java version in Creating a custom plugin with Apache Hive and Hadoop. Previously, it was os.environ["JAVA_HOME"]="/usr/lib/jvm/jre-1.8.0-openjdk-1.8.0.272.b10-1.amzn2.0.1.x86_64".

Removed or moved topics (May 10, 2021)
• Moved topics in Troubleshooting Amazon Managed Workflows for Apache Airflow to new pages by category.

New topics and use cases (May 10, 2021)
• Added Amazon S3 bucket overview to Working with DAGs on Amazon MWAA.

Removed or moved topics (May 7, 2021)
• Moved Accessing Apache Airflow to the top-level navigation, and added pages for Create an Apache Airflow web server access token, Creating an Apache Airflow CLI token, and Apache Airflow CLI command reference.

New topics and use cases (May 7, 2021)
• Added version-specific links to the Apache Airflow reference guide for all supported and unsupported Airflow CLI commands in Apache Airflow CLI command reference.
• Added version-specific links to the Apache Airflow reference guide for all configuration options in Using Apache Airflow configuration options on Amazon MWAA.
• Added the Amazon MWAA CLI utility to Managing Python dependencies in requirements.txt.

New topics and use cases (April 30, 2021)
• Added flat and nested examples for how to structure a plugins.zip in Installing custom plugins.
• Added the Amazon MWAA CLI utility to the Adding or updating DAGs, Installing custom plugins, and Installing Python dependencies pages.
• Restructured content into overview, upload to Amazon S3, and installing on Amazon MWAA sections based on user feedback in the Installing custom plugins and Installing Python dependencies pages.
• Added an example use case to create and attach required VPC endpoints to an existing Amazon VPC without Internet access in About networking on Amazon MWAA.

New sample code (April 30, 2021)
• Added sample code that uses a secret key in Secrets Manager for an Apache Airflow variable in Using a secret key in AWS Secrets Manager for an Apache Airflow variable.

New guides (April 30, 2021)
• Created Creating the required VPC service endpoints in an Amazon VPC with private routing.

Fixes (April 30, 2021)
• Oops! We've updated core.default_ui_timezone to webserver.default_ui_timezone in Using Apache Airflow configuration options on Amazon MWAA.
New topics and use cases (April 23, 2021)
• Added Windows (PuTTY) steps for SSH tunnel to Tutorial: Configuring private network access using a Linux Bastion Host.
• Added topic for apache-airflow-providers-amazon, which is only compatible with Apache Airflow 2.0, to Troubleshooting Amazon Managed Workflows for Apache Airflow.

New sample code (April 23, 2021)
• Added sample code that uses a secret key in Secrets Manager for an Apache Airflow connection in Using a secret key in AWS Secrets Manager for an Apache Airflow connection.

New guides (April 23, 2021)
• Created About networking on Amazon MWAA.
• Created Security in your VPC on Amazon MWAA.
• Created Managing access to service-specific Amazon VPC endpoints on Amazon MWAA.

New topics and use cases (April 16, 2021)
• Added a new AWS CloudFormation template to create an Amazon VPC network without Internet access in Create the VPC network.
• Added a new tutorial to create an AWS Client VPN in Tutorial: Configuring private network access using an AWS Client VPN.
• Changed the name of the Networking access page to Apache Airflow access modes based on user feedback, and streamlined docs in Apache Airflow access modes.
• Streamlined docs to include only Amazon VPC getting started information and templates based on user feedback in Create the VPC network.
• Added BigQuery operator workaround to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added an Apache Airflow v1.10.12 constraints file best practice to Installing Python dependencies.

New sample code (April 16, 2021)
• Added sample code to create a custom plugin using Oracle in Creating a custom plugin with Oracle.
• Added sample code to create a custom plugin that generates runtime environment variables in Creating a custom plugin that generates runtime environment variables.

New topics and use cases (April 9, 2021)
• Added topic for the self-referencing rule requirement on a VPC security group to Amazon MWAA frequently asked questions.
• Added custom plugins directory and size limits to Installing custom plugins.
• Added requirements directory and size limits to Installing Python dependencies.
• Clarified the Apache Airflow configuration options for foo.user and foo.pass in Managing Python dependencies in requirements.txt.
• Added configuration options overview to Using Apache Airflow configuration options on Amazon MWAA.

New sample code (April 9, 2021)
• Added sample code to create a custom plugin using PythonVirtualenvOperator in Creating a custom plugin for Apache Airflow PythonVirtualenvOperator.
• Added sample code to create a custom plugin with Apache Hive and Hadoop in Creating a custom plugin with Apache Hive and Hadoop.

Fixes (March 31, 2021)
• Oops! We've updated the format for a requirements.txt, and added an example that's compatible with Apache Airflow v1.10.12, in Installing Python dependencies.

New topics and use cases (March 26, 2021)
• Added workaround for removing a requirements.txt or plugins.zip to Amazon MWAA frequently asked questions.
• Added a bash workaround for SSH on an environment to Amazon MWAA frequently asked questions.
• Added topic for the CloudTrail ResourceAlreadyExistsException error to Troubleshooting Amazon Managed Workflows for Apache Airflow.

New topics and use cases (March 19, 2021)
• Added list of AWS services used to Amazon MWAA execution role.
• Added list of AWS services used to Service-linked role for Amazon MWAA.
• Added question for Python 3.7 version for Amazon MWAA to Amazon MWAA frequently asked questions.
• Added question for PythonVirtualenvOperator to Amazon MWAA frequently asked questions.
• Added the troubleshooting script as next steps for all topics related to VPC and environment configuration at Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Clarified the docs that a Linux bastion must be in the same Region as an environment at Tutorial: Configuring private network access using a Linux Bastion Host.

New guides (March 19, 2021)
• Created Apache Airflow connections guide for AWS Secrets Manager at Configuring an Apache Airflow connection using an AWS Secrets Manager secret.
• Created quick start tutorial using an AWS CloudFormation template to create the Amazon VPC infrastructure, Amazon S3 bucket, and Amazon MWAA environment at Quick start tutorial for Amazon Managed Workflows for Apache Airflow.

New topics and use cases (March 12, 2021)
• Added the create Amazon S3 bucket troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added steps to create and attach a JSON policy to Amazon MWAA execution role.

New sample code (March 12, 2021)
• Added sample code to add a configuration when triggering a DAG to Accessing Apache Airflow.

New guides (March 12, 2021)
• Created best practices guide at Managing Python dependencies in requirements.txt.

New topics and use cases (March 5, 2021)
• Added Google/GCP/BigQuery troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added Cython troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added MySQL troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added 5xx web server error troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.

Now supported (March 4, 2021)
• Previously, backend_kwargs was not supported for AWS Secrets Manager and you needed a workaround to override the Secrets Manager function call. Now, backend_kwargs is supported. See the AWS Secrets Manager troubleshooting topic in Troubleshooting Amazon Managed Workflows for Apache Airflow.

Fixes (March 4, 2021)
• Oops! We've updated the size of each environment class to reflect the actual GB in Configuring the Amazon MWAA environment class.

New topics and use cases (February 26, 2021)
• Added private network access using a VPC endpoint policy to Apache Airflow access modes.
• Added additional checks for the creating an environment troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added steps to view logs for requirements.txt to Installing Python dependencies.

New topics and use cases (February 25, 2021)
• Added Apache Hive use case to Installing Python dependencies.
• Clarified the docs that the required dependencies for an Apache Airflow package need to be included in the requirements.txt file at Installing Python dependencies.
• Added Updating requirements.txt troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.

New tutorials (February 22, 2021)
• Added private network tutorial to Tutorial: Configuring private network access using a Linux Bastion Host.

New topics and use cases (February 22, 2021)
• Added private and public network configurations to Apache Airflow access modes.
• Added development group use case and user scenarios to Amazon MWAA execution role.

New sample code (February 22, 2021)
• Added sample Python scripts for web login token and CLI token to Accessing Apache Airflow.
• Added sample code to trigger a DAG in another environment to Code examples for Amazon Managed Workflows for Apache Airflow.
• Added sample code to trigger a DAG using a Lambda function to Invoking DAGs with a Lambda function.

New commands and procedures (February 22, 2021)
• Added step-by-step procedures to all scripts at Accessing Apache Airflow.
New sample code (February 17, 2021)
• Updated curl example for web login token at Accessing Apache Airflow.
• Added sample code to connect to an Amazon RDS Microsoft SQL Server to Using Amazon MWAA with Amazon RDS for Microsoft SQL Server.

New commands and procedures (February 17, 2021)
• Added AWS CLI commands to Working with DAGs on Amazon MWAA pages.
• Apache Airflow doesn't support serialized DAGs in CLI commands. Since the CLI runs on the web server, which doesn't have plugins or requirements for security reasons, any MWAA environments with a plugins.zip or requirements.txt will not support these commands. Moved the Apache Airflow list_dags and backfill commands to unsupported commands at Accessing Apache Airflow.

GitHub launch (February 17, 2021)
User guide docs are now open source on GitHub. Choose "Edit this page on GitHub" on any page.

New topics and use cases (February 12, 2021)
• Added question for Step Functions v. Amazon MWAA use case to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added CLI access policy to Accessing an Amazon MWAA environment.
• Clarified the docs that any supported Apache Airflow configuration option can be specified at Using Apache Airflow configuration options on Amazon MWAA.
• Clarified the docs that if a Fargate container in one availability zone fails, MWAA switches to the other container in a different availability zone at Create the VPC network.

New topics and use cases (February 5, 2021)
• Added Configuring the Amazon MWAA environment class.

Removed or moved topics (February 4, 2021)
• Removed requirement for Amazon S3 bucket name to start with airflow- at Get started with Amazon Managed Workflows for Apache Airflow.
• Moved Accessing an Amazon MWAA environment and Amazon MWAA execution role to Managing access to an Amazon MWAA environment.

Amazon MWAA CloudFormation (February 4, 2021)
Update the parameters to create an environment at Amazon MWAA CloudFormation.
• Remove SubnetList.
• Remove TagList.
• Add NetworkConfiguration.
• Add TagMap.
• Add create environment request examples.

New topics and use cases (January 29, 2021)
• Added example email configuration to Using Apache Airflow configuration options on Amazon MWAA.
• Added PostgresHook troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added AWS Secrets Manager troubleshooting topic to Troubleshooting Amazon Managed Workflows for Apache Airflow.
• Added high performance use case to Configuring Amazon MWAA worker automatic scaling.

Amazon MWAA launch (November 24, 2020)
General availability launch of Amazon Managed Workflows for Apache Airflow.
• User guide documentation
• AWS CloudFormation documentation
Amazon QuickSight: Developer Guide

Copyright © 2025 Amazon Web Services, Inc. and/or its affiliates. All rights reserved.

Amazon's trademarks and trade dress may not be used in connection with any product or service that is not Amazon's, in any manner that is likely to cause confusion among customers, or in any manner that disparages or discredits Amazon. All other trademarks not owned by Amazon are the property of their respective owners, who may or may not be affiliated with, connected to, or sponsored by Amazon.

Table of Contents

Overview
  Are you a first-time Amazon QuickSight user?
Terminology and concepts
Get started
  Prerequisites
  Make API requests with the QuickSight SDKs
  Use CLI skeleton files
    Generate a CLI skeleton file
    Operations that skeleton files are most useful for
  Use the QuickSight Dev Portal
Embed assets with QuickSight
  Improve your Business Intelligence with QuickSight Embedded Analytics
  How Embedded Analytics Can Transform Your Application
  Considerations
  Get started
    Prerequisites
    Choose the right embedding solution
    Create your first embedding application
  Customize embedded assets
    Control the look and feel of embedded assets
    Add interactivity to your embedded content
    Personalization
  Embedding security
    QuickSight manages who sees content
    QuickSight manages where you see content
    QuickSight manages what you see
ARNs in QuickSight
  ARN formats
  QuickSight resource ARNs
  Permissions
    Best practices
  Errors
    Common client errors
    Client errors
    Server errors
Operations
  Account customization operations
    Account settings
    CreateAccountCustomization
    DeleteAccountCustomization
    DescribeAccountCustomization
    UpdateAccountCustomization
  Analysis operations
    Analysis permissions operations
    CreateAnalysis
    DeleteAnalysis
    DescribeAnalysis
    ListAnalyses
    RestoreAnalysis
    SearchAnalyses
    UpdateAnalysis
  Asset bundle operations
    Permissions
    Asset bundle export operations
    Asset bundle import operations
  Dashboard operations
    Dashboard permissions
    CreateDashboard
    DeleteDashboard
    DescribeDashboard
    ListDashboards
    ListDashboardVersions
    SearchDashboards
    UpdateDashboard
    UpdateDashboardPublishedVersion
  Data source operations
    Data source permissions
    CreateDataSource
    DeleteDataSource
    DescribeDataSource
    ListDataSources
    UpdateDataSource
  Dataset operations
    Dataset permissions operations
    CreateDataSet
    DeleteDataSet
    DescribeDataSet
    ListDataSets
    UpdateDataSet
  Folder operations
    Folder membership operations
    Folder permissions operations
    CreateFolder
    DeleteFolder
    DescribeFolder
    ListFolders
    SearchFolders
    UpdateFolder
  Group operations
    Group membership operations
    CreateGroup
    DeleteGroup
    DescribeGroup
    ListGroups
    SearchGroups
    UpdateGroup
  IAM policy assignment operations
    CreateIAMPolicyAssignment
    DeleteIAMPolicyAssignment
    DescribeIAMPolicyAssignment
    ListIAMPolicyAssignments
    ListIAMPolicyAssignmentsForUser
    UpdateIAMPolicyAssignment
  Ingestion operations
    CancelIngestion
    CreateIngestion
    DescribeIngestion
    ListIngestions
  IP and VPC endpoint restriction operations
    DescribeIpRestriction
    UpdateIpRestriction
    VPC endpoints (AWS PrivateLink)
  Key management operations
    Examples
  Namespace operations
    CreateNamespace
    DeleteNamespace
    DescribeNamespace
    ListNamespaces
  Tag operations
    ListTagsForResource
    TagResource
    UntagResource
  Template alias operations
    CreateTemplateAlias
    DeleteTemplateAlias
    DescribeTemplateAlias
    ListTemplateAliases
    UpdateTemplateAlias
  Template operations
    Template permissions
    CreateTemplate
    DeleteTemplate
    DescribeTemplate
    ListTemplates
    ListTemplateVersions
    UpdateTemplate
  Theme operations
    Theme permissions
    CreateTheme
    DeleteTheme
    DescribeTheme
    ListThemes
    ListThemeVersions
    UpdateTheme
  Theme alias operations
    CreateThemeAlias
    DeleteThemeAlias
    DescribeThemeAlias
    ListThemeAliases
    UpdateThemeAlias
  User operations
    DeleteUser
    DeleteUserByPrincipalTitle
    DescribeUser
    ListUserGroups
    ListUsers
    RegisterUser
    UpdateUser
Document history
Overview

QuickSight is a cloud-scale business intelligence (BI) service that you can use to deliver easy-to-understand insights to the people who you work with, wherever they are.

The Amazon QuickSight Developer Guide provides usage examples of API operations for Amazon QuickSight and procedural walkthroughs of common tasks. The guide also provides examples showing how to work with QuickSight using AWS software development kits (SDKs). By using AWS SDKs, you can access Amazon QuickSight from your preferred programming language.
Currently, you can use the Amazon QuickSight API to manage users and groups. In Enterprise Edition, you can also use the API to embed dashboards in your webpage or app. To monitor the calls made to the Amazon QuickSight API for your account, use AWS CloudTrail. CloudTrail can monitor calls made by the AWS Management Console, command line tools, and other services. For more information, see the AWS CloudTrail User Guide.

Following, you can find out how to get started using the Amazon QuickSight API:
• Terminology and concepts
• Get started with the Amazon QuickSight API
• Amazon Resource Names (ARNs) in QuickSight
• Operations

Are you a first-time Amazon QuickSight user?

If you are a first-time user of Amazon QuickSight, we recommend that you begin by reading the following sections in the QuickSight User Guide:
• How QuickSight Works
• Getting Started with Data Analysis in QuickSight
• AWS Security in QuickSight

Terminology and concepts

Following, you can find a list of terms and concepts used to describe Amazon QuickSight development in the Amazon QuickSight Developer Guide.

Anonymous QuickSight user – A temporary QuickSight user identity that virtually belongs to a namespace and that you can use only with embedding. You can use tag-based rules to implement row-level security for such users.

Caller identity – The identity of the IAM user making an API request. The identity of the caller is determined by QuickSight using the signature attached to the request. Through the use of our provided SDK clients, no manual steps are necessary to generate the signature or attach it to the requests. However, you can do it manually if you want to.

Invoker identity – In addition to the caller identity, but not as a replacement for it, you can assume a caller's identity through the IAM AssumeRole API operation when making calls to QuickSight. AWS approves callers through their invoker's identity. This approval means that you can avoid having to explicitly add multiple accounts belonging to the same QuickSight subscription.

Namespace – A logical container that you can use to isolate user pools so that you can organize clients, subsidiaries, teams, and so on.

QuickSight ARN – Amazon Resource Name (ARN). QuickSight resources are identified using their name or ARN. For example, the following are ARNs for a group named MyGroup1, a user named User1, and a dashboard with the ID 1a1ac2b2-3fc3-4b44-5e5d-c6db6778df89.

arn:aws:quicksight:us-east-1:111122223333:group/default/MyGroup1
arn:aws:quicksight:us-east-1:111122223333:user/default/User1
arn:aws:quicksight:us-west-2:111122223333:dashboard/1a1ac2b2-3fc3-4b44-5e5d-c6db6778df89

The following examples show ARNs for a template named MyTemplate and a dashboard named MyDashboard.

• The following is the sample ARN for a template.
arn:aws:quicksight:us-east-1:111122223333:template/MyTemplate
• The following is the sample ARN for a template, referencing a specific version of the template.
arn:aws:quicksight:us-east-1:111122223333:template/MyTemplate/version/10
• The following is the sample ARN for a template alias.
arn:aws:quicksight:us-east-1:111122223333:template/MyTemplate/alias/STAGING
• The following is the sample ARN for a dashboard.
arn:aws:quicksight:us-east-1:111122223333:dashboard/MyDashboard
• The following is the sample ARN for a dashboard, referencing a specific version of the dashboard.
arn:aws:quicksight:us-east-1:111122223333:dashboard/MyDashboard/version/10

Depending on the scenario, you might need to provide an entity's name, ID, or ARN. You can retrieve the ARN if you have the name, using some of the QuickSight API operations.

QuickSight dashboard – An entity that identifies QuickSight reports, created from analyses or templates. You can share QuickSight dashboards. With the right permissions, you can create scheduled email reports from them. The CreateDashboard and DescribeDashboard API operations act on the dashboard entity.

QuickSight template – An entity that encapsulates the metadata required to create an analysis or a dashboard. It abstracts the dataset associated with the analysis by replacing it with placeholders. You can use templates to create dashboards by replacing dataset placeholders with datasets. These datasets need to follow the same schema that was used to create the source analysis and template.

QuickSight user – A QuickSight user identity acted on by your API call. This user isn't identical to the caller identity but might be the one that maps to the user in QuickSight.

Get started with the Amazon QuickSight API

You can manage most aspects of your deployment with the AWS SDKs to access an API that's tailored to the programming language or platform that you're using. For more information, see AWS SDKs. For more information about specific API operations, see the QuickSight API Reference.

Use the topics in this section to get started with the QuickSight API and SDKs.

Topics
• Prerequisites
• Make API requests with the QuickSight SDKs
• Use CLI skeleton files
• Use the QuickSight Dev Portal

Prerequisites

If you plan to access QuickSight through its API, make sure you're familiar with the following:
• JSON
• Web services
• HTTP requests
• One or more programming languages, such as JavaScript, Java, Python, or C#

We recommend visiting the AWS Getting Started Resource Center for a tour of what AWS SDKs and toolkits have to offer. Although you can use a terminal and your favorite text editor, you might benefit from the more visual UI experience you get in an integrated development environment (IDE). We provide a list of IDEs in the AWS Getting Started Resource Center, in the IDE and IDE Toolkits section. This site provides AWS toolkits that you can download for your preferred IDE. Some IDEs also offer tutorials to help you learn more about programming languages.

Before you can call the QuickSight API operations, make sure that you have the quicksight:operation-name permission in an IAM policy attached to your IAM identity. For example, to call list-users, you need the permission quicksight:ListUsers. The same pattern applies to all operations.

If you're not sure what the necessary permission is, you can attempt to make a call. The client then tells you what the missing permission is. You can use an asterisk (*) in the Resource field of your permission policy instead of specifying explicit resources. However, we recommend that you restrict each permission as much as possible. You can restrict user access by specifying or excluding resources in the policy, using their QuickSight Amazon Resource Name (ARN) identifier.

For more information, see the following:
• IAM Policy Examples in the Amazon QuickSight User Guide
• Actions, Resources, and Condition Keys in the IAM User Guide
• IAM JSON Policy Elements in the IAM User Guide

To retrieve the ARN of a user or a group, use the Describe operation on the relevant resource. You can also add conditions in IAM to further restrict access to an API in some scenarios. For instance, when adding User1 to Group1, the main resource is Group1, so you can allow or deny access to certain groups.
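As an illustration, the following statement is a minimal sketch added here for clarity; it is not taken from the AWS documentation. It scopes the CreateGroupMembership action to a single group by naming that group's ARN in the Resource field, reusing the placeholder account ID 111122223333 and group name MyGroup1 from the ARN examples earlier. Substitute your own values.

{
    "Effect": "Allow",
    "Action": "quicksight:CreateGroupMembership",
    "Resource": "arn:aws:quicksight:us-east-1:111122223333:group/default/MyGroup1"
}

A caller with only this statement attached can add members to MyGroup1 in the default namespace, but not to any other group.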
However, you can also add a condition by using the QuickSight IAM key quicksight:UserName to allow or prevent certain users from being added to that group. Following is an example policy. It means that the caller with this policy attached can invoke the CreateGroupMembership operation for any group, if the user name they are adding to the group isn't user1. { "Effect": "Allow", "Action": "quicksight:CreateGroupMembership", "Resource": "arn:aws:quicksight:us-east-1:aws-account-id:group/default/*", "Condition": { "StringNotEquals": { "quicksight:UserName": "user1" } } } AWS CLI The following procedure explains how to interact with QuickSight API operations through the AWS Command Line Interface (AWS CLI). The following instructions have been tested in Bash but should be identical or similar in other command-line environments. Prerequisites 5 Amazon QuickSight Developer Guide To use QuickSight API operations through the AWS CLI 1. 2. Install AWS SDK in your environment. For instructions, see AWS Command line Interface. Set up your AWS CLI identity and AWS Region using the following command and follow-up instructions. Use the credentials for an IAM identity or role that has the proper permissions. aws configure 3. Look at QuickSight SDK help by running the following command. aws quicksight help 4. To get detailed instructions on how to use an API, enter its name followed by help, as follows. aws quicksight list-users help 5. Call an QuickSight API operation. The following example returns a list of QuickSight users in your account. |
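Before you begin the procedure below, it can be useful to confirm which IAM identity your CLI is configured to use, since that identity must carry the QuickSight permissions described earlier. One quick check (assuming the AWS CLI is already installed and configured) is:

aws sts get-caller-identity

The output includes the account ID and caller ARN, which you can compare against the policies attached to that identity.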
amazon-quicksight-dg-004 | amazon-quicksight-dg.pdf | 4 | Install AWS SDK in your environment. For instructions, see AWS Command line Interface. Set up your AWS CLI identity and AWS Region using the following command and follow-up instructions. Use the credentials for an IAM identity or role that has the proper permissions. aws configure 3. Look at QuickSight SDK help by running the following command. aws quicksight help 4. To get detailed instructions on how to use an API, enter its name followed by help, as follows. aws quicksight list-users help 5. Call an QuickSight API operation. The following example returns a list of QuickSight users in your account. aws quicksight list-users \ --aws-account-id aws-account-id \ --namespace default \ --region us-east-1 Java SDK Use the following procedure to set up a Java app that interacts with QuickSight. To set up a Java app that works with QuickSight 1. Create a Java project in your IDE. 2. Import the QuickSight SDK into your new project, for example: AWSQuickSightJavaClient-1.11.x.jar 3. After your IDE indexes the QuickSight SDK, add an import line as follows. import com.amazonaws.services.quicksight.AmazonQuickSight; If your IDE doesn't recognize line this as valid, verify that you imported the SDK. Prerequisites 6 Amazon QuickSight Developer Guide 4. Download and import external dependencies for the QuickSight SDK. Like other AWS SDKs, QuickSight SDK requires external dependencies to perform many of its functions. Make sure to download and import those into the same project. The following dependencies are required: • aws-java-sdk-1.11.402.jar (AWS Java SDK and credentials setup) – To download, see Set up the AWS SDK for Java in the SDK for Java documentation. • commons-logging-1.2.jar – To download, see Download Apache Commons Logging on the Apache Commons website. • jackson-annotations-2.9.6.jar, jackson-core-2.9.6.jar, and jackson- databind-2.9.6.jar – To download, see the Maven repository. • httpclient-4.5.6.jar, httpcore-4.4.10.jar – To download, see the Apache site. • joda-time-2.1.jar – To download, see the MVNrepository Joda Time site. 5. Create an QuickSight client. You can use a default public endpoint that the client can communicate with, or you can reference the endpoint explicitly. There are multiple ways to provide your AWS credentials. In the following example, we provide a direct, simple approach. The following client method is used to make all the API calls that follow. private static AmazonQuickSight getClient() { final AWSCredentialsProvider credsProvider = new AWSCredentialsProvider() { @Override public AWSCredentials getCredentials() { // provide actual IAM access key and secret key here return new BasicAWSCredentials("access-key", "secret-key"); } @Override public void refresh() {} }; return AmazonQuickSightClientBuilder .standard() .withRegion(Regions.US_EAST_1.getName()) .withCredentials(credsProvider) .build(); Prerequisites 7 Amazon QuickSight } Developer Guide 6. Use the client that you just created to list all the users in our QuickSight account. Provide the AWS account ID that you used to subscribe to QuickSight. This ID must match the AWS account ID of the caller's identity. Cross-account calls aren't supported at this time. Also, make sure that the required parameter namespace is set to default. getClient().listUsers(new ListUsersRequest() .withAwsAccountId("relevant_AWS_account_ID") .withNamespace("default")) .getUserList().forEach(user -> { System.out.println(user.getArn()); }); 7. 
See a list of all possible API operations and the request objects they use by choosing the CTRL key and clicking the client object in your IDE view of the QuickSight interface. Or find this list in the com.amazonaws.services.quicksight package in the QuickSight JavaClient .jar file. JavaScript (Node.js) SDK Use the following procedure to interact with QuickSight using Node.js. To work with QuickSight using Node.js 1. Set up your node environment using the following commands: • npm install aws-sdk • npm install aws4 • npm install request • npm install url For information on configuring the Node.js with AWS SDK and setting your credentials, see the AWS SDK for JavaScript Developer Guide for SDK v2. 2. Use the following code example to test your setup. HTTPS is required. The example displays a full listing of QuickSight operations along with their URL request parameters, followed by a list of QuickSight users in your account. Prerequisites 8 Amazon QuickSight Developer Guide const AWS = require('aws-sdk'); const https = require('https'); var quicksight = new AWS.Service({ apiConfig: require('./quicksight-2018-04-01.min.json'), region: 'us-east-1', }); console.log(quicksight.config.apiConfig.operations); quicksight.listUsers({ // Enter your actual AWS account ID 'AwsAccountId': 'relevant_AWS_account_ID', 'Namespace': 'default', }, function(err, data) { console.log('---'); console.log('Errors: '); console.log(err); console.log('---'); console.log('Response: '); console.log(data); }); Python3 SDK Use the following procedure to create a custom-built botocore package to interact with QuickSight. To create a custom botocore package to work with QuickSight 1. Create a credentials file in the AWS directory for your environment. In a Linux- or macOS- based environment, that file is called ~/.aws/credentials and looks like the following. [default] aws_access_key_id = Your_IAM_access_key aws_secret_access_key = Your_IAM_secret_key 2. Unzip the folder botocore-1.12.10. Change directory into botocore-1.12.10 and enter the Python3 interpreter environment. Prerequisites 9 Amazon QuickSight Developer Guide Each response comes back as a dictionary object. They each have a ResponseMetadata entry that contains request IDs and response status. Other entries are based |
amazon-quicksight-dg-005 | amazon-quicksight-dg.pdf | 5 | a custom-built botocore package to interact with QuickSight. To create a custom botocore package to work with QuickSight 1. Create a credentials file in the AWS directory for your environment. In a Linux- or macOS- based environment, that file is called ~/.aws/credentials and looks like the following. [default] aws_access_key_id = Your_IAM_access_key aws_secret_access_key = Your_IAM_secret_key 2. Unzip the folder botocore-1.12.10. Change directory into botocore-1.12.10 and enter the Python3 interpreter environment. Prerequisites 9 Amazon QuickSight Developer Guide Each response comes back as a dictionary object. They each have a ResponseMetadata entry that contains request IDs and response status. Other entries are based on what type of operation you run. 3. As a test, use the following example code, a sample app that first creates, deletes, and lists groups. Then it lists users in a QuickSight account. import botocore.session default_namespace = 'default' account_id = 'relevant_AWS_Account' session = botocore.session.get_session() client = session.create_client("quicksight", region_name='us-east-1') print('Creating three groups: ') client.create_group(AwsAccountId = account_id, Namespace=default_namespace, GroupName='MyGroup1') client.create_group(AwsAccountId = account_id, Namespace=default_namespace, GroupName='MyGroup2') client.create_group(AwsAccountId = account_id, Namespace=default_namespace, GroupName='MyGroup3') print('Retrieving the groups and listing them: ') response = client.list_groups(AwsAccountId = account_id, Namespace=default_namespace) for group in response['GroupList']: print(group) print('Deleting our groups: ') client.delete_group(AwsAccountId = account_id, Namespace=default_namespace, GroupName='MyGroup1') client.delete_group(AwsAccountId = account_id, Namespace=default_namespace, GroupName='MyGroup2') client.delete_group(AwsAccountId = account_id, Namespace=default_namespace, GroupName='MyGroup3') response = client.list_users(AwsAccountId = account_id, Namespace=default_namespace) for user in response['UserList']: print(user) Prerequisites 10 Amazon QuickSight .NET/C# SDK Developer Guide Use the following procedure to interact with QuickSight using C#.NET. This example is constructed on Microsoft Visual for Mac; the instructions can vary slightly based on your IDE and platform. To work with QuickSight using C#.NET 1. Unzip the nuget.zip file into a folder called nuget. 2. Create a new Console app project in Visual Studio. 3. Under your solution, locate app Dependencies, then open the context (right-click) menu and choose Add Packages. 4. In the sources list, choose Configure Sources. 5. Choose Add, and name the source QuickSightSDK. Browse to the nuget folder and choose Add Source. 6. Choose OK. Then, with QuickSightSDK selected, select all three QuickSight packages: • AWSSDK.QuickSight • AWSSDK.Extensions.NETCore.Setup • AWSSDK.Extensions.CognitoAuthentication 7. Choose Add Package. 8. Copy and paste the following sample app into your console app editor. 
using System; using Amazon.QuickSight.Model; using Amazon.QuickSight; namespace DotNetQuickSightSDKTest { class Program { private static readonly string AccessKey = "insert_your_access_key"; private static readonly string SecretAccessKey = "insert_your_secret_key"; private static readonly string AccountID = "AWS_account_ID"; private static readonly string Namespace = "default"; // leave this as default static void Main(string[] args) { Prerequisites 11 Amazon QuickSight Developer Guide var client = new AmazonQuickSightClient( AccessKey, SecretAccessKey, Amazon.RegionEndpoint.USEast1); var listUsersRequest = new ListUsersRequest { AwsAccountId = AccountID, Namespace = Namespace }; client.ListUsersAsync(listUsersRequest).Result.UserList.ForEach( user => Console.WriteLine(user.Arn) ); var listGroupsRequest = new ListGroupsRequest { AwsAccountId = AccountID, Namespace = Namespace }; client.ListGroupsAsync(listGroupsRequest).Result.GroupList.ForEach( group => Console.WriteLine(group.Arn) ); } } } Make API requests with the QuickSight SDKs You can use API operations for Amazon QuickSight and AWS SDKs to access QuickSight from your preferred programming language. Currently, you can use the Amazon QuickSight API to manage users and groups. In Enterprise Edition, you can also use the API to embed dashboards in your webpage or app. To monitor the calls made to the QuickSight API for your account, including calls made by the AWS Management Console, command line tools, and other services, use AWS CloudTrail. For more information, see the AWS CloudTrail User Guide. AWS provides libraries, sample code, tutorials, and other resources for software developers who prefer to build applications using language-specific API operations instead of submitting a request Make API requests with the QuickSight SDKs 12 Amazon QuickSight Developer Guide over HTTPS. These libraries provide basic functions that automatically take care of tasks such as cryptographically signing your requests, retrying requests, and handling error responses. These libraries help make it easier for you to get started. For more information about downloading the AWS SDKs, see AWS SDKs and Tools. The following links are a sample of the language-specific API documentation available. AWS Command Line Interface • AWS CLI QuickSight Command Reference • AWS CLI User Guide • AWS CLI Command Reference AWS SDK for .NET • Amazon.Quicksight • Amazon.Quicksight.Model AWS SDK for C++ • Aws::QuickSight::QuickSightClient Class Reference AWS SDK for Go • quicksight AWS SDK for Java • QuickSightClient • QuickSightModel AWS SDK for JavaScript • AWS.QuickSight Make API requests with the QuickSight SDKs 13 Developer Guide Amazon QuickSight AWS SDK for PHP • QuickSightClient AWS SDK for Python (Boto3) • QuickSight AWS SDK for Ruby • Aws::QuickSight Use CLI skeleton files To run AWS CLI commands that require long and complicated strings, you can generate CLI skeleton files . A CLI skeleton is a JSON file that provides you with an outline, or skeleton, of the command that you want to run. You can use a CLI skeleton file for every QuickSight command, but skeleton files are |
amazon-quicksight-dg-006 | amazon-quicksight-dg.pdf | 6 | QuickSightClient • QuickSightModel AWS SDK for JavaScript • AWS.QuickSight Make API requests with the QuickSight SDKs 13 Developer Guide Amazon QuickSight AWS SDK for PHP • QuickSightClient AWS SDK for Python (Boto3) • QuickSight AWS SDK for Ruby • Aws::QuickSight Use CLI skeleton files To run AWS CLI commands that require long and complicated strings, you can generate CLI skeleton files . A CLI skeleton is a JSON file that provides you with an outline, or skeleton, of the command that you want to run. You can use a CLI skeleton file for every QuickSight command, but skeleton files are most useful when using Create or Update commands. Generate a CLI skeleton file To generate a CLI skeleton file, enter the following command into your terminal. aws quicksight OPERATION --generate-cli-skeleton A JSON file that contains a skeleton of the command that you want to run then appears in your terminal. Enter the required input values and save the file. The following example shows a cli example that is generated for the UpdateDashboardPermissions API. $ aws quicksight update-dashboard-permissions --generate-cli-skeleton { "AwsAccountId": "", "DashboardId": "", "GrantPermissions":[ { "Principal": "", "Actions": [ "" Use CLI skeleton files 14 Developer Guide Amazon QuickSight ] } ], "RevokePermissions": [ { "Principal": "", "Actions": [ "" ] } ] } Enter the following to make a CLI command using the saved skeleton file. aws quicksight COMMAND --cli-input-json file://filename.json You can update and reuse CLI skeleton files to run future commands. Operations that skeleton files are most useful for You can use a CLI skeleton file for every command in QuickSight. However, skeleton files are most useful for commands that require long or complicated string inputs, such as a permissions update. Following is a list of QuickSight operations where we recommend using a CLI skeleton file: Account customization operations • CreateAccountCustomization • UpdateAccountCustomization • UpdateAccountSettings Analysis operations • CreateAnalysis • UpdateAnalysis • UpdateAnalysisPermissions Operations that skeleton files are most useful for 15 Developer Guide Amazon QuickSight Dashboard operations • CreateDashboard • UpdateDashboard • UpdateDashboardPermissions • UpdateDashboardPublishedVersion Dataset operations • CreateDataSet • UpdateDataSet • UpdateDataSetPermissions Data source operations • CreateDataSource • UpdateDataSource • UpdateDataSourcePermissions Folder operations • CreateFolder • CreateFolderMembership • UpdateFolder • UpdateFolderPermissions Group operations • CreateGroup • CreateGroupMembership • UpdateGroup Operations that skeleton files are most useful for 16 Amazon QuickSight Developer Guide IAM policy assignment operations • CreateIAMPolicyAssignment • UpdateIAMPolicyAssignment Ingestion operations • CreateIngestion Namespace operations • CreateNamespace Template operations • CreateTemplate • UpdateTemplate • UpdateTemplatePermissions Template alias operations • CreateTemplateAlias • UpdateTemplateAlias Theme operations • CreateTheme • UpdateTheme • UpdateThemePermissions Theme alias permissions • CreateThemeAlias • UpdateTemplateAlias Operations that skeleton files are most useful for 17 Amazon QuickSight User operations • RegisterUser • UpdateUser Developer Guide Use the QuickSight Dev Portal The QuickSight Dev Portal helps you learn by example how to use the QuickSight API in your website or application. 
Currently, the Dev Portal focuses on API operations for embedded analytics. The portal provides easy-to-use code samples to get you started. You can choose from the following three different use cases: • Displaying embedded dashboards to everyone (nonauthenticated users) • Personalizing dashboards for your users Use the QuickSight Dev Portal 18 Amazon QuickSight Developer Guide • Embedding dashboard authoring The portal itself displays dashboards by using embedding for everyone. To get started with the Dev Portal 1. Open QuickSight Dev Portal and choose Try it on the use case that you want to view. 2. View code examples by choosing How to embed it in the menu bar. Then choose each of the following from the navigation pane at left: • Configure permissions • Get embedding URL (code samples in Java, JavaScript, and Python) • Embed URL in your application 3. Choose Download all code to download all of the code in a .zip file. 4. Choose How to customize it to customize the dashboard, This screen is interactive, so you can choose any item in the navigation pane to view the changes live. 5. View and download the HTML code at bottom left. 6. Choose the QuickSight icon at upper left to return to the start page. Use the QuickSight Dev Portal 19 Amazon QuickSight Developer Guide Embed assets with QuickSight QuickSight Embedded Analytics allows users to integrate business analytics into their own applications and web portals. This empowers end users to gain deeper and faster insights through the embedded assets to allow for for easier in-depth analysis of your data. Use QuickSight embedded analytics to provide critical business insights from your Software as a Service (SaaS) product, scale and modernize your enterprise's application on the cloud, and leverage Generative Business Intelligence (BI) to enable end users to use natural language and discover actionable insights. Businesses can embed rich data visuals, interactive dashboards, and advanced ML-powered analytics in minutes. You can embed |
amazon-quicksight-dg-007 | amazon-quicksight-dg.pdf | 7 | Embedded Analytics allows users to integrate business analytics into their own applications and web portals. This empowers end users to gain deeper and faster insights through the embedded assets to allow for for easier in-depth analysis of your data. Use QuickSight embedded analytics to provide critical business insights from your Software as a Service (SaaS) product, scale and modernize your enterprise's application on the cloud, and leverage Generative Business Intelligence (BI) to enable end users to use natural language and discover actionable insights. Businesses can embed rich data visuals, interactive dashboards, and advanced ML-powered analytics in minutes. You can embed the full dashboard-building experience within a portal or application, including the QuickSight home page, search, and data experiences. This allows you to provide ad-hoc data exploration and author capabilities to your application's power users who want to explore usage data, create specific views as dashboards, and share their creations with other users or groups in their organization. Specialized expertise is not required for your team to develop, maintain, and evolve the analytics components for your applications. Teams can easily manage and scale your analytics servers, manage complex data engineering pipelines and infrastructure as your applications' popularity grows. Improve your Business Intelligence with QuickSight Embedded Analytics The use cases below highlight how an embedded analytics tool benefits independent software vendors as well as the program managers and developers at your enterprise. • Add advanced analytics capabilities and natural language processing within your Software as a Service (SaaS) product to improve the overall user experience for the end-users. • Integrate scheduled business intelligence reports directly into your enterprise, which streamlines the availability of data analytics. How Embedded Analytics Can Transform Your Application The list below describes 9 ways embedded analytics can be used to transform your applications: Improve your Business Intelligence with QuickSight Embedded Analytics 20 Amazon QuickSight Developer Guide 1. Leverage Generative BI and Natural Language Processing to your business data Embed powerful AI tools into your application to empower your users with deeper insights and accelerate decision making for your business. 2. Save development time and resources by using enterprise-ready interactive dashboards and visuals Embedded analytics saves valuable engineering time by providing enterprise-ready dashboards and visuals that can be used right away. Once they ae embedded, you can use these embedded dashboards to analyze data and derive actionable insights. 3. Supports highly advanced analytics You can unlock a multitude of advanced features powered by Amazon QuickSight for your embedded analytics, with new features being released by QuickSight frequently. 4. Discover, scale, and securely share valuable business insights at a rapid pace Costly development efforts needed to integrate, manage, analyze, secure, and share data are minimized with QuickSight embedded analytics. QuickSight embedded analytics can also automatically scale up and down depending on your organization’s infrastructural needs. 5. Embed analytics seamlessly to your application You can customize and personalize the look, feel, and functionality of your QuickSight embedded analytics to best meet your requirements. 
This customization capability allows you to refine available analytics features to align with your brand style. You can customize UX elements like theming and styling of your analytics to match your brand needs. 6. Embedding analytics into your application is easy with AWS cloud technologies Embedded analytics are designed to be used easily. QuickSight provides multiple ways to embed feature rich analytics to your application to meet your business requirements, ranging from 1- click copy and paste to advanced API integration with AWS cloud technologies support. 7. Self-Service BI Capabilities Everyone in your organization can create custom dashboards and reports without the need to rely on your IT professionals and dedicated data analysts. 8. Centralize your analytics How Embedded Analytics Can Transform Your Application 21 Amazon QuickSight Developer Guide Embedded analytics increases operational efficiency and boosts productivity by allowing all analytics to be available in one interface. This eliminates the need to toggle between different platforms to find what you need. 9. Distribute up to date and extensive customer facing reports directly from your application You can schedule automated delivery of reports to ensure that stakeholders receive timely updates without the need to access your application. What to consider when using embedded analytics Review the considerations below before you get started with QuickSight embedded analytics: 1. How much analytic capability would you like to give your users? You can embed different QuickSight experiences in your application that are tailored to how users want to interact with their business intelligence. QuickSight console embedding gives end users the ability to author dashboards and visuals from your application. QuickSight dashboard embedding gives users the power to filter data and create reports. Visual embedding allows you to place individual analytics anywhere on your page and create your own interactive, customizable inline view. 2. How do you want to manage users and govern |
amazon-quicksight-dg-008 | amazon-quicksight-dg.pdf | 8 | you get started with QuickSight embedded analytics: 1. How much analytic capability would you like to give your users? You can embed different QuickSight experiences in your application that are tailored to how users want to interact with their business intelligence. QuickSight console embedding gives end users the ability to author dashboards and visuals from your application. QuickSight dashboard embedding gives users the power to filter data and create reports. Visual embedding allows you to place individual analytics anywhere on your page and create your own interactive, customizable inline view. 2. How do you want to manage users and govern their access to embedded analytics? QuickSight offers you flexibility with user management by allowing you to serve embedded content with or without the need to provison users in QuickSight. User management through QuickSight allows you to delegate the responsibility of capturing user preferences and managing user permissions to QuickSight. Alternatively, if your application already has its own user context, you can choose to embed anonymously without the overhead of user provisioning. With this, you can still govern data access using runtime tag-based row-level security (RLS). 3. How unique do you want your embedded experience to be? Embedded Analytics supports extensive customization features to ensure that the data visualization experience matches your brand. You can customize QuickSight to match the style of your brand, create custom controls from your application that interact with embedded content, and update your application based on events that happen in the embedded experience. 4. What type of application do you want to embed into? Considerations 22 Amazon QuickSight Developer Guide QuickSight embedded analytics can securely integrate and scale into all application types. Enterprise applications can have personalized, authenticated user experiences which can integrate into your directory system. Independent Software Vendors (ISVs) can serve embedded dashboards and visuals to end users without the need to sign in to QuickSight. 5. Should you use natural language insights to visualize data or interactive dashboards? With embedded analytics and Amazon Q, you can embed Natural Language Processing (NLP) for your business data into your application. End users use an embedded search bar to ask questions and visualize data. Get Started with QuickSight embedded analytics Amazon QuickSight is a scalable, embeddable, ML-powered BI Service built for the cloud. Embedded Analytics allows you to easily extend your visuals, dashboards, and Natural Language Queries (Q) to your apps, websites, and portals. Topics • Prerequisites • Choose the right embedding solution • Create your first embedding application Prerequisites Before you get started, familiarize yourself with the list of technologies QuickSight uses to create an embedding experience. Check that the technologies listed below are compatible with your application: • Embedding utilizes Iframes to display your content and MessageChannels to communicate. • If you’re developing a JavaScript-based front-end application, we ecommend you use the Embedding QuickSight data dashboards for registered users in your application to leverage performance, customization, and interactivity capabilities offered through the SDK for your embedded content. • A backend service that is compatible with one of the languages supported in the AWS SDK. 
• Many web applications use CSP to add security on what can be loaded within the application. Ensure you have ability to allowlist QuickSight domains in your CSP. Get started 23 Amazon QuickSight Developer Guide • Make sure you are using one of our supported browsers. After you confirm that your application is compatible with QuickSight embedding, complete the steps listed in Getting started with Amazon QuickSight. Choose the right embedding solution QuickSight offers multiple options and mechanisms of embedding. You can embed a visual, a dashboard, the complete QuickSight console, or the natural language component Q in your application. Furthermore, based on your organizations authentication setup, you can choose between anonymous and registered user embedding. All of these options are possible when you onboard the QuickSight Embedding APIs. Anonymous user embedding Anonymous user embedding offers a lightweight option for you to bring insights to your users without the need to register them in QuickSight. You can use anonymous embedding to share your dashboards to any number of users in your application and scale as you go. With row-level security, you can further customize your data based on pre-determined context and showcase relevant insights to your users. To get started with anonymous embedding, see Embedding QuickSight data dashboards for anonymous (unregistered) users. Registered user embedding Use registered user embedding to bring QuickSight to every user in your organization. Use QuickSight APIs to share dashboards with users or grant them access to build dashboards from scratch, all within the context of your application. Use comprehensive user context based features like row-level security or column-level-security for fine grained data access control. You can also use features like bookmarks to deliver personalized experiences. To get started with registered user embedding, see Embedding QuickSight |
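The following is a minimal sketch of the backend step using Python and boto3; the account ID, user ARN, dashboard ID, and allowed domain are placeholder values, not part of this guide:

import boto3

quicksight = boto3.client("quicksight", region_name="us-east-1")

# Generate a short-lived embed URL for a registered user.
response = quicksight.generate_embed_url_for_registered_user(
    AwsAccountId="111122223333",
    SessionLifetimeInMinutes=60,  # must be 15-600 minutes
    UserArn="arn:aws:quicksight:us-east-1:111122223333:user/default/embed-reader",
    ExperienceConfiguration={
        "Dashboard": {"InitialDashboardId": "your-dashboard-id"}
    },
    AllowedDomains=["https://www.example.com"],
)

# Pass this URL to your front-end application, which loads it
# with the QuickSight Embedding SDK inside an iFrame.
embed_url = response["EmbedUrl"]

The URL is short-lived, so it is typically generated on each request rather than cached.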
amazon-quicksight-dg-009 | amazon-quicksight-dg.pdf | 9 | and showcase relevant insights to your users. To get started with anonymous embedding, see Embedding QuickSight data dashboards for anonymous (unregistered) users. Registered user embedding Use registered user embedding to bring QuickSight to every user in your organization. Use QuickSight APIs to share dashboards with users or grant them access to build dashboards from scratch, all within the context of your application. Use comprehensive user context based features like row-level security or column-level-security for fine grained data access control. You can also use features like bookmarks to deliver personalized experiences. To get started with registered user embedding, see Embedding QuickSight data dashboards for registered users. 1-click Enterprise embedding 1-click Enterprise embedding is focused toward enterprises that have QuickSight accounts set up for all of their users. With 1-click embedding, developers embed QuickSight visuals and dashboards Choose the right embedding solution 24 Amazon QuickSight Developer Guide with a static embed code from QuickSight that is added to an <iframe>. When a user accesses a dashboard on your intranet enterprise applications, they are required to sign in to QuickSight. To get started with 1-click embedding, see 1-click Enterprise embedding. 1-click public embedding Use 1-click public embedding to take your insights to public websites with a few clicks. Developers receive an embed code from QuickSight and add it to an <iframe>. You have full control over when dashboards go public with sharing settings that are available on the dashboard. In case of an emergency, you can revoke public sharing within seconds. To get started with 1-click public embedding, see Turn on public access to visuals and dashboards with a 1-click embed code. Use the table below to compare the different features of each embedding option: Embedding option Embed in SaaS application Embed in an Embed in a Embeddable content RLS support Requires user Coding and internal public applicati portal setup insfrastr ucture required — ✓ Maybe* Dashboards, Visuals, Q ✓ (with RLS Tags) — Dashboards, Visuals, Q ✓ ✓ ✓ — Dashboards, Visuals ✓ ✓ — on ✓ ✓ ✓ Anonymous ✓ embedding Registere ✓ d user embedding Not recommended 1- click Enterpris e embedding — 1- click — ✓ Dashboards, Visuals — — — Choose the right embedding solution 25 Amazon QuickSight Developer Guide Embedding option Embed in SaaS application Embed in an Embed in a Embeddable content RLS support Requires user Coding and internal public applicati portal on setup insfrastr ucture required public embedding *Anonymous embedding delegates the responsibility of authentication to the 3P application. If the 3P application does not have user authentication, the embedded content becomes publicly accessible. Create your first embedding application The quickest and most flexible way to embed a dashboard in your web application is to get an embed URL through the QuickSight API and load that onto your application using the QuickSight Embedding SDK, in an iFrame. 
To do so, you will need: • A backend service to generate the embed URL • An endpoint to pass the embed URL to your front-end application • (Optional) A JavaScript-based front-end application that leverages the embedding SDK to load the dashboard within an iFrame • (Optional) Front-end methods to customize and integrate the embedded dashboard seamlessly with your application using functions in the embedding SDK To embed a dashboard in a React application, see this example. To set up your first embedded dashboard, see Embedding QuickSight data dashboards for registered users. To set up a different kind of embedded asset, choose one of the following options: Embedding options for registered users • Embedding QuickSight visuals for registered users • Embed the full functionality of the QuickSight console for registered users • Embed the Generative Q&A experience for registered users Embedding options for anonymous (unregistered) users • Embed data dashboards for anonymous (unregistered) users • Embed visuals for anonymous (unregistered) users • Embed the Generative Q&A experience for anonymous (unregistered) users 1-click embedding options • Turn on public access to visuals and dashboards with a 1-click embed code • Embedding visuals and dashboards for registered users with a 1-click embed code Customize embedded assets Use QuickSight embedded analytics to embed custom QuickSight assets into your application that are tailored to meet your business needs. For embedded dashboards and visuals, QuickSight authors can add filters and drill downs that readers can access as they navigate the dashboard or visual. QuickSight developers can also use the QuickSight SDKs to build tighter integrations between their SaaS applications and their QuickSight embedded assets to add datapoint callback actions to visuals on a dashboard at runtime. For more information about the QuickSight SDKs, see the amazon-QuickSight-embedding-sdk on GitHub or NPM. Use the sections listed below to find descriptions of how to use the QuickSight SDKs to customize your QuickSight embedded analytics:
amazon-quicksight-dg-010 | amazon-quicksight-dg.pdf | 10 | that are tailored to meet your business needs. For embedded dashboards and visuals, QuickSight authors can add filters and drill downs that readers can access as they navigate the dashboard or visual. QuickSight developers can also use the QuickSight SDKs to build tighter integrations between their SaaS applications and their QuickSight embedded assets to add datapoint callback actions to visuals on a dashboard at runtime. For more information about the QuickSight SDKs, see the amazon-QuickSight-embedding-sdk on GitHub or NPM. Use the sections listed below to find descriptions of how to use the QuickSight SDKs to customize your QuickSight embedded analytics: Topics • Control the look and feel of embedded assets • Add interactivity to your embedded content • Personalization Customize embedded assets 27 Amazon QuickSight Developer Guide Control the look and feel of embedded assets Amazon QuickSight embedded experiences offer solutions to developers who want to make embedded content integrate seamlessly with their application. With the embedding SDK, developers can have control over the dimensions, locale, and theming of their embedded dashboards, visuals, and Q Search Bars. These customizations reduce friction for users and make QuickSight feel like a part of your app. To simplify user experience, you can control which toolbar options are available and whether the Q search bar topic can be changed. Embedded content dimensions can be configured and resized to fit into your app and custom themes can be applied to better suit your brand identity. To learn more, see Runtime theming. Add interactivity to your embedded content Amazon QuickSight embedded content offers options for developers to interact with their embedded experiences to create a cohesive experience between their application and QuickSight with the embedding SDK. QuickSight allows application developers to control the experience given to users and to react to user actions taken within their embedded analytics. Processes can be made more efficient by giving users the information that they need when they need it, such as by selecting the relevant sheet or filtering the data to what is relevant to the user. For exmaple, users can use preconfigured visual actions to focus on individual datapoints and carry out tasks in your application without needing to leave QuickSight. The embedding SDK allows for bi-directional interactions between your app and QuickSight. Developers can respond to users changing parameters, switching sheets, clicking on datapoints, and more. From your app to QuickSight, you can set parameters, Q search bar questions, and undo or reset the content state. To learn more about custom assets that can be configured in the embedding SDK, see Dynamic filtering – set parameters, set filters and Respond to events – callbacks Personalization Amazon QuickSight enhances user experience with state persistence to ensure a seamless transition across sessions with the maintenance of the current dashboard state. This feature allows for uninterrupted workflows and a consistent analytics experience. Furthermore, users can optimize their productivity with bookmarks to save and revisit specific views of their dashboard. Control the look and feel of embedded assets 28 Amazon QuickSight Developer Guide Personalization features including state persistence and bookmarks are only available for registered users. 
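As a rough sketch, both features are switched on through the experience configuration passed when the embed URL is generated for a registered user. The parameter shapes below are assumptions based on the GenerateEmbedUrlForRegisteredUser API and should be confirmed against the current API reference; the dashboard ID is a placeholder:

# Passed as ExperienceConfiguration to generate_embed_url_for_registered_user (boto3).
experience_configuration = {
    "Dashboard": {
        "InitialDashboardId": "your-dashboard-id",
        "FeatureConfigurations": {
            "StatePersistence": {"Enabled": True},
            "Bookmarks": {"Enabled": True},
        },
    }
}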
To integrate these capabilities into embedded analytics, developers can leverage the QuickSight APIs. This API-driven approach allows developers to customize personalization features within embedded analytics. The QuickSight APIs provide developers with the tools they need to create a tailored and efficient user experience to meet their business needs. State Persistence Use state persistence to ensure a continuous user experience that maintains the current state of a dashboard across different sessions. This means that QuickSight retains information about filters, selected tabs, and other configurations. When a user revisits a dashboard, they can pick up where they left off, which eliminates the need to recreate the view each time. State persistence can be used to improve user productivity, enhance collaboration, and promote efficient data exploration. Bookmarks Users can use bookmarks to save and revisit specific views within QuickSight dashboards to enhance the efficiency and flexibility of data exploration. Bookmarks can be used to improve user productivity, enhance collaboration, and create user-defined views of a QuickSight dashboard. To learn more about bookmarks, see Bookmarking views of a dashboard and GenerateEmbedUrlForRegisteredUser. Embedding security Amazon QuickSight provides a secure platform that allows you to distribute dashboards and insights to tens of thousands of users with multiple-region availability and built-in redundancy. Cloud security at AWS is the highest priority. As an AWS customer, you benefit from a data center and network architecture that is built to meet the requirements of the most security-sensitive organizations. QuickSight manages who sees content By default, QuickSight only allows users who have access to content in the console to see that same content in an embedded view. For anonymous (unregistered) users, content access can be governed with row-level security (RLS) tags. Additionally, QuickSight has the
amazon-quicksight-dg-011 | amazon-quicksight-dg.pdf | 11 | platform that allows you to distribute dashboards and insights to tens of thousands of users with multiple-region availability and built-in redundancy. Cloud security at AWS is the highest priority. As an AWS customer, you benefit from a data center and network architecture that is built to meet the requirements of the most security-sensitive organizations. QuickSight manages who sees content By default, QuickSight only allows users who have access to content in the console see that same content in an embedded view. For anonymous (unregistered) users, content access can be governed with row level security (RLS) tags. Additionally, QuickSight has the capability to share assets to anyone on the internet with 1-click public embedding. Embedding security 29 Amazon QuickSight Developer Guide QuickSight manages where you see content QuickSight offers a variety of solutions to control where embedding can take place. To ensure embedding is only done intentionally, QuickSight will only embed on domains that are allow- listed. You can add static domains to your allow-list through the QuickSight console, or you can dynamically add a domain at runtime. Additionally, you can limit access to your organization's QuickSight account to a predefined list of Internet Protocol (IP) address ranges. QuickSight manages what you see QuickSight allows you to restrict access to a dataset. You can do this before or after you have shared the dataset. When a dataset owner views the content, they can still see all the data. When you share the dataset with readers, they can only see the data applicable to them individually, as restricted by the permission dataset rules. QuickSight manages where you see content 30 Amazon QuickSight Developer Guide Amazon Resource Names (ARNs) in QuickSight Amazon Resource Names (ARNs) uniquely identify AWS resources. An ARN identifies a resource unambiguously across all of AWS, for example in IAM policies, Amazon Relational Database Service (Amazon RDS) tags, and API calls. To retrieve the ARN of an QuickSight resource, you can use the Describe operation on the relevant resource. You can use this section to learn how ARNs work. The material here provides examples are geared specifically for QuickSight. Topics • ARN formats • QuickSight resource ARNs • Permissions for QuickSight resources • QuickSight API errors ARN formats ARNs are delimited by colons, and composed of segments, which are the parts separated by colons (:). The specific components and values used in the segments of an ARN depend on which AWS service the ARN is for. The following example shows how ARNs are constructed. arn:partition:service:region:account-id:resource-id arn:partition:service:region:account-id:resource-type/resource-id arn:partition:service:region:account-id:resource-type:resource-id These ARNs contain the following segments: partition – The partition that the resource is in. For standard AWS Regions, the partition is aws. If you have resources in other partitions, the partition is aws-partitionname. For example, the partition for resources in the China (Beijing) Region is aws-cn. service – The service namespace that identifies the AWS product. For example, quicksight identifies QuickSight, s3 identifies Amazon S3, iam identifies IAM, and so on. ARN formats 31 Amazon QuickSight Developer Guide region – The AWS Region that the resource resides in. 
The ARNs for some resources don't require an AWS Region, so this component might be omitted in some cases, like in the case of S3. QuickSight ARNs require an AWS Region. account-id – The ID of the AWS account that owns the resource. When you use the account number in an ARN or an API operation, you omit the hyphens (for example, 123456789012). The ARNs for some resources don't require an account number, so this component might be omitted. QuickSight ARNs require an AWS account number. However, the account number and the AWS Region are omitted from S3 bucket ARNs, as shown following. arn:aws:s3:::bucket_name arn:aws:s3:::bucket_name/key_name resource or resource-type – The content of this part of the ARN varies by service. A resource identifier can be the name or ID of the resource (for example, user/Bob or instance/ i-1234567890abcdef0) or a resource path. For example, some resource identifiers include a parent resource ( sub-resource-type/parent-resource/sub-resource) or a qualifier such as a version ( resource-type:resource-name:qualifier). Some resource ARNs can include a path, variable, or wildcard. You can use wildcard characters (* and ?) within any ARN segment. An asterisk (*) represents any combination of zero or more characters, and a question mark (?) represents any single character. You can use multiple * or ? characters in each segment. If you're using the ARN for permissions, avoid using * wildcards if possible, to limit access to only the required elements. Following are some examples of using paths, wildcards, and variables. For the following example, we use an S3 ARN. You might use this when you give permissions to S3 in an IAM policy. This S3 ARN shows a path and file are specified. Note The term key name is used to describe what looks like a path and |
amazon-quicksight-dg-012 | amazon-quicksight-dg.pdf | 12 | characters, and a question mark (?) represents any single character. You can use multiple * or ? characters in each segment. If you're using the ARN for permissions, avoid using * wildcards if possible, to limit access to only the required elements. Following are some examples of using paths, wildcards, and variables. For the following example, we use an S3 ARN. You might use this when you give permissions to S3 in an IAM policy. This S3 ARN shows a path and file are specified. Note The term key name is used to describe what looks like a path and file after bucketname/. These are called key names because a bucket doesn't actually contain folder structures like those used in your computer's file system. Instead the slash (/) is a delimiter that helps to make the organization of the bucket more intuitive. In this case, the bucket name is amzn- s3-demo-bucket, and the key name is developers/design_info.doc. ARN formats 32 Amazon QuickSight Developer Guide arn:aws:s3:::amzn-s3-demo-bucket/my-data/sales-export-2019-q4.json To identify all the objects in the bucket, you can use a wildcard to indicate that all key names (or paths and files) are included in the ARN, as follows. arn:aws:s3:::amzn-s3-demo-bucket/* You can use part of a key name plus the wildcard to identify all the objects that begin with a specific pattern. In this case, it resembles a folder name plus a wildcard, as shown following. However, this ARN also includes any "subfolders" inside of my-data. arn:aws:s3:::amzn-s3-demo-bucket/my-data/sales-export* In this case, specifying using this wildcard includes the objects with names like the following: • my-data/sales-export-1.xlsx • my-data/sales-export-new.txt • my-data/sales-export-2019/file1.txt You can use wildcards of both types (asterisks and question marks) in combination or separately, as shown following. arn:aws:s3:::amzn-s3-demo-bucket/my-data/sales-export-2019-q?.* arn:aws:s3:::amzn-s3-demo-bucket/my-data/sales-export-20??-q?.* Or, if you want to future-proof the ARN, you can replace the entire year with a wildcard, rather than just using wildcards for the last two digits. arn:aws:s3:::amzn-s3-demo-bucket/my-data/sales-export-????-q?.* arn:aws:s3:::amzn-s3-demo-bucket/my-data/sales-export-*-q?.* To read more about S3 ARNs, see Specifying Resources in a Policy and Object Key and Metadata in the Amazon S3 User Guide. ARN formats 33 Amazon QuickSight Developer Guide QuickSight resource ARNs The following resource types are defined by QuickSight: user, group, and dashboard. These are used in QuickSight API calls and as elements of IAM permission statements. To find up-to-date information for QuickSight (service prefix: quicksight) resources, actions, and condition context keys for use in IAM permission policies, see Actions, Resources, and Condition Keys for QuickSight in the IAM User Guide. Resource type ARN format Condition keys user group dashboard arn:${Partition}:quicksight: ${Region}:${Account}:user/ N/A ${ResourceId} arn:${Partition}:quicksight: ${Region}:${Account}:group/ N/A ${ResourceId} arn:${Partition}:quicksight: ${Region}:${Account}:dashb N/A oard/${ResourceId} Resource ARNs are constructed from the segments that describe your resource. For example, a resource ARN for an analysis consists of the following segments. arn:<partition>:quicksight:<aws-region>:<aws-account-id>:<resource-type>/<resource-id> The segments are defined as follows: • partition – for example, aws or aws-cn. 
• aws-region – The AWS Region that contains the resource. • aws-account-id – The AWS account that contains the resource. This ID excludes the hyphens. • resource-type – The type of resource. For this example, this is analyses. For a dashboard, it's dashboard. • resource-id – The unique identifier for a specific resource. QuickSight resource ARNs 34 Amazon QuickSight Developer Guide The AWS Region, resource type, and resource ID are identified in the URL of the resource when you are using the QuickSight console. For example, let's say this is the URL of the analysis that you want an ARN for. https://us-east-2.quicksight.aws.amazon.com/sn/analysis/4036e682-7de6-4c05-8a76- be51b9ec9b29 The AWS Region is us-east-2. The resource-type is analysis. The resource ID in this URL is 4036e682-7de6-4c05-8a76-be51b9ec9b29. If your account number is 111122223333, then the ARN for this analysis is as follows. arn:aws:quicksight:us-east-2:111122223333:analysis/4036e682-7de6-4c05-8a76-be51b9ec9b29 To get your AWS account number, contact your system administrator. Permissions for QuickSight resources If you're not sure what the necessary permission is, you can attempt to make a call. The client then tells you what the missing permission is. You can use asterisk (*) in the Resource field of your permission policy instead of specifying explicit resources. However, we highly recommend that you restrict each permission as much as possible. You can restrict user access by specifying or excluding resources in the policy, using their QuickSight ARN. To retrieve the ARN of an QuickSight resource, use the Describe operation on the relevant resource. Before you can call the QuickSight API operations, you need the quicksight:operation- name permission in a policy attached to your IAM identity. For example, to call list-users, you need the permission quicksight:ListUsers. The same pattern applies to all operations. If you attempt to make the call you don't have permissions to call, the resulting error shows you what the missing permission is. We highly recommend that you restrict each permission as much as possible. You can add conditions in IAM to further restrict access to an API in some scenarios. For example, when you |
amazon-quicksight-dg-013 | amazon-quicksight-dg.pdf | 13 | Describe operation on the relevant resource. Before you can call the QuickSight API operations, you need the quicksight:operation- name permission in a policy attached to your IAM identity. For example, to call list-users, you need the permission quicksight:ListUsers. The same pattern applies to all operations. If you attempt to make the call you don't have permissions to call, the resulting error shows you what the missing permission is. We highly recommend that you restrict each permission as much as possible. You can add conditions in IAM to further restrict access to an API in some scenarios. For example, when you add User1 to Group1, the main resource is Group1. You can allow or deny access to certain groups. Or you can also edit the QuickSight IAM key quicksight:UserName to add a condition to allow or prevent certain users from being added to that group. For more information, see the following: • Actions, Resources, and Condition Keys • IAM JSON Policy Elements Permissions 35 Amazon QuickSight Best practices Developer Guide By working with QuickSight, you can share analyses, dashboards, templates, and themes with up to 100 principals. A principal can be one of the following: • The Amazon Resource Name (ARN) of an QuickSight user or group associated with a data source or dataset. (This is common.) • The ARN of an QuickSight user, group, or namespace associated with an analysis, dashboard, template, or theme. (This is common.) • The ARN of an AWS account root: This is an IAM ARN rather than a QuickSight ARN. Use this option only to share resources (templates) across AWS accounts. (This is less common.) To share these resources with more principals, consider assigning resource permissions at the group or namespace level. For example, if you add users into a group and share a resource to the group, the group counts as one principal. This is true even though it's shared to everyone in the group. QuickSight API errors QuickSight has two types of error codes: • Client errors – These errors are usually caused by something the client did. An example is specifying an incorrect or invalid parameter in the request. Another is using an action or resource for a user that doesn't have permission to use the action or resource. These errors are accompanied by a 400-series HTTP response code. • Server errors – These errors are usually caused by an AWS server-side issue. These errors are accompanied by a 500-series HTTP response code. Topics • Common client errors • Client errors • Server errors Common client errors Following, you can find a list of the common client errors that all operations can return. Best practices 36 Amazon QuickSight Error code AuthFailure Blocked DryRunOperation IdempotentParameterMismatch IncompleteSignature InvalidAction Developer Guide Description The provided credentials couldn't be validated . You might not be authorized to carry out the request. Ensure that your account is authorized to use the QuickSight service and that your credit card details are correct. Also ensure that you're using the correct access keys. Your account is currently blocked. Contact Support if you have questions. The user has the required permissions, so the request would have succeeded, but the DryRun parameter was used. The request uses the same client token as a previous, but nonidentical request. Don't reuse a client token with different requests, unless the requests are identical. The request signature doesn't conform to AWS standards. 
The action or operation requested isn't valid. Verify that the action is typed correctly. InvalidCharacter A specified character isn't valid. InvalidClientTokenId InvalidPaginationToken InvalidParameter The X.509 certificate or AWS access key ID provided doesn't exist in our records. The specified pagination token isn't valid or is expired. A parameter specified in a request isn't valid, is unsupported, or can't be used. The returned message provides an explanation of the error value. Common client errors 37 Amazon QuickSight Error code InvalidParameterCombination InvalidParameterValue InvalidQueryParameter Developer Guide Description A value that indicates an incorrect combination of parameters, or a missing parameter. A value specified in a parameter isn't valid, is unsupported, or can't be used. Ensure that you specify a resource by using its full ID. The returned message provides an explanation of the error value. The AWS query string is malformed or doesn't adhere to AWS standards. MalformedQueryString The query string contains a syntax error. MissingAction MissingAuthenticationToken MissingParameter OptInRequired PendingVerification The request is missing an action or a required parameter. The request must contain either a valid (register ed) AWS access key ID or X.509 certificate. The request is missing a required parameter . Ensure that you have supplied all the requir ed parameters for the request, for example the resource ID. You aren't authorized to use the requested service. Ensure that you have subscribed to the service you are trying to use. If you are new to |
amazon-quicksight-dg-014 | amazon-quicksight-dg.pdf | 14 | string is malformed or doesn't adhere to AWS standards. MalformedQueryString The query string contains a syntax error. MissingAction MissingAuthenticationToken MissingParameter OptInRequired PendingVerification The request is missing an action or a required parameter. The request must contain either a valid (register ed) AWS access key ID or X.509 certificate. The request is missing a required parameter . Ensure that you have supplied all the requir ed parameters for the request, for example the resource ID. You aren't authorized to use the requested service. Ensure that you have subscribed to the service you are trying to use. If you are new to AWS, your account might take some time to be activated while your credit card details are being verified. Your account is pending verification. Until the verification process is complete, you might not be able to carry out requests with this account. If you have questions, contact AWS Support. Common client errors 38 Amazon QuickSight Error code RequestExpired UnauthorizedOperation UnknownParameter Developer Guide Description The request reached the service more than 15 minutes after the date stamp on the request or the request expiration date (such as for presigned URLs). Or the date stamp on the request is more than 15 minutes in the future. If you're using temporary security credentials, this error can also occur if the credentials have expired. For more information, see Temporary Security Credentials in the IAM User Guide. You aren't authorized to perform this operation. Check your IAM policies, and ensure that you are using the correct access keys. An unknown or unrecognized parameter was supplied. Requests that can cause this error include supplying a misspelled parameter or a parameter that isn't supported for the specified API version. UnsupportedInstanceAttribute The specified attribute can't be modified. UnsupportedOperation The specified request includes an unsupport ed operation. The returned message provides details of the unsupported operation. UnsupportedProtocol The protocol that you used is unsupported. ValidationError The input fails to satisfy the constraints specified by an AWS service. Client errors Following, you can find a list of client errors that are specific to QuickSight API operations. Client errors 39 Amazon QuickSight Error code AccessDeniedException DomainNotWhiteListedException IdentityTypeNotSupportedException Developer Guide Description You don't have access to this. The provided credentials can't be validated. You might not be authorized to carry out the request. Ensure that your account is authorized to use the Amazon QuickSight service and that your policies have the correct permissions. Also, ensure that you are using the correct access keys. The domain specified isn't on the allow list. Make sure that all domains for embedded dashboards are added to the approved list by an Amazon QuickSight administrator. The identity type specified isn't supported . Supported identity types include: IAM and QUICKSIGHT. InvalidNextTokenException The NextToken value isn't valid. InvalidParameterValueException One or more parameters doesn't have a valid value. PreconditionNotMetException One or more preconditions aren't met. QuickSightUserNotFoundException The user isn't found. This can happen in any operation that requires finding a user based on the provided user name, such as DeleteUser , DescribeUser , and so on. 
ResourceExistsException The resource specified doesn't exist. ResourceNotFoundException One or more resources couldn't be found. SessionLifetimeInMinutesInvalidException The number of minutes specified for the lifetime of a session is invalid. The session lifetime must be 15–600 minutes. Client errors 40 Amazon QuickSight Error code Description Developer Guide ThrottlingException Access is throttled. UnsupportedUserEditionException Indicates that you are calling an operation on an Amazon QuickSight subscription where the edition doesn't include support for that operation. Amazon QuickSight currently has Standard Edition and Enterprise Edition. Not every operation and capability is available in every edition. Common causes of client errors There are a number of reasons that you might encounter an error while performing a request. Some errors can be prevented or easily solved by following these guidelines: • Specify the AWS account ID and namespace – Make sure that the relevant AWS account ID are provided with each request. The namespace must be set to default. • Allow for eventual consistency – Some errors are caused because a previous request hasn't yet propagated through the system. • Use a sleep interval between request rates – QuickSight API requests are throttled to help maintain the performance of the service. If your requests have been throttled, you get an error. • Use the full ID of the resource – When specifying a resource, ensure that you use its full ID, and not its user-supplied name or description. • Check your services – Ensure that you have signed up for all the services you are attempting to use. You can check which services you're signed up for by going to the My Account section of the AWS home page. • Check your permissions – Ensure that you have the required permissions to carry out the request. • Check your VPC – Some |
amazon-quicksight-dg-015 | amazon-quicksight-dg.pdf | 15 | your requests have been throttled, you get an error. • Use the full ID of the resource – When specifying a resource, ensure that you use its full ID, and not its user-supplied name or description. • Check your services – Ensure that you have signed up for all the services you are attempting to use. You can check which services you're signed up for by going to the My Account section of the AWS home page. • Check your permissions – Ensure that you have the required permissions to carry out the request. • Check your VPC – Some resources can't be shared between virtual private clouds (VPCs), for example security groups. • Check your credentials – Ensure that you provide your access keys when you are making requests and that you have entered the credentials correctly. Also, if you have more than one Client errors 41 Amazon QuickSight Developer Guide account, ensure that you are using the correct credentials for a particular account. If the provided credentials are incorrect, you might get the following error: Client.AuthFailure. Server errors Following, you can find a list of errors that can be returned by the server. Error code Description BatchClientRequestTokensNotDistinctE xception The batch client request tokens aren't unique. EmptyBatchRequestException The batch request was empty. InternalFailureException An internal failure occurred. InternalServiceError There was an internal error from the service. InvalidBatchClientRequestTokenException The AWS request token for this client batch request is invalid. InvalidParameterException One or more parameters has a value that isn't valid. LimitExceededException A limit is exceeded. ResourceUnavailableException This resource is currently unavailable. TooManyEntriesInBatchRequestException There are too many entries in this batch request. Server errors 42 Amazon QuickSight Operations To find QuickSight API operations by category, use the following list. Developer Guide Topics • Account customization operations • Analysis operations • Asset bundle operations • Dashboard operations • Data source operations • Dataset operations • Folder operations • Group operations • IAM policy assignment operations • Ingestion operations • IP and VPC endpoint restriction operations • Key management operations • Namespace operations • Tag operations • Template alias operations • Template operations • Theme operations • Theme alias operations • User operations Account customization operations With account customization API operations, you can update and customize Amazon QuickSight account settings. For more information, see the following API operations. Topics • Account settings Account customization operations 43 Amazon QuickSight Developer Guide • CreateAccountCustomization • DeleteAccountCustomization • DescribeAccountCustomization • UpdateAccountCustomization Account settings With account settings operations, you can perform actions on QuickSight account settings. For more information, see the following API operations. Topics • DescribeAccountSettings • UpdateAccountSettings DescribeAccountSettings Use the DescribeAccountSettings API operation to describe the settings that were used when your Amazon QuickSight subscription was first created in this AWS account. Following is an example AWS CLI command for this operation. 
AWS CLI aws quicksight describe-account-settings --aws-account-id 555555555555 For more information about the DescribeAccountSettings API operation, see DescribeAccountSettings in the Amazon QuickSight API Reference. UpdateAccountSettings Use the UpdateAccountSettings API operation to update the Amazon QuickSight settings in your AWS account. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight update-account-settings --aws-account-id 555555555555 Account settings 44 Amazon QuickSight Developer Guide --default-namespace NAMESPACE --notification-email EMAIL You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight update-account-settings --cli-input-json file://updateaccountsettings.json For more information about the UpdateAccountSettings API operation, see UpdateAccountSettings in the Amazon QuickSight API Reference. CreateAccountCustomization Use the CreateAccountCustomization API operation to create Amazon QuickSight customizations in the current AWS Region. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight create-account-customization --aws-account-id 555555555555 --account-customization DEFAULTTHEME You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight create-account-customization --cli-input-json file://createaccountcustomization.json For more information about the CreateAccountCustomization API operation, see CreateAccountCustomization in the Amazon QuickSight API Reference. DeleteAccountCustomization Use the DeleteAccountCustomization API operation to delete all Amazon QuickSight customizations in this AWS Region for the specified AWS account and QuickSight namespace. Following is an example AWS CLI command for this operation. CreateAccountCustomization 45 Amazon QuickSight AWS CLI aws quicksight delete-account-customization --aws-account-id 555555555555 Developer Guide For more information about the DeleteAccountCustomization API operation, see DeleteAccountCustomization in the Amazon QuickSight API Reference. DescribeAccountCustomization Use the DescribeAccountCustomization API operation to describe the customizations associated with the provided AWS account and Amazon QuickSight namespace in an AWS Region. Following is an example AWS CLI command for this operation. AWS CLI quicksight describe-account-customization --aws-account-id 555555555555 For more information about the DescribeAccountCustomization API operation, see DescribeAccountCustomization in the Amazon QuickSight API Reference. UpdateAccountCustomization Use the UpdateAccountCustomization API operation to update Amazon QuickSight customizations in the current AWS Region. |
amazon-quicksight-dg-016 | amazon-quicksight-dg.pdf | 16 | 45 Amazon QuickSight AWS CLI aws quicksight delete-account-customization --aws-account-id 555555555555 Developer Guide For more information about the DeleteAccountCustomization API operation, see DeleteAccountCustomization in the Amazon QuickSight API Reference. DescribeAccountCustomization Use the DescribeAccountCustomization API operation to describe the customizations associated with the provided AWS account and Amazon QuickSight namespace in an AWS Region. Following is an example AWS CLI command for this operation. AWS CLI quicksight describe-account-customization --aws-account-id 555555555555 For more information about the DescribeAccountCustomization API operation, see DescribeAccountCustomization in the Amazon QuickSight API Reference. UpdateAccountCustomization Use the UpdateAccountCustomization API operation to update Amazon QuickSight customizations in the current AWS Region. Currently, the only customization that you can use is a theme. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight update-account-customization --aws-account-id 555555555555 --namespace NAMESPACE --account-customization DEFAULTTHEME You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. DescribeAccountCustomization 46 Amazon QuickSight Developer Guide aws quicksight update-account-customization --cli-input-json file://updateaccountcustomization.json For more information about the UpdateAccountCustomization API operation, see UpdateAccountCustomization in the Amazon QuickSight API Reference. Analysis operations With analysis API operations, you can perform actions on Amazon QuickSight analyses. For more information, see the following API operations. Topics • Analysis permissions operations • CreateAnalysis • DeleteAnalysis • DescribeAnalysis • ListAnalyses • RestoreAnalysis • SearchAnalyses • UpdateAnalysis Analysis permissions operations With analysis permissions API operations, you can view and update permissions for analyses. For more information, see the following API operations. Topics • DescribeAnalysisPermissions • UpdateAnalysisPermissions DescribeAnalysisPermissions Use the DescribeAnalysisPermissions API operation to view the read and write permissions for an analysis. To use this operation, you need the ID of the analysis whose permissions you Analysis operations 47 Amazon QuickSight Developer Guide want to view. The analysis ID is part of the analysis URL in QuickSight. You can also use the ListAnalyses API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-analysis-permissions --aws-account-id 555555555555 --analysis-id ANALYSISID For more information about the DescribeAnalysisPermissions API operation, see DescribeAnalysisPermissions in the Amazon QuickSight API Reference. UpdateAnalysisPermissions Use the UpdateAnalysisPermissions API operation to update the read and write permissions for an analysis. You can grant or revoke permissions in the same command. To use this operation, you need the ID of the analysis whose permissions you want to update. The analysis ID is part of the analysis URL in QuickSight. You can also use the ListAnalyses API operation to get the ID. Following is an example AWS CLI command for this operation. 
AWS CLI aws quicksight update-analysis-permissions --aws-account-id 555555555555 --analysis-id ANALYSISID --grant-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/ default/ USERNAME,Actions=quicksight:RestoreAnalysis,quicksight:UpdateAnalysisPermissions,quicksight:DeleteAnalysis,quicksight:QueryAnalysis,quicksight:DescribeAnalysisPermissions,quicksight:DescribeAnalysis,quicksight:UpdateAnalysis --revoke-permissions Principal=arn:aws:quicksight:us-east-1:555555555555:user/ default/ USERNAME,Actions=quicksight:RestoreAnalysis,quicksight:UpdateAnalysisPermissions,quicksight:DeleteAnalysis,quicksight:QueryAnalysis,quicksight:DescribeAnalysisPermissions,quicksight:DescribeAnalysis,quicksight:UpdateAnalysis If your region has already been configured within the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. Analysis permissions operations 48 Amazon QuickSight Developer Guide aws quicksight update-analysis-permissions --cli-input-json file://updateanalysispermissions.json If your region has already been configured with the CLI, it does not need to be included in an argument. For more information about the UpdateAnalysisPermissions API operation, see UpdateAnalysisPermissions in the Amazon QuickSight API Reference. CreateAnalysis Use the CreateAnalysis API operation to create an analysis in Amazon QuickSight for a specified user. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight create-analysis --aws-account-id AWSACCOUNTID --analysis-id ANALYSISID --name NAME --source-entity SOURCEENTITY You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight create-analysis --cli-input-json file://createanalysis.json For more information about the CreateAnalysis API operation, see CreateAnalysis in the Amazon QuickSight API Reference. DeleteAnalysis Use the DeleteAnalysis API operation to delete an analysis from Amazon QuickSight for a specified user. To use this operation, you need the ID of the analysis that you want to delete. The analysis ID is part of the analysis URL in QuickSight. You can also use the ListAnalyses API operation to get the ID. CreateAnalysis 49 Amazon QuickSight Developer Guide Following is an example AWS CLI command for this operation. AWS CLI aws quicksight delete-analysis --aws-account-id 555555555555 --analysis-id ANALYSISID For more information about the DeleteAnalysis API operation, see DeleteAnalysis in the Amazon QuickSight API Reference. DescribeAnalysis Use the DescribeAnalysis API operation to view a summary of the metadata for an analysis for a specified user. To use this operation, you need the ID of the analysis that you want to describe. The analysis ID is part of the analysis URL in QuickSight. You can also use the ListAnalyses API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight |
amazon-quicksight-dg-017 | amazon-quicksight-dg.pdf | 17 | this operation. AWS CLI aws quicksight delete-analysis --aws-account-id 555555555555 --analysis-id ANALYSISID For more information about the DeleteAnalysis API operation, see DeleteAnalysis in the Amazon QuickSight API Reference. DescribeAnalysis Use the DescribeAnalysis API operation to view a summary of the metadata for an analysis for a specified user. To use this operation, you need the ID of the analysis that you want to describe. The analysis ID is part of the analysis URL in QuickSight. You can also use the ListAnalyses API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-analysis --analysis-id ANALYSISID --aws-account-id 555555555555 For more information about the DescribeAnalysis API operation, see DescribeAnalysis in the Amazon QuickSight API Reference. ListAnalyses Use the ListAnalyses API operation to list Amazon QuickSight analyses that exist in the specified AWS account. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight list-analyses --aws-account-id 555555555555 --page-size 10 DescribeAnalysis 50 Amazon QuickSight --max-items 10 Developer Guide For more information about the ListAnalyses API operation, see ListAnalyses in the Amazon QuickSight API Reference. RestoreAnalysis Use the RestoreAnalysis API operation to restore an analysis for a specified user. To use this operation, you need the ID of the analysis that you want to restore. The analysis ID is part of the analysis URL in QuickSight. You can also use the ListAnalyses API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight restore-analysis --analysis-id ANALYSISID --aws-account-id 555555555555 For more information about the RestoreAnalysis API operation, see RestoreAnalysis in the Amazon QuickSight API Reference. SearchAnalyses Use the SearchAnalyses API operation to search for analyses that belong to the specified user. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight search-analyses --aws-account-id 555555555555 --filters Operator=StringEquals,Name=QUICKSIGHT_USER,Value=arn:aws:quicksight:us- east-1:555555555555:user/default/USERNAME --page-size 10 --max-items 100 If your region has already been configured within the CLI, it doesn't need to be included as an argument. RestoreAnalysis 51 Amazon QuickSight Developer Guide If your region has already been configured with the CLI, it does not need to be included in an argument. For more information about the SearchAnalyses API operation, see SearchAnalyses in the Amazon QuickSight API Reference. UpdateAnalysis Use the UpdateAnalysis API operation to update an analysis in Amazon QuickSight. To use this operation, you need the ID of the analysis that you want to update. The analysis ID is part of the analysis URL in QuickSight. You can also use the ListAnalyses API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight update-analysis --aws-account-id 555555555555 --analysis-id ANALYSISID --name NAME --source-entity '{"SourceTemplate":{"DataSetReferences": [{"DataSetPlaceholder":"PLACEHOLDER","DataSetArn":"arn:aws:quicksight:us- west-2:555555555555:dataset/DATASETID"}],"Arn":"arn:aws:quicksight:us- west-2:555555555555:template/TEMPLATEID"}}' --theme-arn THEMEARN If your region has already been configured within the CLI, it doesn't need to be included as an argument. 
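The analysis sections above repeatedly note that you can get an analysis ID from the ListAnalyses API operation. As a small convenience sketch (not taken from this guide), the following command filters the listing down to one ID by name; it assumes the response fields AnalysisSummaryList, AnalysisId, and Name, and uses a placeholder analysis name and account ID.

# list analyses, then filter the summaries down to the ID of one analysis by name
aws quicksight list-analyses --aws-account-id 555555555555 \
  --query "AnalysisSummaryList[?Name=='MyAnalysis'].AnalysisId" \
  --output text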
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight update-analysis --cli-input-json file://updateanalysis.json If your region has already been configured with the CLI, it does not need to be included in an argument. For more information about the UpdateAnalysis API operation, see UpdateAnalysis in the Amazon QuickSight API Reference. UpdateAnalysis 52 Amazon QuickSight Developer Guide Asset bundle operations Use QuickSight asset bundle API operations to export QuickSight assets from one account to another. Asset bundle operations can be used to back up or restore deleted work, promote new work into a production account, or to duplicate assets within an account or across multiple accounts. Asset bundle import and export operations support the following asset types: • Analyses • Dashboards • Data sources • Datasets • Shared folders • Restricted folders • Refresh schedules • Themes • VPC connections The following data sources and dataset types aren't supported by the asset bundle APIs. • Adobe Analytics • File • GitHub • JIRA • Salesforce • ServiceNow • Twitter Topics • Permissions • Asset bundle export operations • Asset bundle import operations Asset bundle operations 53 Amazon QuickSight Permissions Developer Guide Before you begin, verify that you have an AWS Identity and Access Management role that grants the CLI user access to call the QuickSight asset bundle API operations. QuickSight recommends that you use the AWSQuickSightAssetBundleExportPolicy and AWSQuickSightAssetBundleImportPolicy IAM managed policies to streamline your API usage. You can also choose to explicitly define your oen IAM policy to fit your use case. For more information about IAM managed policies in QuickSight, see AWS managed policies for QuickSight. The following example shows an IAM policy that you can add to an existing IAM role to use the StartAssetBundleExportJob operation. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "quicksight:DescribeAssetBundleExportJob", "quicksight:ListAssetBundleExportJobs", "quicksight:StartAssetBundleExportJob", "quicksight:DescribeAnalysis", "quicksight:DescribeDashboard", "quicksight:DescribeDataSet", "quicksight:DescribeDataSetRefreshProperties", "quicksight:DescribeDataSource", "quicksight:DescribeRefreshSchedule", "quicksight:DescribeTheme", "quicksight:DescribeVPCConnection", "quicksight:ListRefreshSchedules", "quicksight:DescribeAnalysisPermissions", "quicksight:DescribeDashboardPermissions", "quicksight:DescribeDataSetPermissions", |
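As a sketch of how the preceding export policy might be put to use, the following command attaches it to an existing IAM role as an inline policy. The role name QuickSightAssetBundleRole, the policy name, and the file name export-policy.json are placeholders; creating a customer managed policy and attaching it with attach-role-policy would work just as well.

# save the export policy above as export-policy.json, then attach it to the role that the CLI user assumes
aws iam put-role-policy \
  --role-name QuickSightAssetBundleRole \
  --policy-name QuickSightAssetBundleExportAccess \
  --policy-document file://export-policy.json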
amazon-quicksight-dg-018 | amazon-quicksight-dg.pdf | 18 | the QuickSight asset bundle API operations. QuickSight recommends that you use the AWSQuickSightAssetBundleExportPolicy and AWSQuickSightAssetBundleImportPolicy IAM managed policies to streamline your API usage. You can also choose to explicitly define your oen IAM policy to fit your use case. For more information about IAM managed policies in QuickSight, see AWS managed policies for QuickSight. The following example shows an IAM policy that you can add to an existing IAM role to use the StartAssetBundleExportJob operation. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "quicksight:DescribeAssetBundleExportJob", "quicksight:ListAssetBundleExportJobs", "quicksight:StartAssetBundleExportJob", "quicksight:DescribeAnalysis", "quicksight:DescribeDashboard", "quicksight:DescribeDataSet", "quicksight:DescribeDataSetRefreshProperties", "quicksight:DescribeDataSource", "quicksight:DescribeRefreshSchedule", "quicksight:DescribeTheme", "quicksight:DescribeVPCConnection", "quicksight:ListRefreshSchedules", "quicksight:DescribeAnalysisPermissions", "quicksight:DescribeDashboardPermissions", "quicksight:DescribeDataSetPermissions", "quicksight:DescribeDataSourcePermissions", "quicksight:DescribeThemePermissions", "quicksight:ListTagsForResource" ], "Resource": "*" } ] Permissions 54 Amazon QuickSight } Developer Guide The following example shows an IAM policy that you can add to an existing IAM role to use the StartAssetBundleImportJob operation. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "quicksight:DescribeAssetBundleImportJob", "quicksight:ListAssetBundleImportJobs", "quicksight:StartAssetBundleImportJob", "quicksight:CreateAnalysis", "quicksight:DeleteAnalysis", "quicksight:DescribeAnalysis", "quicksight:UpdateAnalysis", "quicksight:CreateDashboard", "quicksight:DeleteDashboard", "quicksight:DescribeDashboard", "quicksight:UpdateDashboard", "quicksight:UpdateDashboardLinks", "quicksight:UpdateDashboardPublishedVersion", "quicksight:CreateDataSet", "quicksight:DeleteDataSet", "quicksight:DescribeDataSet", "quicksight:PassDataSet", "quicksight:UpdateDataSet", "quicksight:DeleteDataSetRefreshProperties", "quicksight:DescribeDataSetRefreshProperties", "quicksight:PutDataSetRefreshProperties", "quicksight:CreateRefreshSchedule", "quicksight:DescribeRefreshSchedule", "quicksight:DeleteRefreshSchedule", "quicksight:ListRefreshSchedules", "quicksight:UpdateRefreshSchedule", "quicksight:CreateDataSource", "quicksight:DescribeDataSource", "quicksight:DeleteDataSource", "quicksight:PassDataSource", "quicksight:UpdateDataSource", "quicksight:CreateTheme", Permissions 55 Amazon QuickSight Developer Guide "quicksight:DeleteTheme", "quicksight:DescribeTheme", "quicksight:UpdateTheme", "quicksight:CreateVPCConnection", "quicksight:DescribeVPCConnection", "quicksight:DeleteVPCConnection", "quicksight:UpdateVPCConnection", "quicksight:DescribeAnalysisPermissions", "quicksight:DescribeDashboardPermissions", "quicksight:DescribeDataSetPermissions", "quicksight:DescribeDataSourcePermissions", "quicksight:DescribeThemePermissions", "quicksight:UpdateAnalysisPermissions", "quicksight:UpdateDashboardPermissions", "quicksight:UpdateDataSetPermissions", "quicksight:UpdateDataSourcePermissions", "quicksight:UpdateThemePermissions", "quicksight:ListTagsForResource", "quicksight:TagResource", "quicksight:UntagResource", "s3:GetObject", "iam:PassRole" ], "Resource": "*" } ] } Asset bundle export operations 
QuickSight developers can use asset bundle export operations to export existing QuickSight assets and download their definitions to be saved in your own storage. QuickSight doesn't export data contained within the asset. These exported assets can be imported back into QuickSight whenever you want. To export QuickSight assets, start a named asynchronous export job. Then, poll for the job's completion, and then download the asset bundle with the download URL that's provided by the QuickSight API. Assets that are exported with the QuickSight asset bundle APIs can be exported as a QuickSight JSON bundle or a CloudFormation JSON file. When you run an export job that generates a QuickSight JSON file, the job returns a .qs zip file. The file can be unzipped to access the exported asset definitions. Asset bundle export operations 56 Amazon QuickSight Developer Guide Use the following sections to learn more about the asset bundle export API operations. StartAssetBundleExportJob Export jobs are configured with the StartAssetBundleExportJobRequest object. Export jobs are identified by an AssetBundleExportJobId that you provide when you create the new export job. This ID is unique while the job is running. After the job is completed, you can reuse this ID for another job. Export jobs include a list of QuickSight asset ARNs to be exported. You can choose to have all dependencies of the specified asset ARNs to be exported automatically with the rest of the job. For example, if you're creating a job to export a QuickSight dashboard, you can also choose to export the dashboard's theme, dataset, and data source. Developers can also choose to have all assets in a folder and its subfolders exported automatically to preserve folder hierarchy and folder memberships. Parent folders are considered to be dependencies of subfolders and are included in the export if the IncludeAllDependencies parameter is set to True. Assets in a folder are considered folder members and are included in the export if the IncludeFolderMembers parameter is set to ONE_LEVEL/RECURSE. When assets are exported directly, folder membership information can be preserved when the IncludeFolderMemberships parameter is set to True. To include the folder in the direct export, set IncludeAllDependencies to True. All export jobs run asynchronously after they are started. Poll the status of an export job with a DescribeAssetBundleExportJob call to know is the current status of the job. Callers must have read-only permissions for all of the resource types that are exported, including the optional dependencies that are included in the export job. In some cases, certain QuickSight assets contain anomalies that don't impact the end user's experience. By default, the StartAssetBundleExportJob API operates in Lenient mode, which ignores these anomalies. Callers can choose to enforce stricter validations with Strict mode during export. To do so, set the value of the optional StrictModeForAllResources parameter to "True”. The StartAssetBundleImportJob API operation follows the validation strategy that is defined in the exported bundle. To import an existing bundle with Lenient mode, run a new export job with the optional StrictModeForAllResources parameter set to "False". For more information about the StartAssetBundleExportJob operation, see StartAssetBundleExportJob in the QuickSight API Reference. 
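A note on Strict mode: the StartAssetBundleExportJob API exposes the StrictModeForAllResources setting through a validation strategy parameter, but the CLI flag shown below (--validation-strategy) is an assumption rather than something confirmed by this guide. Check aws quicksight start-asset-bundle-export-job help in your CLI version before relying on it; the account ID, job ID, and ARN are placeholders.

# start an export job with strict validation (flag name assumed from the API's ValidationStrategy member)
aws quicksight start-asset-bundle-export-job --aws-account-id 555555555555 \
  --asset-bundle-export-job-id strict-export-job \
  --resource-arns '["arn:aws:quicksight:us-east-1:555555555555:dashboard/DASHBOARDID"]' \
  --include-all-dependencies \
  --export-format QUICKSIGHT_JSON \
  --validation-strategy StrictModeForAllResources=true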
DescribeAssetBundleExportJob Use the DescribeAssetBundleExportJob operation to obtain the current status of an existing export job that's up to 14 days old. You can also use this operation to review a specified job's configuration. Export jobs that have succeeded return a download URL for the asset bundle file in their description. Failed export jobs return error information in their description. Poll this operation until the export job that you want the status of has succeeded or failed. |
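Because export jobs are asynchronous, the polling step can be scripted rather than run by hand. The loop below is a minimal sketch, assuming the describe call returns a JobStatus field (with values such as IN_PROGRESS, SUCCESSFUL, and FAILED) and a DownloadUrl field once the job succeeds; confirm those field names against DescribeAssetBundleExportJob output in your account before automating.

# poll the export job every 10 seconds until it finishes, then print the download URL
while true; do
  STATUS=$(aws quicksight describe-asset-bundle-export-job \
    --aws-account-id 555555555555 \
    --asset-bundle-export-job-id JOBID \
    --query 'JobStatus' --output text)
  echo "Export job status: $STATUS"
  if [ "$STATUS" = "SUCCESSFUL" ] || [ "$STATUS" = "FAILED" ]; then
    break
  fi
  sleep 10
done
aws quicksight describe-asset-bundle-export-job --aws-account-id 555555555555 \
  --asset-bundle-export-job-id JOBID --query 'DownloadUrl' --output text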
amazon-quicksight-dg-019 | amazon-quicksight-dg.pdf | 19 | bundle with Lenient mode, run a new export job with the optional StrictModeForAllResources parameter set to "False". For more information about the StartAssetBundleExportJob operation, see StartAssetBundleExportJob in the QuickSight API Reference. Asset bundle export operations 57 Amazon QuickSight Developer Guide DescribeAssetBundleExportJob Use the DescribeAssetBundleExportJob operation to obtain the current status of an existing export job that's up to 14 days old. You can also use this operation to review a specified job's configuration. Export jobs that have succeeded return a download URL for the asset bundle file in their description. Failed export jobs return error information in their description. Poll this operation until the export job that you want the status of has succeeded or failed. For more information about the DescribeAssetBundleExportJob operation, see DescribeAssetBundleExportJob in the QuickSight API Reference. ListAssetBundleExportJobs Use the ListAssetBundleExportJobs operation to retrieve a list of all export jobs that were created in the last 14 days. Export jobs are listed in the order that they were started, starting with the most recently started job. To have multiple lists by this operation, you can choose to specify a maximum page size to be returned and use a pagination token. For more information about the ListAssetBundleImportJobs operation, see ListAssetBundleExportJobs in the QuickSight API Reference. Examples The following example uses a StartAssetBundleExportJob API call to create a CloudFormation JSON file with override parameters. # configure and start the export job aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/ RESOURCEID","arn:aws:quicksight:REGION:AWSACCOUNTID:analysis/RESOURCEID"]' \ --cloud-formation-override-property-configuration '{"Dashboards": [{"Arn": "arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/RESOURCEID","Properties": ["Name"]}]}' \ --include-all-dependencies \ --export-format CLOUDFORMATION_JSON # poll job description until status is success Asset bundle export operations 58 Amazon QuickSight Developer Guide aws quicksight describe-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID # download the provided bundle (wget used here - any tool or browser works as well) wget -O ~/qs-bundle.qs 'https://the-long-url-from-your-job-description...' The following example uses a StartAssetBundleExportJob API call to create a QuickSight asset bundle file. # configure and start the export job aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/ RESOURCEID","arn:aws:quicksight:REGION:AWSACCOUNTID:analysis/RESOURCEID"]' \ --include-all-dependencies \ --export-format QUICKSIGHT_JSON # poll job description until status is success aws quicksight describe-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID # download the provided bundle (wget used here - any tool or browser works as well) wget -O ~/qs-bundle.qs 'https://the-long-url-from-your-job-description...' The following example uses a StartAssetBundleExportJob API call to include information for tags. By default, tag information is not exported. 
# Export in QuickSight format aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/ RESOURCEID","arn:aws:quicksight:REGION:AWSACCOUNTID:analysis/RESOURCEID"]' \ --include-all-dependencies \ --include-tags \ --export-format QUICKSIGHT_JSON Asset bundle export operations 59 Amazon QuickSight Developer Guide # Export in CloudFormation format, with optional tags overrides aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/ RESOURCEID","arn:aws:quicksight:REGION:AWSACCOUNTID:analysis/RESOURCEID"]' \ --include-all-dependencies \ --include-tags \ --export-format CLOUDFORMATION_JSON The following example uses a StartAssetBundleExportJob API call to include permissions information. By default, permission information is not exported. Permission overrides are not supported for the AWS CloudFormation format. To import permissions for a AWS CloudFormation format file, make sure that the source and target accounts are the same or have the same principal names for users, groups, and namespaces. # Export in QuickSight format aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/ RESOURCEID","arn:aws:quicksight:REGION:AWSACCOUNTID:analysis/RESOURCEID"]' \ --include-all-dependencies \ --include-permissions \ --export-format QUICKSIGHT_JSON # Export in CloudFormation format aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/ RESOURCEID","arn:aws:quicksight:REGION:AWSACCOUNTID:analysis/RESOURCEID"]' \ --include-all-dependencies \ --include-permissions \ --export-format CLOUDFORMATION_JSON The following example recusrively exports a folder along with all subfolders and the parent folder tree. Asset bundle export operations 60 Amazon QuickSight Developer Guide aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:folder/RESOURCEID"]' \ --include-all-dependencies \ --include-folder-members RECURSE \ --export-format QUICKSIGHT_JSON" The following example runs a dashboard export job that preserves the dashboard's folder memberships. aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/RESOURCEID"]' \ --include-folder-memberships \ --export-format QUICKSIGHT_JSON The following example calls the StartAssetBundleExportJob API with Strict mode. aws quicksight start-asset-bundle-export-job --aws-account-id AWSACCOUNTID \ --asset-bundle-export-job-id JOBID \ --resource-arns '["arn:aws:quicksight:REGION:AWSACCOUNTID:dashboard/ RESOURCEID","arn:aws:quicksight:REGION:AWSACCOUNTID:analysis/RESOURCEID"]' \ --include-all-dependencies \ --export-format QUICKSIGHT_JSON Asset bundle import operations Use asset bundle import operations to import QuickSight assets from an QuickSight bundle file that's generated by an earlier export job. The following statements apply to asset bundle import operations. 
• You can only import QuickSight JSON bundles with the QuickSight asset bundle APIs. CloudFormation JSON files can only be imported using the AWS CloudFormation console or APIs. Both QuickSight JSON and CloudFormation JSON files support property value overrides. If you want to generate a QuickSight JSON file, property overrides are specified when you use an import API call. If you want to generate a CloudFormation JSON file, property overrides are Asset bundle import operations 61 Amazon QuickSight Developer Guide configured with the cloud-formation-override-property-configuration parameter when you create or update the CloudFormation stack. You can import files that were created from |
amazon-quicksight-dg-020 | amazon-quicksight-dg.pdf | 20 | You can only import QuickSight JSON bundles with the QuickSight asset bundle APIs. CloudFormation JSON files can only be imported using the AWS CloudFormation console or APIs. Both QuickSight JSON and CloudFormation JSON files support property value overrides. If you want to generate a QuickSight JSON file, property overrides are specified when you use an import API call. If you want to generate a CloudFormation JSON file, property overrides are Asset bundle import operations 61 Amazon QuickSight Developer Guide configured with the cloud-formation-override-property-configuration parameter when you create or update the CloudFormation stack. You can import files that were created from your account, or you can import asset bundle files that were generated from other QuickSight accounts. When you create a new import job, you can choose to provide overrides when you configure the import job. • The asset bundle import operations only support .qs format zip files. The .qs format file that contains the asset bundle that you want to import is in an Amazon S3 bucket or in a BASE64 encoded file that you can add to the import job directly. The S3 bucket exists in the same AWS account as your QuickSight account. • All import jobs run asynchronously after they are started. Poll the status of an import job with a DescribeAssetBundleImportJob API call to know the current status of the job. If an asset bundle import job fails, you can choose to have all assets that were successfully imported during the failed job rollback. Information about the error that caused the job to fail is returned in the job description of a DescribeAssetBundleImportJob API call. • All of an imported assets' dependencies must be present for an asset import job to succeed. You can include all dependencies of the asset when you export it. Alternatively, you can configure all dependencies in the QuickSight account that you want to move the asset into. For example, to import a dashboard, the dataset, data source, and theme that the dashboard uses must exist in the account that you're importing the asset into. The caller must have permissions to describe, create, and update all QuickSight resources located in the asset that you want to import. • After an import job succeeds, grant permissions to all users or user groups that need to access the newly created resource. If you want to override the properties of the QUICKSIGHT JSON format export, provide the new values when you start an import job. If you want to override properties in a CLOUDFORMATION JSON format export, provide the property names to override when you start an export job. Then and add the new values when the stack is created in the CloudFormation console or with the AWS CloudFormation APIs. Permissions are not propagated through the asset bundles. You can update asset permissions with an UpdateDashboardPermissions API call. Use the following sections to learn more about the asset bundle import API operations. StartAssetBundleImportJob Import jobs are configured with the StartAssetBundleImportJobRequest object. Asset bundle import operations 62 Amazon QuickSight Developer Guide Import jobs are identified by an AssetBundleImportJobId that you provide when you create the new import job. This ID is unique while the job is running. After the job is completed, you can reuse this ID for another job. Provide an Amazon S3 uri or a base64-encoded ZIP file to the request. 
If you use an Amazon S3 URI, the caller must have GetObject permissions. All assets contained in the file are imported into the target account. You can choose to configure override values to be applied to specific assets when they are imported. All imported data sources must have credential overrides. You can store asset credentials in AWS Secrets Manager, or you can set a username and password directly in an override. If you use Secrets Manager, provide the secret ARN in the data source override. The caller must have GetSecretValue and DescribeSecret permissions on that secret to use it in the override. All import jobs run asynchronously after they are started. Poll the status of an import job with a DescribeAssetBundleImportJob call to know the current status of the job. Callers must have read and write permissions for all of the resource types that are imported, including the optional dependencies that are included in the import job. When an asset import job fails, you can choose to have all assets that were successfully imported during the failed job roll back automatically. If you don't choose to roll back the assets, successfully imported assets will still exist in the account that they are imported to. Information about the error that caused the job to fail is returned in the job description of a DescribeAssetBundleImportJob API call. For more information about the StartAssetBundleImportJob operation, see StartAssetBundleImportJob in the QuickSight API Reference.
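The Secrets Manager option described above is supplied as a data source override when you start the import job. The command below is a sketch only: the SecretArn member name is assumed from the import job's credential structure rather than confirmed by this excerpt, and the bucket, data source ID, and secret ARN are placeholders.

# import a bundle and point one imported data source at an AWS Secrets Manager secret
aws quicksight start-asset-bundle-import-job --aws-account-id 555555555555 \
  --asset-bundle-import-job-id secret-override-job \
  --asset-bundle-import-source '{"S3Uri": "s3://bucket/key/qs-bundle.qs"}' \
  --override-parameters '{"DataSources": [{"DataSourceId": "some-data-source-id", "Credentials": {"SecretArn": "arn:aws:secretsmanager:us-east-1:555555555555:secret:my-datasource-secret"}}]}' \
  --failure-action ROLLBACK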
amazon-quicksight-dg-021 | amazon-quicksight-dg.pdf | 21 | that are exported, including the optional dependencies that are included in the export job. When an asset import job fails, you can choose to have all assets that were successfully imported during the failed job roll back automatically. If you don't choose to roll back the assets, successfully imported assets will still exist in the account that they are imported to. Information about the error that caused the job to fail is returned in the job description of a DescribeAssetBundleImportJob API call. For more information about the StartAssetBundleImportJob operation, see StartAssetBundleImportJob in the QuickSight API Reference. VPC overrides When you make a StartAssetBundleImportJob API call, provide an override parameter for the VPC connection that's configured to your QuickSight account. You can find the OverrideParameters value in the asset bundle file that was created when the asset was exported. The following example shows an OverrideParameters structure that uses the PrefixForAllResources value. "OverrideParameters": { "VPCConnections": [ { Asset bundle import operations 63 Amazon QuickSight Developer Guide "VPCConnectionId": "<PrefixForAllResources<VPCConnectionId in asset bundle file" "DnsResolvers": [ "string" ], "Name": "string", "RoleArn": "string", "SecurityGroupIds": [ "string" ], "SubnetIds": [ "string" ] } ] } For more information about setting up a VPC connection in QuickSight, see Configuring the VPC connection with the QuickSight CLI. Permissions The following statements apply to asset bundle import permissions. • Asset bundle import operations support up to 64 principals. • The final state of an asset bundle's permissions are determined by the following. • If the OverridePermissions parameter is provided in the input, all existing permissions are replaced by the permissions that are specified in the OverridePermissions parameter. • If the asset bundle was exported with permissions, all existing permissions are replaced by the permissions that are in the exported asset bundle's file. • If neither of the above conditions are met, no changes are made to the asset's permissions. • If the caller executes an asset bundle import job from a different account than the account that the asset bundle was exported from, there are differences in the the user, group, and namespace principal ARNs. When this happens, provide the correct ARN values in the OverridePermissions parameter. Tags The final state of an asset's tags are determined by the following. • If the OverrideTags parameter is provided in the API input, all existing tags are replaced by the tags that are specified in the OverrideTags parameter. • If the asset bundle file is exported with tags, all existing tags are replaced by the tags that are in the asset bundle's file. Asset bundle import operations 64 Amazon QuickSight Developer Guide • If neither of the above statements aren't met, no changes are made to the asset's tags. DescribeAssetBundleImportJob Use the DescribeAssetBundleImportJob operation to obtain the current status of an existing export job that's up to 14 days old. You can also use this operation to review a specified job's configuration. Failed import jobs return error information in their description. Poll this operation until the import job that you want the status of has succeeded or failed. For more information about the DescribeAssetBundleImportJob operation, see DescribeAssetBundleImportJob in the QuickSight API Reference. 
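Filled in with concrete values, the VPCConnections override shown above might look like the following file, which could then be passed with --override-parameters file://import-override-parameter-prod.json as in the examples that follow. Every value here is a placeholder sketch, and the VPC connection ID must match the (optionally prefixed) ID recorded in the exported bundle.

// import-override-parameter-prod.json (hypothetical values)
{
  "VPCConnections": [
    {
      "VPCConnectionId": "VPCCONNECTIONID",
      "Name": "prod-vpc-connection",
      "SubnetIds": ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
      "SecurityGroupIds": ["sg-0123456789abcdef0"],
      "RoleArn": "arn:aws:iam::555555555555:role/QuickSightVPCRole"
    }
  ]
}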
ListAssetBundleImportJobs Use the ListAssetBundleImportJobs operation to retrieve a list of all import jobs that were created in the last 14 days. Import jobs are listed in the order that they were started, starting with the most recently started job. If you expect to have multiple lists by this operation, you can choose to specify a maximum page size to be returned and use a pagination token. For more information about the ListAssetBundleImportJobs operation, see ListAssetBundleImportJobs in the QuickSight API Reference. Examples The following example creates an asset bundle import job for a file that is located in the caller's Amazon S3 bucket. # upload your bundle to an S3 bucket in your account aws s3 cp ~/qs-bundle.qs s3://bucket/key/qs-bundle.qs aws quicksight start-asset-bundle-import-job --aws-account-id AWSACCOUNTID \ --asset-bundle-import-job-id JOBID \ --asset-bundle-import-source '{"S3Uri": "s3://bucket/key/qs-bundle.qs"}' \ --failure-action ROLLBACK # poll job description until status is success (or failed) aws quicksight describe-asset-bundle-import-job --aws-account-id AWSACCOUNTID \ --asset-bundle-import-job-id JOBID Asset bundle import operations 65 Amazon QuickSight Developer Guide # grant yourself or others permissions to view/modify the imported resources (for more information, see UpdateDashboardPermissions in the Amazon QuickSight API Reference) # open your QuickSight site in your browser and confirm the imported resources (important) The following example creates an asset bundle import job with a bundle file that's uploaded directly. This example also uses data source credential overrides. aws quicksight start-asset-bundle-import-job --aws-account-id AWSACCOUNTID \ --asset-bundle-import-job-id JOBID \ --asset-bundle-import-source-bytes fileb://~/qs-bundle.qs \ --asset-bundle-import-source-bytes fileb://~/qs-bundle.qs \ --override-parameters '{"DataSources": [{"DataSourceId": "some-data-source-id", "Credentials": {"CredentialPair": {"Username": "some-username", "Password": "some- password"}}}]}' \ --failure-action ROLLBACK # poll job description until status is success (or failed) aws quicksight describe-asset-bundle-import-job --aws-account-id AWSACCOUNTID \ --asset-bundle-import-job-id JOBID # grant yourself or others permissions to view/modify the imported resources (for |
amazon-quicksight-dg-022 | amazon-quicksight-dg.pdf | 22 | QuickSight API Reference) # open your QuickSight site in your browser and confirm the imported resources (important) The following example creates an asset bundle import job with a bundle file that's uploaded directly. This example also uses data source credential overrides. aws quicksight start-asset-bundle-import-job --aws-account-id AWSACCOUNTID \ --asset-bundle-import-job-id JOBID \ --asset-bundle-import-source-bytes fileb://~/qs-bundle.qs \ --asset-bundle-import-source-bytes fileb://~/qs-bundle.qs \ --override-parameters '{"DataSources": [{"DataSourceId": "some-data-source-id", "Credentials": {"CredentialPair": {"Username": "some-username", "Password": "some- password"}}}]}' \ --failure-action ROLLBACK # poll job description until status is success (or failed) aws quicksight describe-asset-bundle-import-job --aws-account-id AWSACCOUNTID \ --asset-bundle-import-job-id JOBID # grant yourself or others permissions to view/modify the imported resources (for more information, see UpdateDashboardPermissions in the Amazon QuickSight API Reference) # open your QuickSight site in your browser and confirm the imported resources (important) The Override parameters also accept local files, as shown in the example below. --override-parameters file://import-override-parameter-prod.json \ --override-permissions file://import-override-permission-prod.json \ --override-tags file://import-override-tags-prod.json \ If callers want to assign different permissions to exported assets, they can provide an override object at import. There are two ways that this can be done. • Explicitly specify the resource IDs. If a prefix ID is specified, include the prefix in the resource ID. • Use the wildcard "*" to represent all resources of a specific type in the asset bundle files. Asset bundle import operations 66 Amazon QuickSight Developer Guide In the example below, all dashboards that are included in the asset bundle file are imported with specified permissions. 
// import-override-permission-prod.json { "DataSources": [ { "DataSourceIds": ["DATASOURCEID"], "Permissions": { "Principals": ["arn:aws:quicksight:REGION:AWSACCOUNTID:user/ default/USERIR"], "Actions": [ "quicksight:UpdateDataSourcePermissions", "quicksight:DescribeDataSourcePermissions", "quicksight:PassDataSource", "quicksight:DescribeDataSource", "quicksight:DeleteDataSource", "quicksight:UpdateDataSource" ] } } ], "DataSets": [ { "DataSetIds": ["DATASETID"], "Permissions": { "Principals": ["arn:aws:quicksight:REGION:AWSACCOUNTID:user/ default/USERIR"], "Actions": [ "quicksight:DeleteDataSet", "quicksight:UpdateDataSetPermissions", "quicksight:PutDataSetRefreshProperties", "quicksight:CreateRefreshSchedule", "quicksight:CancelIngestion", "quicksight:PassDataSet", "quicksight:ListRefreshSchedules", "quicksight:UpdateRefreshSchedule", "quicksight:DeleteRefreshSchedule", "quicksight:DescribeDataSetRefreshProperties", "quicksight:DescribeDataSet", "quicksight:CreateIngestion", "quicksight:DescribeRefreshSchedule", "quicksight:ListIngestions", Asset bundle import operations 67 Amazon QuickSight Developer Guide "quicksight:DescribeDataSetPermissions", "quicksight:UpdateDataSet", "quicksight:DeleteDataSetRefreshProperties", "quicksight:DescribeIngestion" ] } } ], "Themes": [ { "ThemeIds": ["THEMEID"], "Permissions": { "Principals": ["arn:aws:quicksight:REGION:AWSACCOUNTID:user/ default/USERIR"], "Actions": [ "quicksight:ListThemeVersions", "quicksight:UpdateThemeAlias", "quicksight:DescribeThemeAlias", "quicksight:UpdateThemePermissions", "quicksight:DeleteThemeAlias", "quicksight:DeleteTheme", "quicksight:ListThemeAliases", "quicksight:DescribeTheme", "quicksight:CreateThemeAlias", "quicksight:UpdateTheme", "quicksight:DescribeThemePermissions" ] } } ], "Analyses": [ { "AnalysisIds": ["ANALYSISIDS"], "Permissions": { "Principals": ["arn:aws:quicksight:REGION:AWSACCOUNTID:user/ default/USERIR"], "Actions": [ "quicksight:RestoreAnalysis", "quicksight:UpdateAnalysisPermissions", "quicksight:DeleteAnalysis", "quicksight:DescribeAnalysisPermissions", "quicksight:QueryAnalysis", "quicksight:DescribeAnalysis", "quicksight:UpdateAnalysis" Asset bundle import operations 68 Amazon QuickSight ] } } ], "Dashboards": [ { Developer Guide "DashboardIds": ["*"], "Permissions": { "Principals": ["arn:aws:quicksight:REGION:AWSACCOUNTID:user/ default/USERIR"], "Actions": [ "quicksight:DescribeDashboard", "quicksight:ListDashboardVersions", "quicksight:UpdateDashboardPermissions", "quicksight:QueryDashboard", "quicksight:UpdateDashboard", "quicksight:DeleteDashboard", "quicksight:DescribeDashboardPermissions", "quicksight:UpdateDashboardPublishedVersion" ] } } ] } If callers want to assign different tags to imported assets, they can provide an override object at import. There are two ways that this can be done. • Explicitly specify the resource IDs. If a prefix ID is specified, include the prefix in the resource ID. • Use the wildcard "*" to represent all resources of a specific type in the asset bundle files. In the example below, all dashboards that are included in the asset bundle file are imported with specified tags. 
// import-override-tags-prod.json { "DataSources": [ { "DataSourceIds": ["DATASOURCEID"], "Tags": [ { Asset bundle import operations 69 Amazon QuickSight Developer Guide "Key": "tagkey_datasource", "Value": "tagvalue_datasource" }, { "Key": "tagkey2_datasource", "Value": "tagvalue2_datasource" } ] } ], "DataSets": [ { "DataSetIds": ["*"], "Tags": [ { "Key": "tagkey_dataset", "Value": "tagvalue_dataset" }, { "Key": "tagkey2_dataset", "Value": "tagvalue2_dataset" } ] } ], "Themes": [ { "ThemeIds": ["*"], "Tags": [ { "Key": "tagkey_theme", "Value": "tagvalue_theme" }, { "Key": "tagkey2_theme", "Value": "tagvalue2_theme" } ] } ], "Analyses": [ { "AnalysisIds": ["*"], "Tags": [ Asset bundle import operations 70 Developer Guide Amazon QuickSight { "Key": "tagkey_analysis", "Value": "tagvalue_analysis" }, { "Key": "tagkey2_analysis", "Value": "tagvalue2_analysis" } ] } ], "Dashboards": [ { "DashboardIds": ["*"], "Tags": [ { "Key": "tagkey_dashboard", "Value": "tagvalue_dashboard" }, { "Key": "tagkey2_dashboard", "Value": "tagvalue2_dashboard" } ] } ] } If you want to import an asset bundle file with Strict mode, use the OverrideValidationStrategy parameter and set StrictModeForAllResources to True. The following example calls the StartAssetBundleImportJob API with Strict mode. aws quicksight start-asset-bundle-import-job --aws-account-id AWSACCOUNTID \ --asset-bundle-import-job-id JOBID \ --asset-bundle-import-source-bytes fileb://~/qs-bundle.qs \ --override-validation-strategy '{"StrictModeForAllResources":true}' Dashboard operations With dashboard API operations, you can perform actions on Amazon QuickSight dashboards. For more information, see the following API operations. Dashboard operations 71 Developer Guide Amazon QuickSight Topics • Dashboard permissions • CreateDashboard • DeleteDashboard • DescribeDashboard • ListDashboards • ListDashboardVersions • SearchDashboards • UpdateDashboard • UpdateDashboardPublishedVersion Dashboard permissions With dashboard permissions API operations, you can view and update permissions for dashboards. For more information, see the following API operations. Topics • DescribeDashboardPermissions • UpdateDashboardPermissions DescribeDashboardPermissions Use the DescribeDashboardPermissions API operation to view the read and write permissions for a dashboard. To use this operation, you need the ID of the dashboard whose permissions you want to view. The dashboard ID is part of the dashboard URL in QuickSight. You can also use the ListDashboards API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-dashboard-permissions --aws-account-id 555555555555 |
--dashboard-id 111122223333
For more information about the DescribeDashboardPermissions API operation, see DescribeDashboardPermissions in the Amazon QuickSight API Reference.
UpdateDashboardPermissions
Use the UpdateDashboardPermissions API operation to update read and write permissions for a dashboard. You can grant or revoke permissions in the same command. To use this operation, you need the ID of the dashboard whose permissions you want to update. The dashboard ID is part of the dashboard URL in QuickSight. You can also use the ListDashboards API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-dashboard-permissions --aws-account-id 555555555555 --dashboard-id DASHBOARDID --grant-permissions Principal=arn:aws:quicksight:us-east-1:555555555555:user/default/USERNAME,Actions=quicksight:DescribeDashboard,quicksight:QueryDashboard,quicksight:ListDashboardVersions --revoke-permissions Principal=arn:aws:quicksight:us-east-1:555555555555:user/default/USERNAME,Actions=quicksight:DescribeDashboard,quicksight:QueryDashboard,quicksight:ListDashboardVersions
If your region has already been configured within the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-dashboard-permissions --cli-input-json file://updatedashboardpermissions.json
If your region has already been configured with the CLI, it does not need to be included in an argument.
For more information about the UpdateDashboardPermissions API operation, see UpdateDashboardPermissions in the Amazon QuickSight API Reference.
CreateDashboard
Use the CreateDashboard API operation to create a dashboard.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight create-dashboard --aws-account-id 555555555555 --dashboard-id newDash --name Dashboard1 --source-entity '{"SourceTemplate":{"DataSetReferences":[{"DataSetPlaceholder":"PLACEHOLDER","DataSetArn":"arn:aws:quicksight:REGION:555555555555:dataset/DATASETID"}],"Arn":"arn:aws:quicksight:REGION:555555555555:template/TEMPLATEID"}}'
If your region has already been configured within the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight create-dashboard --cli-input-json file://createdashboard.json
If your region has already been configured with the CLI, it does not need to be included in an argument.
For more information about the CreateDashboard API operation, see CreateDashboard in the Amazon QuickSight API Reference.
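The contents of the createdashboard.json skeleton aren't shown in this guide. As a rough sketch, a minimal skeleton that mirrors the parameters in the inline create-dashboard command above might look like the following. The placeholder IDs and ARNs are assumptions that you would replace with your own values, and you can generate the full skeleton, including optional parameters, with aws quicksight create-dashboard --generate-cli-skeleton.
{
    "AwsAccountId": "555555555555",
    "DashboardId": "newDash",
    "Name": "Dashboard1",
    "SourceEntity": {
        "SourceTemplate": {
            "DataSetReferences": [
                {
                    "DataSetPlaceholder": "PLACEHOLDER",
                    "DataSetArn": "arn:aws:quicksight:REGION:555555555555:dataset/DATASETID"
                }
            ],
            "Arn": "arn:aws:quicksight:REGION:555555555555:template/TEMPLATEID"
        }
    }
}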
DeleteDashboard
Use the DeleteDashboard API operation to delete a dashboard. To use this operation, you need the ID of the dashboard that you want to delete. The dashboard ID is part of the dashboard URL in QuickSight. You can also use the ListDashboards API operation to get the ID. You can add a VersionNumber parameter to this operation to only delete the specified version of the dashboard.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight delete-dashboard --aws-account-id 555555555555 --dashboard-id DASHBOARDID
For more information about the DeleteDashboard API operation, see DeleteDashboard in the Amazon QuickSight API Reference.
DescribeDashboard
Use the DescribeDashboard API operation to view the summary of a dashboard. To use this operation, you need the ID of the dashboard that you want to view. The dashboard ID is part of the dashboard URL in QuickSight. You can also use the ListDashboards API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight describe-dashboard --aws-account-id 555555555555 --dashboard-id DASHBOARDID
For more information about the DescribeDashboard API operation, see DescribeDashboard in the Amazon QuickSight API Reference.
ListDashboards
Use the ListDashboards API operation to list dashboards in an AWS account.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight list-dashboards --aws-account-id 555555555555 --page-size 10 --max-items 100
For more information about the ListDashboards API operation, see ListDashboards in the Amazon QuickSight API Reference.
ListDashboardVersions
Use the ListDashboardVersions API operation to list all the versions of a dashboard in an AWS account. To use this operation, you need the ID of the dashboard whose versions you want to list. The dashboard ID is part of the dashboard URL in QuickSight. You can also use the ListDashboards API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight list-dashboard-versions --aws-account-id AWSACCOUNTID --dashboard-id DASHBOARD --page-size 10 --max-items 100
For more information about the ListDashboardVersions API operation, see ListDashboardVersions in the Amazon QuickSight API Reference.
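Because several of the operations in this chapter ask for a dashboard ID, it can help to pull just the IDs and names out of the ListDashboards response. The following command is one way to do that with the CLI's --query option; it assumes the response list is named DashboardSummaryList, so check the output of a plain list-dashboards call or the API reference before relying on it.
aws quicksight list-dashboards --aws-account-id 555555555555 --query 'DashboardSummaryList[].{Id:DashboardId,Name:Name}' --output table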
SearchDashboards
Use the SearchDashboards API operation to search for dashboards in an AWS account.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight search-dashboards --aws-account-id 555555555555 --filters Operator=StringEquals,Name=QUICKSIGHT_USER,Value=arn:aws:quicksight:us-east-1:555555555555:user/default/USERNAME --page-size 10 --max-items 100
If your region has already been configured with the CLI, it does not need to be included as an argument.
For more information about the SearchDashboards API operation, see SearchDashboards in the Amazon QuickSight API Reference.
UpdateDashboard
Use the UpdateDashboard API operation to update a dashboard in an AWS account. To use this operation, you need the ID of the dashboard that you want to update. The dashboard ID is part of the dashboard URL in QuickSight. You can also use the ListDashboards API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-dashboard --aws-account-id 555555555555 --dashboard-id DASHBOARDID --name Dashboard --source-entity '{"SourceTemplate":{"DataSetReferences":[{"DataSetPlaceholder": "PLACEHOLDER","DataSetArn": "arn:aws:quicksight:<region>:<awsaccountid>:dataset/<datasetid>"}],"Arn": "arn:aws:quicksight:<region>:<awsaccountid>:template/<templateid>"}}' --version-description VERSION --dashboard-publish-options AdHocFilteringOption={AvailabilityStatus=ENABLED},ExportToCSVOption={AvailabilityStatus=ENABLED},SheetControlsOption={VisibilityState=EXPANDED} --theme-arn THEMEARN
If your region has already been configured within the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-dashboard --cli-input-json file://updatedashboard.json
If your region has already been configured with the CLI, it does not need to be included in an argument.
For more information about the UpdateDashboard API operation, see UpdateDashboard in the Amazon QuickSight API Reference.
UpdateDashboardPublishedVersion
Use the UpdateDashboardPublishedVersion API operation to update the published version of a dashboard. To use this operation, you need the ID of the published dashboard that you want to update. The dashboard ID is part of the dashboard URL in QuickSight. You can also use the ListDashboards API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI aws quicksight update-dashboard-published-version --aws-account-id 555555555555 --dashboard-id DASHBOARDID --dashboard-version-number VERSION You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight update-dashboard-published-version --cli-input-json file://updatedashboardpublishedversion.json For more information about the UpdateDashboardPublishedVersion API operation, see UpdateDashboardPublishedVersion in the Amazon QuickSight API Reference. Data source operations With data source operations, you can perform actions on data sources. For more information, see the following API operations. UpdateDashboardPublishedVersion 78 Developer Guide Amazon QuickSight Topics • Data source permissions • CreateDataSource • DeleteDataSource • DescribeDataSource • ListDataSources • UpdateDataSource Data source permissions With data source permissions API operations, you can view and update permissions for a data source. For more information, see the following API operations. Topics • DescribeDataSourcePermissions • UpdateDataSourcePermissions DescribeDataSourcePermissions Use the DescribeDataSourcePermissions API operation to describe the resource permissions for a data source. To use this operation, you need the ID of the data source whose permissions you want to view. The data source ID is part of the data source URL in QuickSight. You can also use the ListDataSources API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-data-source-permissions --aws-account-id AWSACCOUNTID --data-source-id DATASOURCEID For more information about the DescribeDataSourcePermissions API operation, see DescribeDataSourcePermissions in the Amazon QuickSight API Reference. Data source permissions 79 Amazon QuickSight Developer Guide UpdateDataSourcePermissions Use the UpdateDataSourcePermissions API operation to update the resource permissions for a data source. You can grant or revoke permissions in the same command. To use this operation, you need the ID of the data source whose permissions you want to update. The data source ID is part of the data source URL in QuickSight. You can also use the ListDataSources API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight update-data-source-permissions --aws-account-id AWSACCOUNTID --data-source-id DATASOURCEID --grant-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/ default/ USERNAME,Actions=quicksight:DescribeDataSource,quicksight:DescribeDataSourcePermissions,quicksight:PassDataSource --revoke-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/ default/ USERNAME,Actions=quicksight:DescribeDataSource,quicksight:DescribeDataSourcePermissions,quicksight:PassDataSource If your region has already been configured within the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. |
to update. The data source ID is part of the data source URL in QuickSight. You can also use the ListDataSources API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-data-source-permissions --aws-account-id AWSACCOUNTID --data-source-id DATASOURCEID --grant-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/default/USERNAME,Actions=quicksight:DescribeDataSource,quicksight:DescribeDataSourcePermissions,quicksight:PassDataSource --revoke-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/default/USERNAME,Actions=quicksight:DescribeDataSource,quicksight:DescribeDataSourcePermissions,quicksight:PassDataSource
If your region has already been configured within the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-data-source-permissions --cli-input-json file://updatedatasourcepermissions.json
If your region has already been configured with the CLI, it does not need to be included in an argument.
For more information about the UpdateDataSourcePermissions API operation, see UpdateDataSourcePermissions in the Amazon QuickSight API Reference.
CreateDataSource
Use the CreateDataSource API operation to create a data source.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight create-data-source --aws-account-id AWSACCOUNTID --data-source-id DATASOURCEID --name NAME --type ATHENA
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight create-data-source --cli-input-json file://createdatasource.json
For more information about the CreateDataSource API operation, see CreateDataSource in the Amazon QuickSight API Reference.
DeleteDataSource
Use the DeleteDataSource API operation to permanently delete a data source from Amazon QuickSight. To use this operation, you need the ID of the data source that you want to delete. The data source ID is part of the data source URL in QuickSight. You can also use the ListDataSources API operation to get the ID.
Note: Deleting a data source breaks all datasets that reference it.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight delete-data-source --aws-account-id AWSACCOUNTID --data-source-id DATASOURCEID
For more information about the DeleteDataSource API operation, see DeleteDataSource in the Amazon QuickSight API Reference.
DescribeDataSource
Use the DescribeDataSource API operation to describe a data source. To use this operation, you need the ID of the data source that you want to view. The data source ID is part of the data source URL in QuickSight. You can also use the ListDataSources API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight describe-data-source --aws-account-id AWSACCOUNTID --data-source-id DATASOURCEID
For more information about the DescribeDataSource API operation, see DescribeDataSource in the Amazon QuickSight API Reference.
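The createdatasource.json skeleton referenced in the CreateDataSource section above might be filled in as follows for an Athena data source. This is only a sketch, not the complete schema: the workgroup name is an assumption, and other data source types require a different DataSourceParameters block.
{
    "AwsAccountId": "AWSACCOUNTID",
    "DataSourceId": "DATASOURCEID",
    "Name": "NAME",
    "Type": "ATHENA",
    "DataSourceParameters": {
        "AthenaParameters": {
            "WorkGroup": "primary"
        }
    }
}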
ListDataSources Use the ListDataSources API operation to list all data sources in the current AWS Region that belong to a particular AWS account. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight list-data-sources --aws-account-id AWSACCOUNTID --page-size 10 --max-items 100 For more information about the ListDataSources API operation, see ListDataSources in the Amazon QuickSight API Reference. DescribeDataSource 82 Amazon QuickSight UpdateDataSource Developer Guide Use the UpdateDataSource API operation to update a data source. To use this operation, you need the ID of the data source that you want to update. The data source ID is part of the data source URL in QuickSight. You can also use the ListDataSources API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight update-data-source --aws-account-id AWSACCOUNTID --data-source-id DATASOURCEID You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight update-data-source --cli-input-json file://updatedatasource.json For more information about the UpdateDataSource API operation, see UpdateDataSource in the Amazon QuickSight API Reference. Dataset operations With dataset operations, you can perform actions on Amazon QuickSight datasets. For more information, see the following API operations. Topics • Dataset permissions operations • CreateDataSet • DeleteDataSet • DescribeDataSet • ListDataSets • UpdateDataSet UpdateDataSource 83 Amazon QuickSight Developer Guide Dataset permissions operations With dataset permissions API operations, you can view and update permissions on a dataset. For more information, see the following API operations. Topics • DescribeDataSetPermissions • UpdateDataSetPermissions DescribeDataSetPermissions Use the DescribeDataSetPermissions API operation to describe the permissions on a dataset. To use this operation, you need the ID of the dataset whose permissions that you want to view. The dataset ID is part of the dataset URL in QuickSight. You can also use the ListDataSets API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-data-set-permissions --aws-account-id AWSACCOUNTID --data-set-id DATASETID For more information about the DescribeDataSetPermissions API operation, see DescribeDataSetPermissions in the Amazon QuickSight API Reference. UpdateDataSetPermissions Use the UpdateDataSetPermissions API operation to update the permissions on a dataset. You can grant or revoke permissions in the same command. To use |
this operation, you need the ID of the dataset whose permissions you want to update. The dataset ID is part of the dataset URL in QuickSight. You can also use the ListDataSets API operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-data-set-permissions --aws-account-id AWSACCOUNTID --data-set-id DATASETID --grant-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/default/USERNAME,Actions=quicksight:DescribeDataSet,quicksight:DescribeDataSetPermissions,quicksight:PassDataSet,quicksight:DescribeIngestion,quicksight:ListIngestions --revoke-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/default/USERNAME,Actions=quicksight:DescribeDataSet,quicksight:DescribeDataSetPermissions,quicksight:PassDataSet,quicksight:DescribeIngestion,quicksight:ListIngestions
If your region has already been configured with the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-data-set-permissions --cli-input-json file://updatedatasetpermissions.json
If your region has already been configured with the CLI, it does not need to be included in an argument.
For more information about the UpdateDataSetPermissions API operation, see UpdateDataSetPermissions in the Amazon QuickSight API Reference.
CreateDataSet
Use the CreateDataSet API operation to create a dataset.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight create-data-set --aws-account-id AWSACCOUNTID --data-set-id DATASETID --name NAME --physical-table-map '{"PhysicalTableMap":{"string":{"CustomSql":{"Columns":[{"Name":"string","Type":"string"}],"DataSourceArn":"string","Name":"string","SqlQuery":"string"},"RelationalTable":{"Catalog":"string","DataSourceArn":"string","InputColumns":[{"Name":"string","Type":"string"}],"Name":"string","Schema":"string"},"S3Source":{"DataSourceArn":"string","InputColumns":[{"Name":"string","Type":"string"}],"UploadSettings":{"ContainsHeader":boolean,"Delimiter":"string","Format":"string","StartFromRow":number,"TextQualifier":"string"}}}}' --import-mode DIRECT_QUERY
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight create-data-set --cli-input-json file://createdataset.json
For more information about the CreateDataSet API operation, see CreateDataSet in the Amazon QuickSight API Reference.
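As a concrete sketch of what the createdataset.json skeleton might contain, the following defines a single CustomSql physical table in DIRECT_QUERY mode. The table key, data source ARN, SQL query, and column definitions are assumptions used only for illustration; replace them with values that match your own data source.
{
    "AwsAccountId": "AWSACCOUNTID",
    "DataSetId": "DATASETID",
    "Name": "NAME",
    "ImportMode": "DIRECT_QUERY",
    "PhysicalTableMap": {
        "physical-table-1": {
            "CustomSql": {
                "DataSourceArn": "arn:aws:quicksight:REGION:AWSACCOUNTID:datasource/DATASOURCEID",
                "Name": "sales",
                "SqlQuery": "SELECT order_id, amount FROM sales",
                "Columns": [
                    { "Name": "order_id", "Type": "STRING" },
                    { "Name": "amount", "Type": "DECIMAL" }
                ]
            }
        }
    }
}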
DeleteDataSet Use the DeleteDataSet API operation to delete a dataset. To use this operation, you need the ID of the dataset that you want to delete. The dataset ID is part of the dataset URL in QuickSight. You can also use the ListDataSets API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight delete-data-set --aws-account-id AWSACCOUNTID --data-set-id DATASETID For more information about the DeleteDataSet API operation, see DeleteDataSet in the Amazon QuickSight API Reference. DescribeDataSet Use the DescribeDataSet API operation to describe a dataset. To use this operation, you need the ID of the dataset that you want to describe. The dataset ID is part of the dataset URL in QuickSight. You can also use the ListDataSets API operation to get the ID. Following is an example AWS CLI command for this operation. DeleteDataSet 86 Amazon QuickSight AWS CLI aws quicksight describe-data-set --aws-account-id AWSACCOUNTID --data-set-id DATASETID Developer Guide For more information about the DescribeDataSet API operation, see DescribeDataSet in the Amazon QuickSight API Reference. ListDataSets Use the ListDataSets API operation to list all of the datasets that belong to a particular AWS account in an AWS Region. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight list-data-sets --aws-account-id AWSACCOUNTID --page-size 10 --max-items 100 For more information about the ListDataSets API operation, see ListDataSets in the Amazon QuickSight API Reference. UpdateDataSet Use the UpdateDataSet API operation to update a dataset. To use this operation, you need the ID of the dataset that you want to update. The dataset ID is part of the dataset URL in QuickSight. You can also use the ListDataSets API operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight update-data-set --aws-account-id AWSACCOUNTID --data-set-id DATASETID --name NAME ListDataSets 87 Amazon QuickSight Developer Guide --physical-table-map PHYSICALTABLEMAP --import-mode IMPORTMODE You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight update-data-set --cli-input-json file://updatedataset.json For more information about the UpdateDataSet API operation, see UpdateDataSet in the Amazon QuickSight API Reference. Folder operations In QuickSight Enterprise Edition, you can create personal and shared folders to add hierarchical structure to QuickSight asset management. Using folders, people can more easily organize, navigate through, and discover dashboards, analyses, and datasets. Within a folder, you can still use your usual tools to search for assets or to add assets to your favorites list. For more information about folders, see Organizing Assets into Folders for QuickSight in the Amazon QuickSight User Guide. Using the AWS CLI, you can |
use the following operations to create, search, update, and delete folders in your QuickSight account:
Topics • Folder membership operations • Folder permissions operations • CreateFolder • DeleteFolder • DescribeFolder • ListFolders • SearchFolders • UpdateFolder
Folder membership operations
With folder membership API operations, you can add, remove, and list the assets, such as dashboards, analyses, or datasets, that belong to a folder. For more information, see the following API operations: • CreateFolderMembership • DeleteFolderMembership • ListFolderMembers
CreateFolderMembership
Use the CreateFolderMembership operation to add an asset, such as a dashboard, analysis, or dataset, to a folder. To use this operation, you need the member ID of the asset that you want to add to a folder. The member ID is the analysis, dashboard, or dataset ID of the asset that you want to add to a folder. The member ID is part of the analysis, dashboard, or dataset URL in QuickSight. You can also use the ListAnalyses, ListDashboards, or ListDataSets operations to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight create-folder-membership --aws-account-id AWSACCOUNTID --folder-id FOLDERID --member-id 444455556666 --member-type DASHBOARD
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight create-folder-membership --cli-input-json file://createfoldermembership.json
For more information about the CreateFolderMembership operation, see CreateFolderMembership in the QuickSight API Reference.
DeleteFolderMembership
Use the DeleteFolderMembership operation to delete an asset, such as a dashboard, analysis, or dataset, from a folder. To use this operation, you need the member ID of the asset that you want to remove from a folder. The member ID is the analysis, dashboard, or dataset ID of the asset that you want to remove. The member ID is part of the analysis, dashboard, or dataset URL in QuickSight. You can also use the ListAnalyses, ListDashboards, or ListDataSets operations to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight delete-folder-membership --aws-account-id AWSACCOUNTID --folder-id FOLDERID --member-id 444455556666 --member-type DASHBOARD
For more information about the DeleteFolderMembership operation, see DeleteFolderMembership in the QuickSight API Reference.
ListFolderMembers
Use the ListFolderMembers operation to list all assets (DASHBOARD, ANALYSIS, and DATASET) that are in a folder.
To use this operation, you need the ID of the folder whose permissions you want to view. The folder ID is part of the folder URL in QuickSight. You can also use the ListFolders operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight list-folder-members --aws-account-id AWSACCOUNTID --folder-id FOLDERID --page-size 10 --max-items 100 Folder membership operations 90 Amazon QuickSight Developer Guide For more information about the ListFolderMembers operation, see ListFolderMembers in the QuickSight API Reference. Folder permissions operations With folder permission API operations, you can view and update permissions for folders. For more information, see the following API operations: • UpdateFolderPermissions • DescribeFolderPermissions • DescribeFolderResolvedPermissions DescribeFolderPermissions Use the DescribeFolderPermissions operation to describe the permissions of a folder. To use this operation, you need the ID of the folder whose permissions you want to view. The folder ID is part of the folder URL in QuickSight. You can also use the ListFolders operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-folder-permissions --aws-account-id AWSACCOUNTID --folder-id FOLDERID You can find the folder ID by using a ListFolders operation or through the URL in the QuickSight user interface. For more information about the DescribeFolderPermissions operation, see DescribeFolderPermissions in the QuickSight API Reference. DescribeFolderResolvedPermissions Use the DescribeFolderResolvedPermissions operation to describe the resolved permissions of a folder. Permissions consist of both folder direct permissions and the inherited permissions from the ancestor folders. To use this operation, you need the ID of the folder whose permissions Folder permissions operations 91 Amazon QuickSight Developer Guide you want to view. The folder ID is part of the folder URL in QuickSight. You can also use the ListFolders operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI |
aws quicksight describe-folder-resolved-permissions --aws-account-id AWSACCOUNTID --folder-id FOLDERID
For more information about the DescribeFolderResolvedPermissions operation, see DescribeFolderResolvedPermissions in the QuickSight API Reference.
UpdateFolderPermissions
Use the UpdateFolderPermissions operation to update the permissions of a folder. You can grant or revoke permissions in the same command. To use this operation, you need the ID of the folder whose permissions you want to update. The folder ID is part of the folder URL in QuickSight. You can also use the ListFolders operation to get the ID.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-folder-permissions --aws-account-id AWSACCOUNTID --folder-id FOLDERID --grant-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/default/USERNAME,Actions=quicksight:CreateFolder,quicksight:DescribeFolder,quicksight:UpdateFolder,quicksight:DeleteFolder,quicksight:CreateFolderMembership,quicksight:DeleteFolderMembership,quicksight:DescribeFolderPermissions,quicksight:UpdateFolderPermissions --revoke-permissions Principal=arn:aws:quicksight:us-east-1:AWSACCOUNTID:user/default/USERNAME,Actions=quicksight:CreateFolder,quicksight:DescribeFolder,quicksight:UpdateFolder,quicksight:DeleteFolder,quicksight:CreateFolderMembership,quicksight:DeleteFolderMembership,quicksight:DescribeFolderPermissions,quicksight:UpdateFolderPermissions
If your region has already been configured within the CLI, it doesn't need to be included as an argument. You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-folder-permissions --cli-input-json file://updatefolderpermissions.json
If your region has already been configured with the CLI, it does not need to be included in an argument.
For more information on the UpdateFolderPermissions operation, see UpdateFolderPermissions in the QuickSight API Reference.
CreateFolder
The CreateFolder operation creates an empty shared folder. To use this operation, you specify the ID that you want to assign to the new folder. The folder ID becomes part of the folder URL in QuickSight, and you can use the ListFolders operation to get the IDs of existing folders.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight create-folder --aws-account-id AWSACCOUNTID --folder-id FOLDERID
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight create-folder --cli-input-json file://createfolder.json For more information about the CreateFolder operation, see CreateFolder in the QuickSight API Reference. DeleteFolder Use the DeleteFolder operation to delete an empty folder. To use this operation, you need the ID of the folder whose permissions you want to view. The folder ID is part of the folder URL in QuickSight. You can also use the ListFolders operation to get the ID. Following is an example AWS CLI command for this operation. CreateFolder 93 Amazon QuickSight AWS CLI aws quicksight delete-folder --aws-account-id AWSACCOUNTID --folder-id FOLDERID Developer Guide For more information about the DeleteFolder operation, see DeleteFolder in the QuickSight API Reference. DescribeFolder Use the DescribeFolder operation to describe a folder. To use this operation, you need the ID of the folder whose permissions you want to view. The folder ID is part of the folder URL in QuickSight. You can also use the ListFolders operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-folder --aws-account-id AWSACCOUNTID --folder-id FOLDERID For more information about the DescribeFolder operation, see DescribeFolder in the QuickSight API Reference. ListFolders Use the ListFolders operation to list all folders in an QuickSight account. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight list-folders --aws-account-id AWSACCOUNTID --page-size 10 --max-items 100 DescribeFolder 94 Amazon QuickSight Developer Guide For more information about the ListFolders operation, see ListFolders in the QuickSight API Reference. SearchFolders Use the SearchFolders operation to search the subfolders of a folder. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight search-folders --aws-account-id AWSACCOUNTID --filters Operator=StringEquals,Name=QUICKSIGHT_USER,Value=arn:aws:quicksight:us- east-1:AWSACCOUNTID:user/default/USERNAME --max-results 100 If your region has already been configured within the CLI, it doesn't need to be included as an argument. If your region has already been configured with the CLI, it does not need to be included in an argument. For more information on the SearchFolders operation, see SearchFolders in the QuickSight API Reference. UpdateFolder Use the UpdateFolder operation to update the name of a folder. To use this operation, you need the ID of the folder whose permissions you want to view. The folder ID is part of the folder URL in QuickSight. You can also use the ListFolders operation to get the ID. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight update-folder --aws-account-id AWSACCOUNTID --folder-id FOLDERID SearchFolders 95 Amazon QuickSight --name |
NAME
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-folder --cli-input-json file://updatefolder.json
For more information about the UpdateFolder operation, see UpdateFolder in the QuickSight API Reference.
Group operations
With group API operations, you can perform actions on groups. For more information, see the following API operations.
Topics • Group membership operations • CreateGroup • DeleteGroup • DescribeGroup • ListGroups • SearchGroups • UpdateGroup
Group membership operations
With group membership API operations, you can view and update permissions for members in a group. For more information, see the following API operations.
Topics • CreateGroupMembership • DeleteGroupMembership • DescribeGroupMembership • ListGroupMemberships
CreateGroupMembership
Use the CreateGroupMembership API operation to add an Amazon QuickSight user to a QuickSight group. You can find users in a certain group by calling the ListGroups API operation, and then the ListGroupMemberships API operation on the group of your choice.
Following is an example AWS CLI command for this operation. In the following example, the member USERNAME is added to the group GROUPNAME.
AWS CLI
aws quicksight create-group-membership --namespace default --aws-account-id AWSACCOUNTID --group-name GROUPNAME --member-name USERNAME
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight create-group-membership --cli-input-json file://creategroupmembership.json
For more information about the CreateGroupMembership API operation, see CreateGroupMembership in the Amazon QuickSight API Reference.
DeleteGroupMembership
Use the DeleteGroupMembership API operation to remove a user from a group so that the user is no longer a member of the group. You can find users in a certain group by calling the ListGroups API operation, and then the ListGroupMemberships operation on the group that you choose.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight delete-group-membership --member-name USERNAME --group-name GROUPNAME --aws-account-id AWSACCOUNTID --namespace NAMESPACE
For more information about the DeleteGroupMembership API operation, see DeleteGroupMembership in the Amazon QuickSight API Reference.
DescribeGroupMembership
Use the DescribeGroupMembership API operation to determine if a user is a member of the specified group. If the user exists and is a member of the specified group, an associated GroupMember object is returned.
Following is an example AWS CLI command for this operation. AWS CLI CLI Input: aws quicksight describe-group-membership --region us-west-2 --aws-account-id AWSACCOUNTID --namespace NAMESPACE --group-name Marketing-East --member-name MEMBERNAME For more information about the ListGroups API operation, see DescribeGroupMembership in the Amazon QuickSight API Reference. ListGroupMemberships Use the ListGroupMemberships API operation to list member users in a group. To view a list of user groups in Amazon QuickSight, call the ListGroups API operation. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight list-group-memberships --group-name GROUPNAME Group membership operations 98 Amazon QuickSight Developer Guide --max-results 100 --aws-account-id AWSACCOUNTID --namespace NAMESPACE For more information about the ListGroupMemberships API operation, see ListGroupMemberships in the Amazon QuickSight API Reference. CreateGroup Use the CreateGroup API operation to create a user group in Amazon QuickSight. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight create-group --namespace NAMESPACE --aws-account-id AWSACCOUNTID --group-name GROUPNAME You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight create-group --cli-input-json file://creategroup.json For more information about the CreateGroup API operation, see CreateGroup in the Amazon QuickSight API Reference. DeleteGroup Use the DeleteGroup API operation to remove a user group from Amazon QuickSight. You can find a group name by calling the ListGroups API operation. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight delete-group --group-name GROUPNAME CreateGroup 99 Amazon QuickSight Developer Guide --aws-account-id AWSACCOUNTID --namespace default For more information about the DeleteGroup API operation, see DeleteGroup in the Amazon QuickSight API Reference. DescribeGroup Use the DescribeGroup API operation to view an Amazon QuickSight group's description and Amazon Resource Name (ARN). You can find a group name by calling the ListGroups API operation. Following is an example AWS CLI command for this operation. AWS CLI |
aws quicksight describe-group --group-name GROUPNAME --aws-account-id AWSACCOUNTID --namespace default
For more information about the DescribeGroup API operation, see DescribeGroup in the Amazon QuickSight API Reference.
ListGroups
Use the ListGroups API operation to list all user groups in Amazon QuickSight.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight list-groups --aws-account-id AWSACCOUNTID --max-results 100 --namespace default
For more information about the ListGroups API operation, see ListGroups in the Amazon QuickSight API Reference.
SearchGroups
Use the SearchGroups operation to search groups in a specified QuickSight namespace using the supplied filters.
Following is an example AWS CLI command for this operation.
AWS CLI
CLI Input:
aws quicksight search-groups --region us-west-2 --aws-account-id AWSACCOUNTID --namespace default --filters "[{\"Operator\": \"StringLike\", \"Name\": \"GROUP_NAME\", \"Value\": \"Mar\"}]"
For more information about the SearchGroups API operation, see SearchGroups in the Amazon QuickSight API Reference.
UpdateGroup
Use the UpdateGroup API operation to change a group description. You can find a group name by calling the ListGroups API operation.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-group --group-name GROUPNAME --description "NEW DESCRIPTION" --aws-account-id AWSACCOUNTID --namespace default
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-group --cli-input-json file://updategroup.json
For more information about the UpdateGroup API operation, see UpdateGroup in the Amazon QuickSight API Reference.
IAM policy assignment operations
AWS Identity and Access Management (IAM) is an AWS service that helps an administrator to securely control access to AWS resources. Administrators control who can be authenticated (signed in) and authorized (have permissions) to use Amazon QuickSight resources. For more information about using QuickSight with IAM, see Using AWS Identity and Access Management (IAM) in the Amazon QuickSight User Guide.
With IAM policy assignment operations, you can create, update, and delete IAM policy assignments. For more information, see the following API operations.
Topics • CreateIAMPolicyAssignment • DeleteIAMPolicyAssignment • DescribeIAMPolicyAssignment • ListIAMPolicyAssignments • ListIAMPolicyAssignmentsForUser • UpdateIAMPolicyAssignment CreateIAMPolicyAssignment Use the CreateIAMPolicyAssignment API operation to create an assignment with one specified IAM policy, identified by its Amazon Resource Name (ARN). This policy assignment is attached to the specified groups or users of Amazon QuickSight. Assignment names are unique for each AWS account. To avoid overwriting rules in other namespaces, use assignment names that are unique. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight create-iam-policy-assignment --aws-account-id AWSACCOUNTID IAM policy assignment operations 102 Amazon QuickSight Developer Guide --assignment-name ASSIGNMENT --assignment-status ENABLED You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files. aws quicksight create-iam-policy-assignment --cli-input-json file://createiampolicyassignment.json For more information about the CreateIAMPolicyAssignment API operation, see CreateIAMPolicyAssignment in the Amazon QuickSight API Reference. DeleteIAMPolicyAssignment Use the DeleteIAMPolicyAssignment API operation to delete an existing IAM policy assignment. To find a policy assignment name, call the ListIAMPolicyAssignments or ListIAMPolicyAssignmentsForUser API operation. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight delete-iam-policy-assignment --aws-account-id AWSACCOUNTID --assignment-name ASSIGNMENT --namespace default For more information about the DeleteIAMPolicyAssignment API operation, see DeleteIAMPolicyAssignment in the Amazon QuickSight API Reference. DescribeIAMPolicyAssignment Use the DescribeIAMPolicyAssignment API operation to describe an existing IAM policy assignment. To find a policy assignment name, call the ListIAMPolicyAssignments or ListIAMPolicyAssignmentsForUser API operation. DeleteIAMPolicyAssignment 103 Amazon QuickSight Developer Guide Following is an example AWS CLI command for this operation. AWS CLI aws quicksight describe-iam-policy-assignment --aws-account-id AWSACCOUNTID --assignment-name ASSIGNMENT --namespace default For more information about the DescribeIAMPolicyAssignment API operation, see DescribeIAMPolicyAssignment in the Amazon QuickSight API Reference. ListIAMPolicyAssignments Use the ListIAMPolicyAssignments API operation to list IAM policy assignments in the current Amazon QuickSight account. Following is an example AWS CLI command for this operation. AWS CLI aws quicksight list-iam-policy-assignments --aws-account-id AWSACCOUNTID --assignment-status ENABLED --namespace default --max-results 100 For more information about the ListIAMPolicyAssignments API operation, see ListIAMPolicyAssignments in the Amazon QuickSight API Reference. ListIAMPolicyAssignmentsForUser Use the ListIAMPolicyAssignmentsForUser API operation to list all the IAM policy assignments, including the Amazon Resource Names (ARNs), for the IAM policies assigned to the specified user and the groups that |
the DescribeIAMPolicyAssignment API operation, see DescribeIAMPolicyAssignment in the Amazon QuickSight API Reference.
ListIAMPolicyAssignments
Use the ListIAMPolicyAssignments API operation to list IAM policy assignments in the current Amazon QuickSight account.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight list-iam-policy-assignments --aws-account-id AWSACCOUNTID --assignment-status ENABLED --namespace default --max-results 100
For more information about the ListIAMPolicyAssignments API operation, see ListIAMPolicyAssignments in the Amazon QuickSight API Reference.
ListIAMPolicyAssignmentsForUser
Use the ListIAMPolicyAssignmentsForUser API operation to list all the IAM policy assignments, including the Amazon Resource Names (ARNs), for the IAM policies assigned to the specified user and the groups that the user belongs to. To find a user name, call the ListUsers API operation.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight list-iam-policy-assignments-for-user --aws-account-id AWSACCOUNTID --user-name USER --max-results 100 --namespace default
For more information about the ListIAMPolicyAssignmentsForUser API operation, see ListIAMPolicyAssignmentsForUser in the Amazon QuickSight API Reference.
UpdateIAMPolicyAssignment
Use the UpdateIAMPolicyAssignment API operation to update an existing IAM policy assignment. This operation updates only the optional parameters that are specified in the request. It overwrites all of the users included in Identities. To find a policy assignment name, call the ListIAMPolicyAssignments or ListIAMPolicyAssignmentsForUser API operations.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-iam-policy-assignment --aws-account-id AWSACCOUNTID --assignment-name NAME --namespace default --assignment-status ENABLED --policy-arn 222244446666 --identities KEY=VALUE,VALUE,KEY=VALUE,VALUE
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-iam-policy-assignment --cli-input-json file://updateiampolicyassignment.json
For more information about the UpdateIAMPolicyAssignment API operation, see UpdateIAMPolicyAssignment in the Amazon QuickSight API Reference.
Ingestion operations
With ingestion API operations, you can perform actions on QuickSight ingestions. For more information, see the following API operations.
Topics • CancelIngestion • CreateIngestion • DescribeIngestion • ListIngestions
CancelIngestion
Use the CancelIngestion operation to cancel an ongoing ingestion of data into SPICE. To use this operation, you need the ID of the dataset that is undergoing the ingestion that you want to cancel and the ID of the ingestion you want to cancel. You can use the ListDataSets operation to list all datasets and their corresponding dataset IDs. You can use the ListIngestions operation to list all ingestion IDs.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight cancel-ingestion --aws-account-id AWSACCOUNTID --data-set-id DATASETID --ingestion-id INGESTIONID
For more information about the CancelIngestion operation, see CancelIngestion in the QuickSight API Reference.
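To find an ingestion ID that is still in progress, you can filter the ListIngestions response before calling CancelIngestion. The following sketch assumes the response list is named Ingestions and that in-progress ingestions report an IngestionStatus of RUNNING; verify both against the API reference before relying on the filter.
aws quicksight list-ingestions --aws-account-id AWSACCOUNTID --data-set-id DATASETID --query 'Ingestions[?IngestionStatus==`RUNNING`].IngestionId'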
CreateIngestion
Use the CreateIngestion operation to create and start a new SPICE ingestion on a dataset.
Any ingestions operating on tagged datasets inherit the same tags automatically for use in access control. For an example, see How do I create an IAM policy to control access to Amazon EC2 resources using tags? in the AWS Knowledge Center. Tags are visible on the tagged dataset, but not on the ingestion resource.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight create-ingestion --data-set-id DATASETID --ingestion-id INGESTIONID --aws-account-id AWSACCOUNTID
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight create-ingestion --cli-input-json file://createingestion.json
For more information about the CreateIngestion operation, see CreateIngestion in the QuickSight API Reference.
DescribeIngestion
Use the DescribeIngestion operation to describe a SPICE ingestion.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight describe-ingestion --aws-account-id AWSACCOUNTID --data-set-id DATASETID --ingestion-id INGESTIONID
For more information about the DescribeIngestion operation, see DescribeIngestion in the QuickSight API Reference.
ListIngestions
Use the ListIngestions operation to list the history of SPICE ingestions for a dataset.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight list-ingestions --data-set-id DATASETID --aws-account-id AWSACCOUNTID --page-size 10 --max-items 100
For more information about the ListIngestions operation, see ListIngestions in the QuickSight API Reference.
IP and VPC endpoint restriction operations
With IP and VPC endpoint restriction API operations, you can perform actions on QuickSight IP and VPC endpoint restrictions. For more information, see the following API operations.
Topics • DescribeIpRestriction • UpdateIpRestriction • QuickSight and interface VPC endpoints (AWS PrivateLink)
DescribeIpRestriction
Use the DescribeIpRestriction operation to get a summary and status of IP rules.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight describe-ip-restriction \
--aws-account-id AWSACCOUNTID
For more information about the DescribeIpRestriction operation, see DescribeIpRestriction in the QuickSight API Reference.
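Both DescribeIpRestriction and UpdateIpRestriction work with a map of CIDR ranges to rule descriptions, together with a flag that turns the restriction on or off. A sketch of that shape, which you might also reuse as a starting point for the updateiprestriction.json skeleton discussed next, looks like the following; the CIDR range and description are assumptions.
{
    "AwsAccountId": "AWSACCOUNTID",
    "IpRestrictionRuleMap": {
        "10.0.0.0/16": "Corporate network"
    },
    "Enabled": true
}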
UpdateIpRestriction
Use the UpdateIpRestriction operation to update the content and status of IP rules. To use this operation, provide the entire map of rules. You can use the DescribeIpRestriction operation to get the current rule map.
Following is an example AWS CLI command for this operation.
AWS CLI
aws quicksight update-ip-restriction \
--aws-account-id AWSACCOUNTID
You can also make this command using a CLI skeleton file with the following command. For more information about CLI skeleton files, see Use CLI skeleton files.
aws quicksight update-ip-restriction \
--cli-input-json file://updateiprestriction.json
For more information about the UpdateIpRestriction operation, see UpdateIpRestriction in the QuickSight API Reference.
QuickSight and interface VPC endpoints (AWS PrivateLink)
You can establish a private connection between your VPC and QuickSight by creating an interface VPC endpoint. Interface endpoints are powered by AWS PrivateLink, a technology that enables you to privately access the QuickSight website without leaving the Amazon network. Instances in your VPC don't need public IP addresses to communicate with the QuickSight website, but they still need access to certain domains other than QuickSight so that static assets, reports, and other files can be downloaded. For a list of domains that QuickSight needs to access, see Domains accessed by QuickSight.
Each interface endpoint is represented by one or more Elastic Network Interfaces in your subnets. For more information, see Interface VPC endpoints (AWS PrivateLink) in the Amazon VPC User Guide.
Considerations for QuickSight VPC endpoints
Before you set up an interface VPC endpoint for QuickSight, ensure that you review Interface endpoint properties and limitations in the Amazon VPC User Guide.
The following considerations apply to VPC endpoint restrictions in QuickSight:
• The VPC endpoint that you create for QuickSight only works for the QuickSight website. QuickSight API calls are not supported through VPC endpoints.
• QuickSight supports data sources from AWS services including Amazon S3, Amazon Redshift, and Athena. QuickSight needs access to the resources from your AWS accounts to retrieve this data. If you want traffic to other AWS services to be routed through the VPC endpoint, you need to create VPC endpoint connections for each service that your QuickSight account is configured to use. For more information about connecting QuickSight to a VPC, see Connecting to a VPC with QuickSight.
• IP and VPC endpoint rules take precedence over all other rules in QuickSight. If you have embedded dashboards or visuals that are visible to the public (anyone on the internet) and restrict traffic to the QuickSight website through a VPC endpoint, public dashboards can only be shared through the VPC endpoint.
For more information on public embedding, see Turning on public access to visuals and dashboards with a 1-click embed code. • QuickSight VPC endpoints are not available in China regions. • QuickSight VPC endpoints are not available in GovCloud regions. Creating an interface VPC endpoint for QuickSight You can create a VPC endpoint for the QuickSight website using either the Amazon VPC console or the AWS Command Line Interface (AWS CLI). For more information, see Creating an interface endpoint in the Amazon VPC User Guide. Create a VPC endpoint for QuickSight using the following service name: • com.amazonaws.region.quicksight-website The private DNS names for the QuickSight website are not same as the public URL for QuickSight. To reach QuickSight through the public URL, create an A record for the website in the format <region>.quicksight.aws.amazon.com and point it to the VPC endpoint. For more VPC endpoints (AWS PrivateLink) 110 Amazon QuickSight Developer Guide information about routing to a VPC endpoint, see Routing traffic to an Amazon Virtual Private Cloud interface endpoint by using your domain name. The management of certain administrator features require that an administrator sign in to QuickSight as an IAM user. If you sign in through the VPC endpoint, you need to create the following VPC endpoints for the AWS Management Console. • com.amazonaws.region.console • com.amazonaws.region.signin For more information about VPC endpoints for the AWS Management Console, see Required VPC endpoints and DNS configuration. Creating a VPC endpoint policy for QuickSight You can attach an endpoint policy to your VPC endpoint to restrict usage of the endpoint to specific QuickSight accounts or to accounts under specific AWS organizations. The AWS account IDs that are allow–listed or deny–listed are the AWS accounts in which the QuickSight account is created. In most cases, this is the same account ID in which the VPC endpoint is created. |
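As a sketch of such an endpoint policy, the following allows only principals from a single AWS account to use the endpoint. The account ID is a placeholder, and a real policy might scope Action and Resource more narrowly; treat this as an illustration rather than a recommended policy.
{
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "*",
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "aws:PrincipalAccount": "555555555555"
                }
            }
        }
    ]
}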