Cross-account cross-Region access to DynamoDB tables
Create a role
Follow step 1 in the tutorial to create an IAM role in account A. When defining the role's permissions, you can attach existing policies such as AmazonDynamoDBReadOnlyAccess or AmazonDynamoDBFullAccess to allow the role to read from and write to DynamoDB. The following example creates a role named DynamoDBCrossAccessRole with the AmazonDynamoDBFullAccess permissions policy.
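When you create the role, its trust policy must allow account B to assume it. A minimal sketch of such a trust policy, built here as a Python dict so it can be inspected and serialized; the account ID 222222222222 is a placeholder for account B, not a value from this guide:

```python
import json

# Placeholder account ID for account B -- replace with your own.
ACCOUNT_B = "222222222222"

# Trust policy that lets principals in account B assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{ACCOUNT_B}:root"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

You would pass this JSON document as the role's trust relationship when creating DynamoDBCrossAccessRole in account A.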
Grant access to the role
Follow step 2 in the tutorial in the IAM User Guide to allow account B to switch to the newly created role. The following example creates a new policy with the following statement:
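A typical statement for this purpose permits calling sts:AssumeRole on the role created in account A. Sketched here as a Python dict; the account ID 111111111111 in the ARN is a placeholder, not a value from this guide:

```python
import json

# Placeholder ARN for the role created in account A.
ROLE_ARN = "arn:aws:iam::111111111111:role/DynamoDBCrossAccessRole"

# Policy attached in account B that allows assuming the role in account A.
assume_role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": ROLE_ARN,
        }
    ],
}

print(json.dumps(assume_role_policy, indent=2))
```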
Then, you can attach this policy to the user, group, or role that you want to use to access DynamoDB.
Assume the role in the AWS Glue job script
Now, you can log in to account B and create an AWS Glue job. To create a job, refer to the instructions at Configuring job properties for Spark jobs in AWS Glue.
In the job script, use the dynamodb.sts.roleArn parameter to assume the DynamoDBCrossAccessRole role. Assuming the role returns temporary credentials, which the job uses to access DynamoDB in account A.
Review these example scripts.
For a cross-account read across Regions (ETL connector):
```python
import sys

from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Cross-account read: assume the role to read the source table
dyf = glue_context.create_dynamic_frame_from_options(
    connection_type="dynamodb",
    connection_options={
        "dynamodb.region": "us-east-1",
        "dynamodb.input.tableName": "test_source",
        "dynamodb.sts.roleArn": "<DynamoDBCrossAccessRole's ARN>"
    }
)
dyf.show()
job.commit()
```
For a cross-account read across Regions (export connector):
```python
import sys

from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Cross-account read using the DynamoDB export connector
dyf = glue_context.create_dynamic_frame_from_options(
    connection_type="dynamodb",
    connection_options={
        "dynamodb.export": "ddb",
        "dynamodb.tableArn": "<test_source ARN>",
        "dynamodb.sts.roleArn": "<DynamoDBCrossAccessRole's ARN>"
    }
)
dyf.show()
job.commit()
```
For a read and a cross-account write across Regions:
```python
import sys

from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the source table in the job's own account
dyf = glue_context.create_dynamic_frame_from_options(
    connection_type="dynamodb",
    connection_options={
        "dynamodb.region": "us-east-1",
        "dynamodb.input.tableName": "test_source"
    }
)
dyf.show()

# Cross-account, cross-Region write: assume the role to write the sink table
glue_context.write_dynamic_frame_from_options(
    frame=dyf,
    connection_type="dynamodb",
    connection_options={
        "dynamodb.region": "us-west-2",
        "dynamodb.output.tableName": "test_sink",
        "dynamodb.sts.roleArn": "<DynamoDBCrossAccessRole's ARN>"
    }
)
job.commit()
```