From Source to Destination: The Art of DynamoDB Migration
Migrate DynamoDB tables across AWS accounts using S3 as an intermediary. Export from source account using export-table-to-point-in-time, create an S3 bucket in the destination with cross-account access, then import using import-table with a table creation parameters file.
Introduction
AWS DynamoDB is a managed NoSQL database service designed to offer seamless scalability, high performance, and availability. Unlike traditional relational databases with structured schemas, NoSQL databases like DynamoDB are schema-less, making them exceptionally flexible and adaptable to varying data requirements.
Key advantages of DynamoDB:
- Automatic scaling without downtime or performance degradation
- Fully managed by AWS (no server or database administration required)
- Data replication across multiple AWS data centers for high availability
- Easy integration with other AWS services
- Cost-effective - pay only for capacity you use with no upfront fees
This post explains how to migrate DynamoDB tables to another AWS account using the export/import method via S3. This approach works well for one-time migrations or periodic data transfers.
Prerequisites
- Access to both source and destination AWS accounts
- Permissions to:
  - Create an S3 bucket in the target AWS account
  - Back up the source DynamoDB table
  - Create and restore DynamoDB tables in the target account
- Access to the AWS CLI or CloudShell
Logical Architecture

Steps
Destination Account - Create S3 Bucket
Create S3 Bucket for Backup
# Set the variable for the bucket name (S3 bucket names are globally unique, so you may need a different name)
BUCKET_NAME="dynamodb-backup"
# Create the bucket (for regions other than us-east-1, also pass --create-bucket-configuration LocationConstraint=<region>)
aws s3api create-bucket --bucket $BUCKET_NAME --region us-east-1
Create Bucket Policy for Cross-Account Access
# Create bucket policy allowing both accounts access
echo '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::SOURCE_ACCOUNT_ID:root",
          "arn:aws:iam::DESTINATION_ACCOUNT_ID:root"
        ]
      },
      "Action": "s3:*",
      "Resource": ["arn:aws:s3:::dynamodb-backup", "arn:aws:s3:::dynamodb-backup/*"]
    }
  ]
}' > bucketpolicy.json
# Apply the bucket policy
aws s3api put-bucket-policy --bucket $BUCKET_NAME --policy file://bucketpolicy.json
Replace SOURCE_ACCOUNT_ID and DESTINATION_ACCOUNT_ID with your actual AWS account IDs. Using s3:* is permissive - consider restricting to only the actions needed (s3:PutObject, s3:GetObject, s3:ListBucket).
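Rather than hand-editing the JSON, the policy file can be generated programmatically. A minimal Python sketch (the account IDs and bucket name below are placeholders, and the trimmed-down action list is an assumption about the minimum the export needs; widen it if the export fails with AccessDenied):

```python
import json

def make_backup_bucket_policy(source_account, dest_account, bucket):
    """Build a cross-account bucket policy restricted to a smaller set of
    actions than s3:* (an assumption; widen if the export is denied)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": [
                f"arn:aws:iam::{source_account}:root",
                f"arn:aws:iam::{dest_account}:root",
            ]},
            "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
        }],
    }

# Example: write the policy file consumed by put-bucket-policy
policy = make_backup_bucket_policy("111111111111", "222222222222", "dynamodb-backup")
with open("bucketpolicy.json", "w") as f:
    json.dump(policy, f, indent=2)
```

The generated bucketpolicy.json can then be applied with the same aws s3api put-bucket-policy command shown above.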
Source Account - Backup DynamoDB
# Export DynamoDB table to S3 bucket in destination account
aws dynamodb export-table-to-point-in-time \
--region us-east-1 \
--table-arn arn:aws:dynamodb:us-east-1:SOURCE_ACCOUNT_ID:table/tblARUN \
--s3-bucket dynamodb-backup \
--s3-bucket-owner DESTINATION_ACCOUNT_ID
The export requires point-in-time recovery (PITR) to be enabled on the source table; if it is not, enable it first with aws dynamodb update-continuous-backups. Check the export status:
aws dynamodb describe-export --export-arn [ARN from backup command]
The backup creates subfolders in S3 at AWSDynamoDB/[random-guid]/data. Note this path - you'll need it for the restore command.
Wait until the status shows COMPLETED before proceeding to the restore.
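For scripting, the wait can be automated with a small polling loop. A sketch of the pattern, where fetch_status stands in for a call to aws dynamodb describe-export (or boto3's describe_export), which reports IN_PROGRESS, COMPLETED, or FAILED:

```python
import time

def wait_for_export(fetch_status, poll_seconds=30, timeout_seconds=3600):
    """Poll fetch_status() until the export finishes.

    fetch_status is a zero-argument callable returning the current
    ExportStatus string: IN_PROGRESS, COMPLETED, or FAILED.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = fetch_status()
        if status == "COMPLETED":
            return True
        if status == "FAILED":
            raise RuntimeError("export failed; inspect describe-export output")
        time.sleep(poll_seconds)
    raise TimeoutError("export did not complete within the timeout")

# Demo with a canned sequence of statuses (no AWS call)
demo = iter(["IN_PROGRESS", "COMPLETED"])
print(wait_for_export(lambda: next(demo), poll_seconds=0))  # → True
```

With boto3, fetch_status would be something like lambda: client.describe_export(ExportArn=arn)["ExportDescription"]["ExportStatus"].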
Destination Account - Restore DynamoDB
Create Table Parameters File
Create a file called dbparameters.json:
{
  "TableName": "tblDANIEL",
  "AttributeDefinitions": [
    {
      "AttributeName": "ID",
      "AttributeType": "S"
    },
    {
      "AttributeName": "State",
      "AttributeType": "S"
    }
  ],
  "KeySchema": [
    {
      "AttributeName": "ID",
      "KeyType": "HASH"
    },
    {
      "AttributeName": "State",
      "KeyType": "RANGE"
    }
  ],
  "BillingMode": "PROVISIONED",
  "ProvisionedThroughput": {
    "ReadCapacityUnits": 5,
    "WriteCapacityUnits": 5
  }
}
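A frequent import failure is a key schema that does not reference the declared attribute definitions (see Troubleshooting below). A small sketch that sanity-checks the parameters file before running the import, written against the structure of dbparameters.json:

```python
def check_table_params(params):
    """Return a list of problems with the table-creation parameters:
    every KeySchema attribute must appear in AttributeDefinitions,
    and there must be exactly one HASH key."""
    defined = {d["AttributeName"] for d in params["AttributeDefinitions"]}
    problems = []
    for key in params["KeySchema"]:
        if key["AttributeName"] not in defined:
            problems.append(f"key attribute {key['AttributeName']!r} is not defined")
    if [k["KeyType"] for k in params["KeySchema"]].count("HASH") != 1:
        problems.append("exactly one HASH key is required")
    return problems

# Example using the parameters shown above
params = {
    "TableName": "tblDANIEL",
    "AttributeDefinitions": [
        {"AttributeName": "ID", "AttributeType": "S"},
        {"AttributeName": "State", "AttributeType": "S"},
    ],
    "KeySchema": [
        {"AttributeName": "ID", "KeyType": "HASH"},
        {"AttributeName": "State", "KeyType": "RANGE"},
    ],
}
print(check_table_params(params))  # → [] (no problems found)
```

In practice the dict would be loaded from dbparameters.json with json.load before checking.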
Import Table from S3
aws dynamodb import-table \
--s3-bucket-source S3Bucket=dynamodb-backup,S3KeyPrefix=AWSDynamoDB/[random-guid]/data \
--input-format DYNAMODB_JSON \
--table-creation-parameters file://dbparameters.json \
--input-compression-type GZIP
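The [random-guid] segment of the key prefix is the export ID, which also appears as the last path segment of the export ARN returned by export-table-to-point-in-time, so the prefix can be derived in a script rather than looked up in the S3 console. A sketch assuming that standard export layout (the ARN below is a made-up example):

```python
def export_key_prefix(export_arn):
    """Derive the S3 key prefix of a DynamoDB export from its ARN.

    Exports land under AWSDynamoDB/<export-id>/data, where <export-id>
    is the final path segment of the export ARN (an assumption based on
    the standard export layout).
    """
    export_id = export_arn.rsplit("/", 1)[-1]
    return f"AWSDynamoDB/{export_id}/data"

# Hypothetical ARN for illustration
arn = "arn:aws:dynamodb:us-east-1:111111111111:table/tblARUN/export/01234567890123-abcdef12"
print(export_key_prefix(arn))  # → AWSDynamoDB/01234567890123-abcdef12/data
```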
Check Restore Status
# Check import progress using the ARN returned by import-table
aws dynamodb describe-import --import-arn [ARN from import command]
# Once the import completes, verify the new table
aws dynamodb describe-table --table-name tblDANIEL
Troubleshooting
- Export fails with AccessDenied - Verify the S3 bucket policy allows the source account. Check that your IAM role has dynamodb:ExportTableToPointInTime permission.
- Import fails with "table already exists" - Delete the existing table first or use a different table name in dbparameters.json.
- Import fails with schema mismatch - Ensure the KeySchema in dbparameters.json matches the source table's key structure exactly (same attribute names and types).
- Cannot find export path in S3 - The export creates nested folders. Navigate through AWSDynamoDB/ to find the GUID folder containing your export data.
- Export stuck in IN_PROGRESS - Large tables take time. Check CloudWatch for errors. Ensure the source table has Point-in-Time Recovery enabled.
- Permission denied on bucket - Verify the bucket policy includes both account IDs. Check that the IAM user/role has S3 permissions.
Conclusion
AWS DynamoDB is a strong choice for businesses looking for a flexible, scalable, and efficient NoSQL database solution. Beyond its many advantages, another significant benefit is the ability to easily back up and restore tables, either within the same AWS account or to a separate AWS account using S3 as an intermediary.