Cloning DynamoDB Table to Another AWS Account
2022-11-03


Cloning (Save and Restore) DynamoDB Table to Another AWS Account


By Hyuntaek Park

Senior full-stack engineer at Twigfarm


At Twigfarm, we created a new AWS account, and some of the DynamoDB tables needed to be copied from the old account to the new one. I assumed it would be a very simple process, since most database systems have dump and restore features. It wasn't that simple.


I tried a few different approaches, but none of them was simple enough. Luckily, one of the AWS Solutions Architects pointed me to a solution that AWS had just released: https://aws.amazon.com/blogs/database/amazon-dynamodb-can-now-import-amazon-s3-data-into-a-new-table. You should read that article first to get a sense of how it works.


In this article I will demonstrate how to export a DynamoDB table to an S3 bucket in a different AWS account, and then create a new DynamoDB table from the data saved in S3.


Architecture

The architecture is fairly simple. In this article, I will explain how to save and restore the table. Some steps are done in the console and some through the AWS CLI.

[Image: architecture diagram]


Prerequisites

You need AWS CLI profiles for both the source and destination accounts. In this article, I use source-user and destination-user as the profile names. Please refer to https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html.
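If the profiles are not set up yet, they can be created roughly like this; aws configure will prompt for each account's access key, secret key, and default region:

aws configure --profile source-user
aws configure --profile destination-user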


DynamoDB Table

I have a simple DynamoDB table with two items in the source account. Our final goal is to have a table with the same two items in the destination account.

[Image: source DynamoDB table with two items]


Enable Point-in-time recovery (PITR)

Choose the source DynamoDB table > Backups > Edit button under Point-in-time recovery (PITR). Then enable the point-in-time recovery feature.
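If you prefer the CLI, PITR can also be enabled with a command along these lines (TABLE_NAME is a placeholder for your source table name):

aws dynamodb update-continuous-backups \
    --table-name TABLE_NAME \
    --point-in-time-recovery-specification PointInTimeRecoveryEnabled=true \
    --profile source-user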


Destination S3 Bucket

Log into the destination AWS account. It is convenient to use a different web browser or open a new window in incognito mode.

Create a bucket with the following settings:

Bucket name: <YOUR_UNIQUE_BUCKET_NAME>

For Object Ownership, choose ACLs enabled and Object writer.

[Image: destination S3 bucket settings]
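The same bucket can also be created from the CLI, roughly as follows; for regions other than us-east-1 you also need --create-bucket-configuration LocationConstraint=<REGION>:

aws s3api create-bucket \
    --bucket YOUR_UNIQUE_BUCKET_NAME \
    --object-ownership ObjectWriter \
    --profile destination-user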


S3 Bucket Policy

Copy and paste the following JSON as the bucket policy for your destination bucket. Make sure to replace SOURCE_ACCOUNT_NO, SOURCE_USER_NAME, and DESTINATION_BUCKET_NAME with your own values.
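A minimal sketch of such a policy, assuming the export will be run by an IAM user in the source account, looks like the following. It lets that user write the export objects into the destination bucket:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::SOURCE_ACCOUNT_NO:user/SOURCE_USER_NAME"
            },
            "Action": [
                "s3:AbortMultipartUpload",
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": "arn:aws:s3:::DESTINATION_BUCKET_NAME/*"
        }
    ]
}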


Export table to Amazon S3 using the Command Line Interface (CLI)

Enter the following command in your terminal.
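The exact command depends on your table and region, but it looks roughly like this; TABLE_NAME, REGION, and DESTINATION_ACCOUNT_NO are placeholders, and --s3-bucket-owner must be the destination account number because the bucket lives in that account:

aws dynamodb export-table-to-point-in-time \
    --table-arn arn:aws:dynamodb:REGION:SOURCE_ACCOUNT_NO:table/TABLE_NAME \
    --s3-bucket DESTINATION_BUCKET_NAME \
    --s3-bucket-owner DESTINATION_ACCOUNT_NO \
    --export-format DYNAMODB_JSON \
    --profile source-user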


[Image: running the export command in the terminal]


If there is no error after entering the command, you should see a new entry under Exports to S3. Wait a few minutes; the status will change from Exporting to Completed. The exporting part is now done.


Restore from S3

Now go to DynamoDB > Imports from S3, then click the Import from S3 button.

Click the Browse S3 button and drill down through the folders until you see the data folder, then choose the file with the json.gz extension.

[Image: browsing the S3 bucket down to the data folder and the json.gz file]


Then fill out the form to create a new DynamoDB table.
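The same import can also be started from the CLI. A rough sketch, assuming a table whose partition key is a string attribute named id (adjust the key schema and table name to match your own table):

aws dynamodb import-table \
    --s3-bucket-source S3Bucket=DESTINATION_BUCKET_NAME,S3KeyPrefix=AWSDynamoDB/ \
    --input-format DYNAMODB_JSON \
    --input-compression-type GZIP \
    --table-creation-parameters '{"TableName":"TABLE_NAME","AttributeDefinitions":[{"AttributeName":"id","AttributeType":"S"}],"KeySchema":[{"AttributeName":"id","KeyType":"HASH"}],"BillingMode":"PAY_PER_REQUEST"}' \
    --profile destination-user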


Verification

Go to DynamoDB and check if the table items were successfully imported, as shown below.

[Image: imported table items in the destination account]
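You can also verify from the CLI with a quick scan of the new table (TABLE_NAME is whatever name you gave the imported table):

aws dynamodb scan --table-name TABLE_NAME --profile destination-user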


If you see the identical table contents, congratulations! Import from S3 is a new feature; without it, you would have had to struggle with many complicated AWS services and permissions. With the Import from S3 feature, S3 and DynamoDB are the only services you need to deal with.


Thanks!

