Let's populate a DynamoDB table using `BatchWriteItem`.
I have a list of coupon codes for my product, and I want to move them into a DynamoDB table.
```ts
import { v4 as uuid } from "uuid"

import { tableName } from "../src/shared/db/tableName"
import { documentClient } from "../src/shared/db"

// BatchWriteItem accepts at most 25 put/delete requests per call.
const batchWriteItemMax = 25
const appSumoCodesNumber = 10000

const populateAppSumoCodes = async () => {
  // Generate the coupon codes and convert them into DynamoDB items.
  const codes = Array.from({ length: appSumoCodesNumber }, () => uuid())
  const items = codes.map((id) => ({ id }))

  // How many 25-item batches we need to write.
  const itemsBatchesNumber = Math.ceil(items.length / batchWriteItemMax)

  await Promise.all(
    Array.from({ length: itemsBatchesNumber }, (_, i) => i).map(
      async (batchIndex) => {
        // Slice the current batch out of the full list of items.
        const batchItems = items.slice(
          batchWriteItemMax * batchIndex,
          batchWriteItemMax * (batchIndex + 1)
        )

        await documentClient
          .batchWrite({
            RequestItems: {
              [tableName.appSumoCodes]: batchItems.map((Item) => ({
                PutRequest: {
                  Item,
                },
              })),
            },
          })
          .promise()
      }
    )
  )
}

populateAppSumoCodes()
```
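The script imports `documentClient` and `tableName` from a shared `db` module that isn't shown here. A minimal sketch of what those files might look like, assuming the aws-sdk v2 `DocumentClient` and a table name resolved from an environment variable (the file paths, the variable name, and the fallback value are assumptions for illustration):

```ts
// src/shared/db/index.ts (hypothetical) — a DocumentClient that picks up
// region and credentials from the environment.
import { DynamoDB } from "aws-sdk"

export const documentClient = new DynamoDB.DocumentClient()

// src/shared/db/tableName.ts (hypothetical) — table names kept in one place.
export const tableName = {
  appSumoCodes: process.env.APP_SUMO_CODES_TABLE_NAME || "appSumoCodes",
}
```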
First, I convert the codes into a list of items, each with a single `id` attribute.
We can't just upload all of them at once: a single `BatchWriteItem` call accepts a maximum of 25 items.
We need to calculate how many batches we should have. For example, if there are 60 items, we'll insert 3 batches.
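The calculation is a plain ceiling division:

```ts
// 60 items / 25 per batch = 2.4, rounded up to 3 batches.
Math.ceil(60 / 25) // 3
// For the full 10,000 coupon codes: 400 batches.
Math.ceil(10000 / 25) // 400
```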
After that, we create an array of batch indexes with the length of `itemsBatchesNumber`, map over it, and use `Promise.all` to write all the batches in parallel.
To populate the table, we slice a batch from the list of items and pass it to the `batchWrite` method of the DynamoDB `documentClient`.
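One caveat: even a successful `batchWrite` call can return some of the items back as `UnprocessedItems` (for example, when the table is throttled), and the script above doesn't check for that. Here's a hedged sketch of a retry loop for a single batch, reusing the same imports; the `writeBatchWithRetry` helper is hypothetical, and a real version should add a retry limit and backoff:

```ts
import { DynamoDB } from "aws-sdk"

import { tableName } from "../src/shared/db/tableName"
import { documentClient } from "../src/shared/db"

// Hypothetical helper: write one batch and retry whatever DynamoDB
// reports back as unprocessed until nothing is left.
const writeBatchWithRetry = async (batchItems: { id: string }[]) => {
  let requestItems: DynamoDB.DocumentClient.BatchWriteItemRequestMap = {
    [tableName.appSumoCodes]: batchItems.map((Item) => ({
      PutRequest: { Item },
    })),
  }

  while (Object.keys(requestItems).length > 0) {
    const { UnprocessedItems } = await documentClient
      .batchWrite({ RequestItems: requestItems })
      .promise()

    requestItems = UnprocessedItems ?? {}
  }
}
```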
Now let's set environment variables and run the migration with `ts-node`.
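For example (the script path and the table name variable are assumptions, and AWS credentials are expected to already be configured in the environment):

```sh
AWS_REGION=eu-west-1 APP_SUMO_CODES_TABLE_NAME=appSumoCodes \
  npx ts-node scripts/populateAppSumoCodes.ts
```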