How to convert CSV to DynamoDB JSON in JavaScript
Author: Geeks Kai (@KaiGeeks)
Ever stared at a CSV file knowing it needs to live in DynamoDB, but feeling like you're trying to plug a USB cable in upside down? Yeah, we've all been there. Your spreadsheet data and Amazon's NoSQL database speak different languages, and someone needs to play translator.
Here's the thing: you can't just drag and drop that CSV into DynamoDB and call it a day. But converting CSV to DynamoDB JSON in JavaScript? That's your cheat code to making this whole process smooth as butter.

Why Your CSV Won't Play Nice with DynamoDB
DynamoDB is picky about its data format—think of it as that friend who only drinks craft beer. It wants everything wrapped in specific JSON structures with explicit data types. Your CSV? It's just raw data sitting there like ingredients without a recipe.
When you try to import CSV directly into DynamoDB, everything gets treated as strings. Your numbers become text, your booleans turn into "true" and "false" strings, and your data integrity goes out the window faster than your weekend plans.
The solution? Convert that CSV to proper DynamoDB JSON format first. It's like meal prep for your database.
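To make the difference concrete, here's roughly how a single row looks before and after conversion (values are illustrative only):

const plainRow = { col1_str: 'str1', col2_num: '100', col3_bool: 'true' };       // parsed straight from CSV: everything is a string
const dynamoRow = { col1_str: { S: 'str1' }, col2_num: { N: '100' }, col3_bool: { BOOL: true } }; // wrapped the way DynamoDB expects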
The Conversion Game Plan
Step 1: Set Up Your Toolkit
Before we dive into the code, you'll need Node.js running on your machine. If you don't have it yet, grab it—it's free and you'll use it for pretty much everything anyway.
Next, install the csvtojson library. This little gem does the heavy lifting:
npm install csvtojson
Step 2: Build Your CSV to DynamoDB JSON Converter
Create a new file called convert.js (or whatever makes you happy) and drop this code in:
const csv = require('csvtojson');
const fs = require('fs');

const csvFilePath = 'data.csv';     // Your CSV file path
const jsonFilePath = 'output.json'; // Where the magic happens

csv()
  .fromFile(csvFilePath)
  .then((jsonObj) => {
    // Wrap each parsed row in DynamoDB's typed attribute format.
    const dynamoDBJson = jsonObj.map(item => ({
      Item: {
        col1_str: { S: item.col1_str },
        col2_num: { N: item.col2_num.toString() }, // DynamoDB numbers travel as strings
        col3_bool: { BOOL: item.col3_bool === 'true' }
      }
    }));
    fs.writeFileSync(jsonFilePath, JSON.stringify(dynamoDBJson, null, 2));
    console.log('Conversion completed. Check output.json for results.');
  })
  .catch(err => {
    console.error('Error converting CSV to JSON:', err);
  });
What's happening here? We're reading your CSV, mapping each row to DynamoDB's format, and saving it as clean JSON. The key is those data type indicators:
- S for strings
- N for numbers (converted to strings, because DynamoDB is weird like that)
- BOOL for booleans
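If hand-writing those wrappers for every column gets old, the AWS SDK v3 offers a marshall helper in the @aws-sdk/util-dynamodb package that builds the typed format from plain JavaScript values. A minimal alternative sketch, assuming that package is installed and that csvtojson's checkType option infers your column types correctly:

const csv = require('csvtojson');
const { marshall } = require('@aws-sdk/util-dynamodb');

csv({ checkType: true })   // checkType asks csvtojson to parse numbers and booleans
  .fromFile('data.csv')
  .then((rows) => {
    // marshall() turns { col2_num: 100 } into { col2_num: { N: '100' } } and so on
    const items = rows.map((row) => ({ Item: marshall(row) }));
    console.log(JSON.stringify(items, null, 2));
  });

You trade a little per-column control for a lot less typing.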
Step 3: Fire It Up
Run your conversion script:
node convert.js
Boom. Your CSV data is now speaking DynamoDB's language.
Real-World Example: See It in Action
Let's say you've got this CSV:
col1_str,col2_num,col3_bool
"str1",100,true
"str2",200,false
"str3",300,true
After running through our converter, you'll get this beautiful DynamoDB JSON:
[
  {
    "Item": {
      "col1_str": { "S": "str1" },
      "col2_num": { "N": "100" },
      "col3_bool": { "BOOL": true }
    }
  },
  {
    "Item": {
      "col1_str": { "S": "str2" },
      "col2_num": { "N": "200" },
      "col3_bool": { "BOOL": false }
    }
  },
  {
    "Item": {
      "col1_str": { "S": "str3" },
      "col2_num": { "N": "300" },
      "col3_bool": { "BOOL": true }
    }
  }
]
Clean, structured, and ready for DynamoDB to devour.
Getting Your Data Into DynamoDB
Once you've got your converted JSON, the AWS CLI can load it with batch-write-item:
aws dynamodb batch-write-item --request-items file://requests.json
One catch: --request-items won't take the raw array in output.json as-is. It expects a map of your table name to a list of PutRequest entries, and each call accepts at most 25 items, so you'll need to reshape the file (call it requests.json) and split larger imports into batches. Or if you're feeling fancy, let the AWS SDK in your application do the wrapping and chunking for you, as in the sketch below. Either way, your data slides right in.
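Here's a minimal sketch of that SDK route. It assumes @aws-sdk/client-dynamodb (AWS SDK v3) is installed, credentials and region come from your environment, the table is called MyTable (swap in your own), and output.json is the file produced by the converter above:

const fs = require('fs');
const { DynamoDBClient, BatchWriteItemCommand } = require('@aws-sdk/client-dynamodb');

const client = new DynamoDBClient({});
const items = JSON.parse(fs.readFileSync('output.json', 'utf8'));

async function importAll() {
  // BatchWriteItem accepts at most 25 write requests per call, so send in chunks.
  for (let i = 0; i < items.length; i += 25) {
    const chunk = items.slice(i, i + 25).map(({ Item }) => ({ PutRequest: { Item } }));
    const result = await client.send(new BatchWriteItemCommand({
      RequestItems: { MyTable: chunk },
    }));
    // A production version would retry anything left in result.UnprocessedItems.
    console.log(`Wrote ${chunk.length} items starting at index ${i}`);
  }
}

importAll().catch(console.error);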
Pro Tips for Smooth Conversions
Data Type Mapping Cheat Sheet:
- Text → { S: "value" }
- Numbers → { N: "123" }
- Booleans → { BOOL: true/false }
- Lists → { L: [...] }
- Maps → { M: {...} } (nested example below)
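For those last two, here's a hypothetical item fragment with a list and a map (attribute names and values are made up for illustration):

const nestedItem = {
  tags: { L: [{ S: 'csv' }, { S: 'dynamodb' }] },                  // a list of typed values
  address: { M: { city: { S: 'Seattle' }, zip: { S: '98101' } } }  // a map of named typed values
};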
Common Gotchas to Avoid:
- Always validate your CSV structure first
- Handle empty cells gracefully (one approach is sketched after this list)
- Test with small datasets before going full scale
- Double-check your primary key mappings
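For the empty-cell gotcha, one possible approach is a small helper that infers a type and falls back to DynamoDB's NULL type for blanks. Treat it as a sketch rather than a universal rule, since numeric-looking strings such as zip codes would get classified as numbers:

// Hypothetical helper: map one raw CSV value to a DynamoDB attribute.
function toAttribute(value) {
  if (value === undefined || value === '') return { NULL: true };              // blank cell
  if (value === 'true' || value === 'false') return { BOOL: value === 'true' };
  if (!Number.isNaN(Number(value))) return { N: value };                        // numbers travel as strings
  return { S: value };
}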
Performance Hacks:
- Process large files in chunks (see the streaming sketch below)
- Use batch operations for imports
- Monitor your DynamoDB write capacity
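To put the chunking tip into practice, csvtojson can hand you rows one at a time via subscribe instead of buffering the whole file in memory. A rough sketch that writes one DynamoDB item per line (the output.ndjson name is just an example):

const csv = require('csvtojson');
const fs = require('fs');

const out = fs.createWriteStream('output.ndjson'); // one JSON item per line

csv()
  .fromFile('data.csv')
  .subscribe((row) => {
    // Same mapping as before, applied row by row as the file streams in.
    out.write(JSON.stringify({
      Item: {
        col1_str: { S: row.col1_str },
        col2_num: { N: String(row.col2_num) },
        col3_bool: { BOOL: row.col3_bool === 'true' },
      },
    }) + '\n');
  }, (err) => console.error(err), () => out.end());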
Level Up Your Data Game
Converting CSV to DynamoDB JSON in JavaScript isn't rocket science, but doing it right saves you hours of debugging later. This approach gives you full control over data types, handles edge cases, and keeps your database happy.
Think of it as the difference between throwing ingredients in a pot versus following a recipe. Both might work, but one definitely tastes better.
Ready to flex your data conversion skills? Take this code, adapt it to your specific CSV structure, and watch your data flow into DynamoDB like it was always meant to be there. Your future self will thank you for the clean, properly typed data.
Got questions about handling complex CSV structures or need to scale this up? Drop a comment below—let's figure it out together.