In the world of data, developers constantly act as translators. We receive data in JSON from a web API, but the analytics team needs it in CSV for their spreadsheet magic. A backend process generates a complex XML file, but a partner system only accepts a flat CSV. This constant need for data transformation—reshaping, cleansing, and converting formats—can lead to a maze of brittle, one-off scripts that are a nightmare to maintain.
What if you could treat data transformation not as a chore, but as a service? What if you could define complex ETL (Extract, Transform, Load) logic as simple, version-controlled code and execute it with a single API call?
This is the promise of transform.do. In this guide, we'll walk you through a practical, step-by-step tutorial on using the transform.do API to automate the conversion of complex JSON structures into clean, usable CSV files.
JSON (JavaScript Object Notation) and CSV (Comma-Separated Values) are two of the most common data formats, each with its own strengths.
The challenge arises when you need to bridge the gap between them. Manually converting a nested JSON file into a flat CSV is tedious and error-prone: you have to decide how to flatten nested objects, how to expand arrays into multiple rows, how to name the resulting columns, how to escape values safely, and how to format dates consistently.
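Each of those decisions usually ends up hard-coded in a throwaway script. Here is a rough sketch of what that tends to look like for one hypothetical payload shape (the type and field names are purely illustrative):

// Hand-rolled converter, hard-coded to a single (illustrative) payload shape.
type SourceRecord = { id: string; user: { name: string }; tags: string[] };

function toCsv(records: SourceRecord[]): string {
  // Quoting is easy to forget -- a comma inside a name silently corrupts the file.
  const escape = (value: unknown) => `"${String(value).replace(/"/g, '""')}"`;
  const header = ["id", "user_name", "tags"].join(",");
  const rows = records.map((r) =>
    [escape(r.id), escape(r.user.name), escape(r.tags.join(";"))].join(",")
  );
  return [header, ...rows].join("\n");
}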
Writing a custom script for every new data source is inefficient. This is where a dedicated transformation service shines.
transform.do reimagines ETL as a developer-first experience. Instead of wrestling with complex data-wrangling code, you declare your desired outcome in a simple configuration object.
You define the source data and the transformation rules, and our AI-powered agents handle the heavy lifting of parsing, mapping, and formatting. Your entire data pipeline becomes a simple, version-controlled workflow you can call from anywhere.
Let's get practical. Imagine you have an array of customer orders in JSON, complete with nested customer details and a list of items for each order.
Here’s the complex JSON we want to convert. Notice the nested customer object and the items array.
Our goal is to turn this into a flat CSV file where each row represents a single item purchased.
First, we'll initialize the transform.do agent using the SDK. This gives us a client ready to perform transformations.
This is where the magic happens. We'll create a single object that tells the agent exactly how to convert our JSON. We want to specify the target format (csv), flatten the nested items array, rename fields for clarity, format the date, and even create a new calculated field.
Look at how powerful this declarative approach is. We didn't write any loops or parsers. We simply described the end result: one row per purchased item (flatten), friendly column names mapped from nested fields (rename), a clean date format (convert), and a calculated Line Total column (addField).
With our source data and rules defined, executing the transformation is a single, asynchronous function call.
The result.data will contain our beautifully formatted CSV string, ready to be saved to a .csv file or sent to another service.
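For example, in a Node.js script you could persist it straight to disk. This is a minimal sketch, assuming result.data holds the CSV string returned by the run call shown below:

import { writeFile } from "node:fs/promises";

// Write the CSV string produced by transform.run() to a file.
await writeFile("orders.csv", result.data, "utf8");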
Just like that, we've flattened, mapped, converted, and enriched our data with a single API call.
The real power of transform.do emerges when you think beyond one-off conversions. Because the rules are just a version-controlled configuration object, the same transformation can be reused across data sources and invoked from any system that can make an API call.
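For instance, you could expose the conversion as a small HTTP endpoint so other services can request a CSV export on demand. The Express server and route below are illustrative assumptions rather than part of transform.do; the agent call simply mirrors the one from the walkthrough and reuses the same rules object:

import express from "express";
import { Agent } from "@do/sdk";

const app = express();
app.use(express.json());

const transform = new Agent("transform.do");

// The same declarative rules object defined in the walkthrough below.
const transformations = {
  targetFormat: "csv",
  rules: [ /* ... (the rules defined in the walkthrough) ... */ ],
};

// POST a JSON array of orders, receive the flattened CSV back.
app.post("/orders/export-csv", async (req, res) => {
  const result = await transform.run({
    source: req.body,
    transform: transformations,
  });
  res.type("text/csv").send(result.data);
});

app.listen(3000);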
Stop writing brittle data conversion scripts. By adopting a "Transformation as a Service" model with transform.do, you can turn your most complex ETL headaches into a single, declarative API call. You get cleaner code, more reliable pipelines, and more time to focus on building features, not fighting formats.
Ready to simplify your data pipelines? Visit transform.do to get started and turn your ETL headaches into an elegant service.
[
  {
    "orderId": "ORD-123",
    "customer": { "id": 101, "name": "Alice Johnson" },
    "items": [
      { "sku": "SKU-A", "quantity": 2, "price": 10.00 },
      { "sku": "SKU-B", "quantity": 1, "price": 25.50 }
    ],
    "orderDate": "2024-03-15T11:30:00Z"
  },
  {
    "orderId": "ORD-124",
    "customer": { "id": 102, "name": "Bob Williams" },
    "items": [
      { "sku": "SKU-C", "quantity": 5, "price": 5.75 }
    ],
    "orderDate": "2024-03-16T14:00:00Z"
  }
]
import { Agent } from "@do/sdk";
// Initialize the transformation agent
const transform = new Agent("transform.do");
const sourceData = [
  // ... (paste the JSON from Step 1 here)
];

const transformations = {
  targetFormat: "csv",
  rules: [
    // The agent will intelligently flatten the 'items' array,
    // creating a new row for each item and duplicating parent data.
    { flatten: "items" },

    // Map and rename fields. Use dot notation for nested source fields.
    {
      rename: {
        "orderId": "Order ID",
        "customer.id": "Customer ID",
        "customer.name": "Customer Name",
        "orderDate": "Date",
        "items.sku": "Product SKU",
        "items.quantity": "Quantity",
        "items.price": "Unit Price"
      }
    },

    // Convert field types and formats.
    { convert: { "Date": "date('YYYY-MM-DD')" } },

    // Add a new field calculated from existing data.
    { addField: { "Line Total": "{{Quantity}} * {{Unit Price}}" } }
  ]
};

// Execute the transformation
const result = await transform.run({
  source: sourceData,
  transform: transformations
});
console.log(result.data);
Order ID,Customer ID,Customer Name,Date,Product SKU,Quantity,Unit Price,Line Total
ORD-123,101,Alice Johnson,2024-03-15,SKU-A,2,10.00,20.00
ORD-123,101,Alice Johnson,2024-03-15,SKU-B,1,25.50,25.50
ORD-124,102,Bob Williams,2024-03-16,SKU-C,5,5.75,28.75