My objective is to retrieve a CSV file that arrives via email on a regular basis and use it to update the contents of a Google Sheet. I previously had a working method that found and updated rows, or added new ones, but the data volume has grown significantly, causing performance issues. Therefore, I need to perform a bulk update of a Google Sheet using a CSV file from my inbox.
For development, I fetch a small sample CSV from Google Drive and run the scenario against it.
I am using the CSV Parse operation, feeding its output into an Array Aggregator, then into a Create JSON module, and finally passing the result to a Google Sheets API Call module within callin.io.
The desired JSON structure is as follows:
{
"valueInputOption": "RAW",
"data": [
{
"range": "Zap!A:D",
"majorDimension": "ROWS",
"values": [
["col1","col2","col3","col4"],
["col1","col2","col3","col4"],
["col1","col2","col3","col4"],
]
}
]
}
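For reference, this structure is the request body of the Sheets API spreadsheets.values.batchUpdate method. Outside callin.io, building it from a CSV would look roughly like the Python sketch below; the file name and range are placeholders.

import csv
import json

# Build the values.batchUpdate request body from a CSV file.
# "testdata.csv" and the "Zap!A:D" range are placeholders.
with open("testdata.csv", newline="") as f:
    rows = list(csv.reader(f))  # one inner list per CSV row

payload = {
    "valueInputOption": "RAW",
    "data": [{
        "range": "Zap!A:D",
        "majorDimension": "ROWS",
        "values": rows,
    }],
}
print(json.dumps(payload, indent=2))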
I am encountering difficulties in aggregating the CSV data into the values array in the correct format: an array for each row, with each value within it. I have mapped the Array Aggregator to the data element in the JSON object, and it is outputting the items into the values attribute. However, this results in multiple data elements in the JSON, each corresponding to a row, and they all target A:W, leading to the same row being repeatedly overwritten.
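To illustrate with placeholder values, I end up with the first shape below, when what the API needs is the second: a single data element whose values array holds every row.

# What I'm getting: one "data" element per row, all starting at the same
# open-ended range, so each write begins at row 1 and overwrites the last.
bad = {
    "valueInputOption": "RAW",
    "data": [
        {"range": "Zap!A:W", "majorDimension": "ROWS", "values": [["row 1 values"]]},
        {"range": "Zap!A:W", "majorDimension": "ROWS", "values": [["row 2 values"]]},
    ],
}

# What I want: one "data" element containing all rows at once.
good = {
    "valueInputOption": "RAW",
    "data": [
        {"range": "Zap!A:W", "majorDimension": "ROWS",
         "values": [["row 1 values"], ["row 2 values"]]},
    ],
}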
The JSON object I have constructed appears like this:
If I set the Array Aggregator to Custom, the output at this stage cannot convert the associative array into text elements. If I use Collection, the data passed to the callin.io API is not in the correct format:
It seems I should be attempting something like this:
However, I am unable to figure out how to correctly map that array into the values field.
It appears I'm missing a crucial step in achieving this conversion. Does anyone have any suggestions?
blueprint.json (25.1 KB)
This is the closest blueprint to a working version that I've been able to create.
Welcome to the callin.io community!
Could you also provide an example CSV that would translate into the example JSON you provided?
testdata.csv (664 Bytes)
Here's some synthetic test data I've run through the provided blueprint, which illustrates the problem.
Thanks,
It appears that data structures cannot be imported or exported alongside blueprints, so I'm unable to assist with this particular request.
I suppose the most straightforward way to describe the issue is:
How can I convert a CSV file into a JSON array of arrays within callin.io?
Everything else seems simple enough, but when I structure any JSON object, it appears I can only connect an Array Aggregator to the more structured parts of the data.
Given this CSV:
Orderno,Orderdate,Company
BSOM1002,2022-12-15,Shop 1
BSOM1002,2022-12-15,Shop 2
How can I achieve this result?
[
["Orderno","ORderDate","Company"],
["BSOM1002","2022-12-15","Shop 1"],
["BSOM1002","2022-12-15","Shop 2"]
]
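For comparison, outside callin.io this is essentially a one-liner; here is a Python sketch with the sample data inline:

import csv, io, json

csv_text = "Orderno,Orderdate,Company\nBSOM1002,2022-12-15,Shop 1\nBSOM1002,2022-12-15,Shop 2"

# Parse the CSV and emit a JSON array of arrays, one inner array per row.
rows = list(csv.reader(io.StringIO(csv_text)))
print(json.dumps(rows))
# [["Orderno", "Orderdate", "Company"], ["BSOM1002", "2022-12-15", "Shop 1"], ["BSOM1002", "2022-12-15", "Shop 2"]]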
In that scenario, it's quite straightforward. You need just one module; no Iterators, Aggregators, or Create CSV modules are required:
Output
Give it a try and let us know if you encounter any difficulties!
Module Export
You can copy and paste this module export into your scenario. This will paste the modules shown in my screenshots above.
- Copy the JSON code below by clicking the copy button that appears when you mouse over the top-right of the code block.
- Enter your scenario editor. Press ESC to close any dialogs. Press CTRL+V (the paste keyboard shortcut on Windows) to paste the modules directly onto the canvas.
- Click on each imported module and save it for validation. You may be prompted to remap some variables and connections.
{
"subflows": [
{
"flow": [
{
"id": 33,
"module": "util:ComposeTransformer",
"version": 1,
"parameters": {},
"mapper": {
"value": "Orderno,Orderdate,CompanynBSOM1002,2022-12-15,Shop 1nBSOM1002,2022-12-15,Shop 2"
},
"metadata": {
"designer": {
"x": 14,
"y": -733,
"name": "CSV"
},
"restore": {},
"expect": [
{
"name": "value",
"type": "text",
"label": "Text"
}
]
}
},
{
"id": 34,
"module": "util:ComposeTransformer",
"version": 1,
"parameters": {},
"mapper": {
"value": "[n ["{{join(split(first(split(33.value; newline)); ","); """,""")}}"],n ["{{join(slice(split(replace(33.value; ","; ""","""); newline); 1); emptystring + """]," + newline + "[""" + emptystring)}}"]n]"
},
"metadata": {
"designer": {
"x": 258,
"y": -731,
"name": "JSON",
"messages": [
{
"category": "last",
"severity": "warning",
"message": "A transformer should not be the last module in the route."
}
]
},
"restore": {},
"expect": [
{
"name": "value",
"type": "text",
"label": "Text"
}
]
}
}
]
}
],
"metadata": {
"version": 1
}
}
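In case the formula in module 34 is hard to read, here is roughly what the two expressions compute, sketched in Python. Note the assumption: no field contains embedded commas, quotes, or newlines.

csv_text = "Orderno,Orderdate,Company\nBSOM1002,2022-12-15,Shop 1\nBSOM1002,2022-12-15,Shop 2"

# First expression: quote-join the header fields.
header = '","'.join(csv_text.split("\n")[0].split(","))

# Second expression: wrap every comma in quotes, drop the header row,
# then join the remaining rows with a closing/opening bracket pair.
body = '"],\n["'.join(csv_text.replace(",", '","').split("\n")[1:])

json_text = '[\n ["' + header + '"],\n ["' + body + '"]\n]'
print(json_text)
# [
#  ["Orderno","Orderdate","Company"],
#  ["BSOM1002","2022-12-15","Shop 1"],
#  ["BSOM1002","2022-12-15","Shop 2"]
# ]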
That's quite interesting and very helpful, thank you.
In the actual data, there are 23 columns, so I'm a bit concerned about the maintainability of a series of nested join/split operations if more columns are added in the future.
However, you've just shown me that I can copy and paste JSON to and from the scenario editor. This has allowed me to rapidly auto-generate some definitions within a macro environment.
I've managed to set this up now:
It's still a bit complex, as there are a couple of nested replace operations to handle newline characters and quotation marks within a Text constructor:
This is then assembled into some text as valid JSON:
And then parsed against a JSON description:
The JSON used to create the schema was:
{
"values": [
["one","two","three"]
]
}
Then, the JSON can be created:
And finally, calling the API with that:
It's a bit more involved, but perhaps more manageable than the join/slice method?
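For clarity, the escaping those nested replace operations perform is conceptually this (a Python sketch; my modules only handle quotes and newlines, though a fuller version would escape backslashes first):

def json_escape(field: str) -> str:
    # Escape the characters that would break a hand-assembled JSON string.
    return (field.replace("\\", "\\\\")
                 .replace('"', '\\"')
                 .replace("\n", "\\n"))

row = ['Shop "1"', "line1\nline2"]
json_row = "[" + ",".join('"' + json_escape(f) + '"' for f in row) + "]"
print(json_row)  # ["Shop \"1\"","line1\nline2"]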
You can map module 20's output directly into module 21's JSON field, which avoids an extra operation.
When I attempted that, I encountered an escaped string literal that was incompatible with the final API payload. For an unknown reason, including that string directly failed, which led me to parse it into a JSON object and then utilize that. It wasn't quite the outcome I had anticipated.
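My best guess is that this was double encoding: mapping already-serialized JSON text into a string field escapes it as a string literal instead of embedding a real array. A Python sketch of the difference, with placeholder values:

import json

rows_text = '[["a","b"],["c","d"]]'  # already-serialized JSON text

# Embedding the text as a string value escapes it, which the API rejects:
print(json.dumps({"values": rows_text}))
# {"values": "[[\"a\",\"b\"],[\"c\",\"d\"]]"}

# Parsing it first embeds a real array, which the API accepts:
print(json.dumps({"values": json.loads(rows_text)}))
# {"values": [["a", "b"], ["c", "d"]]}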
I had this same requirement, searched for a solution, and after not finding anything satisfactory, worked out a good approach myself.
I'm receiving a CSV file via webhook containing event registration data. The goal is to transform each row into a distinct collection within a JSON object, which will then be sent to an API for conversion and reformatting.
In my setup, I fetch the CSV file over HTTP and then rename it, upload it, and convert it to Google Sheets (these are the first three modules).
This process enables me to utilize Google Sheets modules for handling the file.
Next, I use the Google Sheets Search Rows module, referencing the file ID and sheet name obtained from the Google Drive upload.
This step yields each row as an individual collection.
Following that, I aggregate these collections into a JSON format, conforming to your required custom data structure.
Looking ahead, if you introduce additional columns to the CSV, the only modification needed will be within the custom data structure of the Aggregate to JSON module.
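For illustration, each row comes out as its own collection, much as Python's csv.DictReader would produce; the file name below is a placeholder:

import csv

# Each CSV row becomes a collection (dict) keyed by the header row,
# mirroring what Search Rows returns per row.
with open("registrations.csv", newline="") as f:
    collections = list(csv.DictReader(f))
# e.g. [{"Orderno": "BSOM1002", "Orderdate": "2022-12-15", "Company": "Shop 1"}, ...]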