Processing 10 bundles at a time or splitting CSV

4 Posts
3 Users
0 Reactions
4 Views
Javier_Vinas
(@javier_vinas)
Posts: 2
New Member
Topic starter
 

Hi automation enthusiasts!

I'm facing a challenge and could use some advice on how to tackle it.

Essentially, I need to extract a CSV file from a Gmail attachment, transform it into a specific JSON structure, and then upload it to an Airtable base. I've successfully set up all the individual steps and they are working. However, the Airtable API accepts a maximum of 10 records per API call.

Therefore, I believe I need to implement a method to:

  • Segment the CSV file into batches of 10 rows, OR process the JSON file in chunks of 10 records at a time.
  • Iterate through these chunks until the entire file has been processed.

Here's a look at my current setup for reference:

My JSON module contains 25 records, so I need to process a batch of 10, then another 10, and finally the remaining 5. If that makes sense.
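
In plain JavaScript terms, the batching I'm after looks roughly like this (just a sketch of the idea, not my actual scenario; the sample records are made up):

```javascript
// Split an array of records into chunks of at most `size` elements,
// e.g. 25 records -> chunks of 10, 10 and 5.
function chunk(records, size) {
  const chunks = [];
  for (let i = 0; i < records.length; i += size) {
    chunks.push(records.slice(i, i + size));
  }
  return chunks;
}

// Dummy data standing in for my 25 JSON records
const myRecords = Array.from({ length: 25 }, (_, i) => ({ id: i + 1 }));
const batches = chunk(myRecords, 10);
console.log(batches.map(b => b.length)); // [10, 10, 5]
```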

I would appreciate some guidance on this matter.

:slight_smile:

Best regards

 
Posted : 01/02/2023 11:07 am
Francisco_Fontes
(@francisco_fontes)
Posts: 7
Active Member
 

Hello, I'm sharing a method to accomplish what you're asking for.

Process executed with 99 records from the CSV:

  • Iterating the array in chunks of 10 elements
  • Splitting the array into segments of 10 elements
  • Generating JSON for Airtable entry
  • Bulk data entry into Airtable
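
If it helps to see that flow as plain code, here is a rough JavaScript sketch of the same idea (the CSV parsing and field names are simplified placeholders, not the actual modules):

```javascript
// Sample CSV text standing in for the Gmail attachment
const csvText = "Nombre,Email\nAna,ana@example.com\nLuis,luis@example.com";

// Parse the CSV into an array of row objects (naive split, no quoted fields)
const [header, ...lines] = csvText.trim().split("\n");
const keys = header.split(",");
const rows = lines.map(line => {
  const values = line.split(",");
  return Object.fromEntries(keys.map((key, i) => [key, values[i]]));
});

// Split the array into segments of 10 elements
const chunks = [];
for (let i = 0; i < rows.length; i += 10) {
  chunks.push(rows.slice(i, i + 10));
}

// Generate one Airtable "records" payload per chunk (max 10 records per request)
const payloads = chunks.map(chunk => ({
  records: chunk.map(row => ({ fields: row })),
}));

// Each payload would then be one bulk data entry call to Airtable
console.log(`${payloads.length} request(s) needed for ${rows.length} rows`);
```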

 
Posted : 01/02/2023 12:22 pm
Javier_Vinas
(@javier_vinas)
Posts: 2
New Member
Topic starter
 

Hi Francisco,

Brilliant work! Thank you very much for providing all the instructions.

I haven't tested the solution in my scenario yet, but I'm quite confident it will work. I've marked it as the solution.

Thanks, maestro!

:smile:

 
Posted : 01/02/2023 1:34 pm
gneuman
(@gneuman)
Posts: 1
New Member
 

Hello, here's how I'd approach solving this issue. It requires some coding, but the advantage is that you can upload more records per run.

  1. Create a large JSON file containing 100 or more records and send it to Airtable.
  2. In Airtable, set up a simple automation triggered by a new record. This requires running a JavaScript script, which I believe is available in the Pro plan.
  3. This is the script you'll need to use:

```javascript
// Input variables passed in by the Airtable automation
let params = input.config();
let contentId = params.contentId;
let jsonCSV = params.jsonCSV;

// Parse the JSON string built from the CSV into an array of objects
let records = JSON.parse(jsonCSV);

// Map each row to the { fields: ... } shape expected by the Records table
let createArr = records.map(obj => {
    return {
        fields: {
            "IP": obj.IP,
            "Sexo": obj.Sexo,
            "Email": obj.Email,
            "Nombre": obj.Nombre,
            "Apellido": obj.Apellido,
        }
    };
});

let table = base.getTable("Records");

// createRecordsAsync accepts at most 50 records per call, so create in slices of 50
while (createArr.length > 0) {
    await table.createRecordsAsync(createArr.slice(0, 50));
    createArr = createArr.slice(50);
}
```
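
For reference, the jsonCSV input variable passed into the script would contain something like this (sample values only, matching the field names used above):

```javascript
// Example of the JSON string the automation hands to input.config() as jsonCSV
let jsonCSV = JSON.stringify([
  { IP: "203.0.113.5", Sexo: "F", Email: "ana@example.com", Nombre: "Ana", Apellido: "García" },
  { IP: "203.0.113.9", Sexo: "M", Email: "luis@example.com", Nombre: "Luis", Apellido: "Pérez" }
]);
```

The script slices in groups of 50 because createRecordsAsync accepts at most 50 records per call, which is what lets you load far more records per run than the 10-record REST API limit.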

I recorded a video (in Spanish) demonstrating this process, which you can view here: ¿Cómo crear más de 50 récords al mismo tiempo en Airtable? - YouTube

 
Posted : 10/02/2023 3:35 pm