Hello!
New to callin.io, appreciate the assistance. I have a workflow that makes an API call to Shopify to retrieve collections in our shop, then gathers the products within those collections. It then filters these products using specific tags and finally creates records in Airtable. The current workflow does this, but it's only capturing one product per collection (resulting in 42 products from 42 collections). I'm looking to have it process all products within each collection to filter out those with specific tags. Is this achievable? The flow and steps are detailed below. Thanks!
Flow
Collections (Bundles), can see products within when expanded
Only 42 Products Retrieved from the 42 Collections
Filter with specific Tags Yields 20 Results
Is it possible to loop through the products within the collection bundles to read and filter each one? Or is there another workaround to achieve the ultimate goal?
Thanks!
How did you map those iterators?
Also, your filter seems to be set up incorrectly. You need to specify the bundle number within those brackets, like this: 3.Body.products[__BUNDLE NUMBER__]:tags
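To illustrate what that bundle indexing means, here is a minimal Python sketch. The data shape is a hypothetical stand-in for the `3.Body.products[n]:tags` path, not the platform's actual output:

```python
# Hypothetical response shape mirroring "3.Body.products[n]:tags".
response_body = {
    "products": [
        {"title": "Jade Bracelet", "tags": "lunar-new-year, bracelet"},
        {"title": "Silk Scarf", "tags": "scarf"},
    ]
}

wanted = "bracelet"

# Each product must be addressed individually (by its bundle number);
# the filter then checks that product's own tags field.
matches = [
    p for p in response_body["products"]
    if wanted in [t.strip() for t in p["tags"].split(",")]
]
print([p["title"] for p in matches])  # only products carrying the wanted tag
```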
Hi,
Thanks for your message. I utilized the pre-built "Export Shopify Collections to Google Sheets" template, but modified it to use Airtable at the end.
Here's the workflow I'm trying to achieve. I'm limited to 10 images per post, so I'll try to focus on the main challenge.
1. callin.io retrieves the shop collections from Shopify. This step works correctly.

2. The iterator then separates each shop collection into its own bundle (42 in total). This also works. Here's how it's mapped:
Example: Bundle 5 represents the collection titled "2024 Lunar New Year Bracelets".

3. Shopify retrieves the products from each shop collection (bundle from step 2). These products are output into new bundles, with "Body" and "Header" representing the collections. Within the "Body" section, I can see the products associated with each shop collection bundle from step 2. For instance, Bundle 5, "2024 Lunar New Year Bracelets", has 5 associated products. I've included a screenshot drilling down into an individual product, showing the Title, Body HTML, and Image URL, all of which I map later.

4. My goal is to iterate through the "Body" (collection) within each bundle from the previous step. This should yield 42 shop collections (bundles) multiplied by the number of products in each. However, I'm only getting 42 results, one per collection. The iterator's mapping is shown here:

5. Next, I filter the products from all bundles to keep only those with the specific tags shown in the screenshot:

6. The text parser then converts the product descriptions that pass the filter into plain text. This works, but again it's only processing 42 products instead of the hundreds I expect.

7. Finally, the data is mapped into Airtable, producing the desired outcome: the Title, the product description converted from HTML, the Image URL, and the matching filter tag for each product.
So, how can I adjust the process to iterate through every product in each bundle, ensuring that all relevant products pass through my filter? This would allow me to have approximately 700 entries in Airtable, rather than the current 42.
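The fan-out described above (every product of every collection, then a tag filter, then one record per match) can be sketched in Python. All names and the data shape here are illustrative assumptions, not the platform's API:

```python
# Illustrative sketch: flatten 42 collections into one record per product
# that carries a wanted tag. Field names are assumptions.

def flatten_and_filter(collections, wanted_tags):
    records = []
    for coll in collections:                      # one bundle per collection
        for product in coll["body"]["products"]:  # iterate EVERY product, not just one
            tags = {t.strip() for t in product["tags"].split(",")}
            matched = tags & wanted_tags
            if matched:                           # keep only products with wanted tags
                records.append({
                    "Title": product["title"],
                    "Description": product["body_html"],
                    "Image URL": product.get("image_url", ""),
                    "Tag": ", ".join(sorted(matched)),
                })
    return records

# Tiny two-collection example standing in for the 42 real bundles.
collections = [
    {"body": {"products": [
        {"title": "Lunar Bracelet", "tags": "bracelet, 2024", "body_html": "<p>Red</p>"},
        {"title": "Plain Cord", "tags": "cord", "body_html": "<p>Black</p>"},
    ]}},
    {"body": {"products": [
        {"title": "Jade Ring", "tags": "ring, 2024", "body_html": "<p>Green</p>"},
    ]}},
]

rows = flatten_and_filter(collections, {"2024"})
print(len(rows))  # one row per matching product across all collections
```

With the real data, the inner loop is what produces the ~700 rows instead of 42: the iteration has to happen over the products array inside each bundle, not over the bundles alone.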
Thanks for your assistance!
Thank you very much! That worked perfectly for retrieving the information I need, and it helps me better understand how to use the modules. However, it keeps timing out before it finishes reading all the products.
Is there a workaround for that error? Or would it be beneficial if I reconfigured to select specific shop collections (bundles)? If so, would I write something like this to choose Bundle 5, for example:
Thanks!
You can use the sleep module to introduce a pause of a second or two between modules, or the break module (error handler) to retry after a specified delay when a module fails.
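The pause-and-retry behavior those two modules provide looks roughly like this in Python. The function name, retry count, and delay are assumptions for illustration:

```python
import time

# Illustrative retry-with-pause loop, analogous to the sleep and break
# modules described above. Names and timings are assumptions.

def call_with_retries(fetch, retries=3, pause_seconds=2):
    for attempt in range(1, retries + 1):
        try:
            return fetch()
        except TimeoutError:
            if attempt == retries:
                raise                     # give up after the last attempt
            time.sleep(pause_seconds)     # pause before retrying, like the sleep module

# Usage: a hypothetical fetch that times out twice before succeeding.
calls = {"n": 0}

def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("Shopify request timed out")
    return "products payload"

result = call_with_retries(flaky_fetch, retries=3, pause_seconds=0)
```

Spacing the requests out keeps a long run of per-collection calls under the timeout, at the cost of a slower overall scenario.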
Just wanted to mention that this was also part of the solution. Thank you so much for your assistance, everything is functioning perfectly now. I truly appreciate it!