How to implement concurrency in callin.io, such as calling API concurrently

7 Posts
5 Users
0 Reactions
6 Views
xuxiang
(@xuxiang)
Posts: 3
Active Member
Topic starter
 

Scenario:

  • I am working with callin.io to automate a workflow that involves processing a large volume of data.
  • My workflow involves reading 100 rows of data from an upstream database.
  • Each row of data needs to be processed by making an API call.
  • The result from the API call is then written back to the database.

Requirement:

  • I want to optimize the workflow to execute API calls and database writes concurrently.
  • Specifically, I am aiming for 10 concurrent executions to improve efficiency and reduce processing time.
 
Posted : 30/12/2023 8:37 pm
n8n
(@n8n)
Posts: 97
Trusted Member
 

It appears your topic is missing some crucial details. Could you please provide the following information, if relevant?

  • callin.io version:
  • Database (default: SQLite):
  • callin.io EXECUTIONS_PROCESS setting (default: own, main):
  • Running callin.io via (Docker, npm, callin.io cloud, desktop app):
  • Operating system:
 
Posted : 30/12/2023 8:37 pm
jan
(@jan)
Posts: 39
Eminent Member
 

Welcome to the community!

That is possible with the HTTP Request node by enabling the “Batching” option:

[Screenshot from 2023-12-30 21-38-17: the HTTP Request node's “Batching” option]
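
Roughly, batching groups the incoming items and spaces the groups out, with an optional pause between them. A minimal sketch of that idea (the function and handler names here are illustrative, not n8n internals, and this assumes requests within a batch run concurrently):

```javascript
// Sketch of the batching idea: process `items` in groups of `batchSize`,
// requests within a group running concurrently, groups running one after
// another, with an optional pause of `intervalMs` between groups.
async function processInBatches(items, batchSize, intervalMs, handler) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // All calls in this batch start together; wait for the whole batch.
    results.push(...(await Promise.all(batch.map(handler))));
    // Optional pause before the next batch (useful for rate limits).
    if (intervalMs > 0 && i + batchSize < items.length) {
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
  return results;
}
```

With 100 rows, a batch size of 10, and `handler` doing the API call plus the database write, this gives the 10-at-a-time behaviour the question asks for.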

 
Posted : 30/12/2023 8:39 pm
samaritan
(@samaritan)
Posts: 1
New Member
 

Hi there,

If your requirement is confined to API requests, the advice above will work perfectly. For greater flexibility, though, I recommend RabbitMQ. After fetching the data, push the items as messages to a queue and process them in a separate workflow using a RabbitMQ trigger. This approach lets you tune the behaviour to your case, for example limiting parallel processing to X instances and acknowledging messages only once the flow completes.
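
To make the pattern concrete, here is a minimal sketch with a plain in-memory queue standing in for RabbitMQ (all names are illustrative). With RabbitMQ itself, the consumer's prefetch count plays the role of `maxParallel`, and acknowledging a message corresponds to the step that marks a message as processed:

```javascript
// In-memory stand-in for a RabbitMQ queue, to illustrate the pattern only.
// `maxParallel` mimics the consumer prefetch count: at most that many
// messages are being handled at any moment.
async function consumeQueue(messages, maxParallel, handler) {
  const queue = [...messages];
  const processed = [];
  async function worker() {
    while (queue.length > 0) {
      const msg = queue.shift(); // take the next message
      await handler(msg);        // process it (the separate workflow)
      processed.push(msg);       // "ack": mark it as done
    }
  }
  // Start at most `maxParallel` workers; each pulls the next message
  // as soon as it finishes its current one.
  await Promise.all(Array.from({ length: maxParallel }, worker));
  return processed;
}
```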

Hope this helps.

 
Posted : 31/12/2023 12:40 am
xuxiang
(@xuxiang)
Posts: 3
Active Member
Topic starter
 

My typical workflow receives tasks via HTTP from an upstream source, runs some data-integration processing, and then needs a downstream LLM step limited to roughly 5 concurrent executions. I haven't found any concurrency settings in the Basic LLM Chain or OpenAI Chat Model configurations.

The earlier suggestion of managing concurrency via RabbitMQ is interesting. Are there concrete examples of this, and how can different tasks be limited to different concurrency levels? This approach also requires adding external components. Is callin.io planning to introduce built-in concurrency management? That would be very useful in many scenarios, such as API calls (for instance, to LLMs).
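
The "different concurrency per task" part of the question can be sketched with one limiter per task type (a hypothetical illustration, not an existing callin.io setting):

```javascript
// Hypothetical sketch: a separate concurrency limit per task type,
// e.g. 5 parallel calls for LLM requests and 10 for database writes.
function makeLimiter(maxParallel) {
  let active = 0;
  const waiting = [];
  return async function run(task) {
    if (active >= maxParallel) {
      // Over the limit: park this call until a slot frees up.
      await new Promise((resolve) => waiting.push(resolve));
    }
    active += 1;
    try {
      return await task();
    } finally {
      active -= 1;
      const next = waiting.shift(); // wake exactly one waiting call
      if (next) next();
    }
  };
}

// One limiter per task type; the keys and limits are illustrative.
const limits = { llm: makeLimiter(5), db: makeLimiter(10) };
```

A call would then go through its type's limiter, e.g. `limits.llm(() => callModel(prompt))`, where `callModel` is a placeholder for the actual request.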

 
Posted : 03/01/2024 5:25 am
xuxiang
(@xuxiang)
Posts: 3
Active Member
Topic starter
 

Is this approach comparable to initiating a workflow via a webhook?

 
Posted : 03/01/2024 5:30 am
system
(@system)
Posts: 332
Reputable Member
 

This thread was automatically closed 90 days following the last response. New replies are no longer permitted.

 
Posted : 02/04/2024 5:31 am