Working with the new OpenAI GPT-3.5-16k model

11 Posts
3 Users
0 Reactions
4 Views
ManyQuestions
(@manyquestions)
Posts: 6
Active Member
Topic starter
 

OpenAI has just released updated versions of the GPT-3.5 and GPT-4 models. How can they be implemented?

I'm aware that the new model is already available in the OpenAI node. However, I'm wondering how to invoke a function with it.

Does anyone have any suggestions?

 
Posted : 16/06/2023 6:14 am
Jon
(@jon)
Posts: 96
Trusted Member
 

Hi there,

We do not currently have a function option within the node. I've just created an internal development ticket to implement this feature. However, it might not be a straightforward modification, as it appears the function needs to be integrated directly into the node. This could involve adding a dedicated function field and developing a method for sandboxing it, much like the code node.

Our internal tracking reference for this is NODE-596.

 
Posted : 16/06/2023 6:53 am
ManyQuestions
(@manyquestions)
Posts: 6
Active Member
Topic starter
 

Thanks a lot. Is there a workaround, perhaps by using a code node combined with an HTTP request node?

 
Posted : 16/06/2023 7:05 am
Jon
(@jon)
Posts: 96
Trusted Member
 

Hello,

Looking at the API example, you could utilize an HTTP Request node to send the request. However, it appears the function needs to be executed on the response. You might be able to achieve this using just a code node if you're feeling adventurous.
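
Very roughly, the JSON body you would POST to https://api.openai.com/v1/chat/completions from the HTTP Request node could look something like the below. This is only a sketch of the documented functions format; get_current_weather is a placeholder for your own definition, and you would set an Authorization: Bearer <your key> header on the node.

```json
{
  "model": "gpt-3.5-turbo-16k",
  "messages": [
    { "role": "user", "content": "What is the weather like in Berlin today?" }
  ],
  "functions": [
    {
      "name": "get_current_weather",
      "description": "Get the current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {
          "city": { "type": "string", "description": "City name" }
        },
        "required": ["city"]
      }
    }
  ],
  "function_call": "auto"
}
```

The response then contains either a normal assistant message or a function_call object with the function name and JSON-encoded arguments, and actually executing that function is the part you would still have to handle yourself, for example in a code node.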

 
Posted : 16/06/2023 7:07 am
ManyQuestions
(@manyquestions)
Posts: 6
Active Member
Topic starter
 

I feel like I know how to do it in theory, but actually doing it is a different story.

😅

I suppose it's a combination of this:

And this:

But yes, I could ask ChatGPT to combine them...

😂

 
Posted : 16/06/2023 7:11 am
ManyQuestions
(@manyquestions)
Posts: 6
Active Member
Topic starter
 

The example above uses GPT-4, but I suspect GPT-3.5 would be more practical.

 
Posted : 16/06/2023 7:16 am
Jon
(@jon)
Posts: 96
Trusted Member
 

Hello,

It will be identical for both; the only difference would be the model. However, remember that it would need to be JavaScript, and you would be sending the HTTP request from the code node, so you'd have to include some imports.
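
As a very rough sketch of what that could look like in the code node: this assumes your instance lets you require an external HTTP client such as axios there (on self-hosted setups that usually needs an environment setting) and that your API key is reachable as an environment variable; the get_current_weather definition is just a placeholder.

```javascript
// Sketch only: first request to the chat completions endpoint with a
// function definition, sent from a code node.
const axios = require('axios');

const body = {
  model: 'gpt-3.5-turbo-16k', // or the GPT-4 equivalent; the request is otherwise identical
  messages: [
    { role: 'user', content: 'What is the weather like in Berlin today?' },
  ],
  // Placeholder function definition; describe your own with a JSON Schema.
  functions: [
    {
      name: 'get_current_weather',
      description: 'Get the current weather for a city',
      parameters: {
        type: 'object',
        properties: { city: { type: 'string' } },
        required: ['city'],
      },
    },
  ],
  function_call: 'auto',
};

const response = await axios.post(
  'https://api.openai.com/v1/chat/completions',
  body,
  { headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` } }
);

// The returned message either answers directly or carries a function_call
// with the function name and JSON-encoded arguments.
return [{ json: response.data.choices[0].message }];
```

Returning [{ json: ... }] at the end just passes the message on to the next node.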

 
Posted : 16/06/2023 7:23 am
ManyQuestions
(@manyquestions)
Posts: 6
Active Member
Topic starter
 

Apologies for the delayed response. So, how can I integrate this into callin.io?

 
Posted : 22/06/2023 7:28 pm
Jon
(@jon)
Posts: 96
Trusted Member
 

Hey,

You would need to utilize a code node and manually construct your request to interact with the API, following the documentation provided by OpenAI. This might be challenging and require some effort, as we currently lack a specific example for this process.
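
To give a rough idea of the part the node can't do for you yet, handling a function_call in the response could look something like this. Again, this is only a sketch: get_current_weather and resolveFunctionCall are made-up names, and it assumes axios and your API key are available as in the earlier snippet.

```javascript
// Sketch only: if the model asked for a function, run it locally, send the
// result back as a "function" role message, and return the final answer.
const axios = require('axios');

// Made-up local implementation of the placeholder function.
function getCurrentWeather({ city }) {
  return { city, temperature: 21, unit: 'celsius' }; // dummy data
}

async function resolveFunctionCall(message, messages, functions) {
  // No function requested: the model answered directly.
  if (!message.function_call) return message;

  // The arguments arrive as a JSON string chosen by the model.
  const args = JSON.parse(message.function_call.arguments);
  const result = getCurrentWeather(args);

  // Append the assistant's function_call and our result, then ask again.
  const followUp = [
    ...messages,
    message,
    {
      role: 'function',
      name: message.function_call.name,
      content: JSON.stringify(result),
    },
  ];

  const response = await axios.post(
    'https://api.openai.com/v1/chat/completions',
    { model: 'gpt-3.5-turbo-16k', messages: followUp, functions },
    { headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` } }
  );

  return response.data.choices[0].message;
}
```

You would call that with the message returned by the first request plus the same messages and functions arrays you originally sent.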

However, we do have this PR available: feat(OpenAI Node): Support for functions by michael-radency · Pull Request #6508 · n8n-io/n8n · GitHub

This will introduce function support to the OpenAI node. I don't have a precise timeline for its release, as it depends on the internal review process, but I anticipate it will be included in a new release within the next week or two. If you require function support sooner, you would need to fall back to the code node approach or perhaps build a custom image from this PR to test it out.

 
Posted : 23/06/2023 6:13 am
ManyQuestions
(@manyquestions)
Posts: 6
Active Member
Topic starter
 

No need, thank you. The timeframe is sufficient. We appreciate your work!

 
Posted : 23/06/2023 8:12 am
system
(@system)
Posts: 332
Reputable Member
 

This discussion was automatically closed 90 days following the last response. New responses are no longer permitted.

 
Posted : 21/09/2023 8:12 am