TypeError: Cannot read properties of undefined (reading 'content') - Troubleshooting Help

4 Posts
2 Users
0 Reactions
8 Views
NadamHL
(@nadamhl)
Posts: 3
Active Member
Topic starter
 

Describe the problem/error/question

Stack trace

TypeError: Cannot read properties of undefined (reading 'content')
    at ToolCallingAgentOutputParser._baseMessageToString (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/output_parsers/base.cjs:24:31)
    at ToolCallingAgentOutputParser._callWithConfig (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/output_parsers/base.cjs:49:32)
    at ToolCallingAgentOutputParser._callWithConfig (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:223:34)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at ToolCallingAgentOutputParser._streamIterator (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:165:9)
    at ToolCallingAgentOutputParser.transform (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:402:9)
    at RunnableSequence._streamIterator (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:1320:30)
    at RunnableSequence.transform (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:402:9)
    at wrapInputForTracing (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:275:30)
    at pipeGeneratorWithSetup (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/utils/stream.cjs:271:19)
    at RunnableLambda._transformStreamWithConfig (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:296:26)
    at wrapInputForTracing (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:275:30)
    at pipeGeneratorWithSetup (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/utils/stream.cjs:271:19)
    at RunnableLambda._transformStreamWithConfig (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:296:26)
    at RunnableSequence._streamIterator (/usr/local/lib/node_modules/callin.io/node_modules/@langchain/core/dist/runnables/base.cjs:1320:30)

What is the error message (if any)?

Please share your workflow

Share the output returned by the last node

Information on your callin.io setup

  • callin.io version: 1.91.2
  • Database (default: SQLite): sqlite
  • callin.io EXECUTIONS_PROCESS setting (default: own, main): main
  • Running callin.io via (Docker, npm, callin.io cloud, desktop app): docker
  • Operating system: CentOS 7.9
 
Posted : 07/05/2025 1:52 am
NadamHL
(@nadamhl)
Posts: 3
Active Member
Topic starter
 

I set up a custom proxy in front of the chat model: the AI Agent node talks to this proxy, which in turn forwards requests to either OpenAI's or Deepseek's chat model. The responses returned by both the custom proxy and by direct access to the chat model look structurally normal, but the content field is consistently empty. Below is the data returned by our access interface /v1/chat/completions:

{
  "id": "943b2464-b3e4-49a6-a69c-dafdafa23e2",
  "object": "chat.completion",
  "created": 1746524189,
  "model": "deepseek-chat",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "",
        "tool_calls": [
          {
            "index": 0,
            "id": "call_0_1c581ecb-a6a1-4d70-870a-dasfdasfa",
            "type": "function",
            "function": {
              "name": "ai_security_knowledge_base_tool",
              "arguments": "{\"input\":\"[0000-0000]重点业务错误码100001的处理步骤\"}"
            }
          }
        ]
      },
      "logprobs": null,
      "finish_reason": "tool_calls"
    }
  ],
  "usage": {
    "prompt_tokens": 520,
    "completion_tokens": 41,
    "total_tokens": 561,
    "prompt_tokens_details": {
      "cached_tokens": 512
    },
    "prompt_cache_hit_tokens": 512,
    "prompt_cache_miss_tokens": 8
  },
  "system_fingerprint": "fp_8802369eaa_prod0425fp8"
}
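
To double-check the payload on our side, I put together a small standalone Go check that unmarshals a chat.completion body and reports whether choices[0].message and its content field actually survive the trip through the proxy (a null or missing content is a different case from an empty string). The struct and the sample payload below are illustrative only, not our production types:

package main

import (
    "encoding/json"
    "fmt"
)

// Minimal view of a chat.completion payload; only the fields we care about here.
type completion struct {
    Choices []struct {
        Message *struct {
            Role    string  `json:"role"`
            Content *string `json:"content"` // pointer so "" can be told apart from null/missing
        } `json:"message"`
        FinishReason string `json:"finish_reason"`
    } `json:"choices"`
}

func main() {
    // Illustrative sample; in practice this would be the raw body returned by the proxy.
    raw := []byte(`{"choices":[{"message":{"role":"assistant","content":""},"finish_reason":"tool_calls"}]}`)

    var c completion
    if err := json.Unmarshal(raw, &c); err != nil {
        fmt.Println("response is not valid JSON:", err)
        return
    }
    if len(c.Choices) == 0 || c.Choices[0].Message == nil {
        fmt.Println("choices[0].message is missing")
        return
    }
    if c.Choices[0].Message.Content == nil {
        fmt.Println("content is null or missing")
        return
    }
    fmt.Printf("content is present (%q), finish_reason=%s\n",
        *c.Choices[0].Message.Content, c.Choices[0].FinishReason)
}

Run against the response above, this reports content as a present but empty string, which I believe is the normal shape when finish_reason is tool_calls.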

 
Posted : 07/05/2025 1:57 am
NadamHL
(@nadamhl)
Posts: 3
Active Member
Topic starter
 

This is the core logic of our proxy service: it receives the response from Deepseek and relays it directly to the AI Agent.

// Forwarding to Deepseek

func forwardToDeepseek(w http.ResponseWriter, req APIRequest) {
    body, err := json.Marshal(req)
    if err != nil {
        sendError(w, http.StatusInternalServerError, "Failed to marshal request")
        return
    }

    // Build the upstream request to Deepseek's chat completions endpoint
    httpReq, _ := http.NewRequest("POST", "https://api.deepseek.com/v1/chat/completions", bytes.NewReader(body))
    httpReq.Header.Set("Authorization", "Bearer "+"sk-12312312dassdw123123")
    httpReq.Header.Set("Content-Type", "application/json")
    httpReq.Header.Set("Accept", "application/json")

    client := &http.Client{Timeout: 30 * time.Second}
    resp, err := client.Do(httpReq)
    if err != nil {
        sendError(w, http.StatusBadGateway, "Deepseek service error")
        return
    }
    defer resp.Body.Close()

    // Read the full upstream body
    t, err := ioutil.ReadAll(resp.Body)
    if err != nil {
        sendError(w, http.StatusBadGateway, "Failed to read Deepseek response")
        return
    }

    // Relay the raw bytes; fmt.Fprintf(w, string(t)) would interpret any '%' in the body as format verbs
    w.Write(t)
}
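
One thing I noticed while reviewing this is that the handler buffers the body and always answers with Go's default 200 status and text/plain content type, no matter what Deepseek actually returned. I'm considering relaying the upstream status code, Content-Type, and body through unchanged; here is a rough sketch of that idea (standard-library io and net/http only; relayResponse is a hypothetical helper, not what we currently run in production):

// Sketch: relay the upstream status, Content-Type, and body unchanged
func relayResponse(w http.ResponseWriter, resp *http.Response) error {
    // Propagate the upstream content type so the caller still sees application/json
    if ct := resp.Header.Get("Content-Type"); ct != "" {
        w.Header().Set("Content-Type", ct)
    }
    // Propagate the upstream status code; WriteHeader must come after setting headers
    w.WriteHeader(resp.StatusCode)

    // Stream the body through without buffering or reformatting it
    _, err := io.Copy(w, resp.Body)
    return err
}

forwardToDeepseek could then call relayResponse(w, resp) right after client.Do instead of reading and rewriting the body itself, so the AI Agent would receive exactly what Deepseek sent, including error statuses.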
 
Posted : 07/05/2025 2:03 am
For_Gm
(@for_gm)
Posts: 1
New Member
 

I'm experiencing the same issue.

Do you have a contract?

 
Posted : 02/06/2025 3:36 pm