Can't see AI Assistant view in 6.0.1

Started by Darren Leggett, February 10, 2026, 01:40:10 PM


Darren Leggett

I've done a clean install of 6.0.1 for testing purposes, but I am not seeing the AI Assistant view in either the desktop client or the web UI. I can't find any documentation on how to configure the AI assistant. Can anyone point me in the right direction for further information?

Alex Kirhenshtein


Darren Leggett

Thanks for the quick response with the documentation. I've now configured a provider (OpenAI), but I'm still not getting the AI Assistant view.
I've enabled debug logging, but the only AI-related message I'm getting in netxmsd.log is

2026.02.10 15:01:00.645 *W* [ai.assistant       ] AI assistant disabled (no functions or skills registered)

Any ideas what I need to do next would be a great help.

Alex Kirhenshtein

Check the server log; at debug level 2 or higher there should be messages produced by this server code:

   nxlog_debug_tag(DEBUG_TAG, 2, L"%d global functions registered", static_cast<int>(s_globalFunctions.size()));
   nxlog_debug_tag(DEBUG_TAG, 2, L"%d skills registered", static_cast<int>(GetRegisteredSkillCount()));


are they non-zero?

Alex Kirhenshtein

Also set the debug tag ai.skills to 6 or higher and look for skill loading messages.
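A minimal sketch of what that could look like in netxmsd.conf (the DebugTags syntax follows the example given later in this thread; the chosen levels are just a suggestion):

```ini
# Raise verbosity for the skill loader and the assistant itself
DebugTags=ai.skills:6,ai.assistant:6
```

Restart netxmsd after changing the config so the new debug tags take effect.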

Victor Kirhenshtein


justrest

We are experiencing the same issue. Please provide guidance. Thank you very much.

## Logging
# Log file name
LogHistorySize=7
LogRotationMode=1
LogFile=/var/log/netxmsd

# Increase logging verbosity, 0 (only errors) to 9 (verbose debug)
#DebugLevel=3

## Option #2 - PostgreSQL (recommended):
DBDriver=pgsql.ddr
DBServer=127.0.0.1
DBName=netxms_db
DBLogin=netxms
DBPassword=xm123456


[AI/Provider/openai]
Type = openai
URL = http://10.61.7.101:8008/v1/chat/completions
Model = qwen3-30b-a3b-instruct-2507
Token = axxxxxxxxxxxxx
Slots = default

Alex Kirhenshtein

Make sure that you have

Module=aitools

in the core section of netxmsd.conf.
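For reference, a minimal sketch of how the relevant parts of netxmsd.conf fit together (provider values here are copied from the config posted above and are placeholders for your own environment):

```ini
# Core section: load the AI tools module so functions/skills get registered
Module=aitools

[AI/Provider/openai]
Type = openai
URL = http://10.61.7.101:8008/v1/chat/completions
Model = qwen3-30b-a3b-instruct-2507
Token = axxxxxxxxxxxxx
Slots = default
```

With the module loaded, the "AI assistant disabled (no functions or skills registered)" warning should no longer appear after a restart.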

justrest

Thank you very much for your help. May I ask whether NetXMS supports Tongyi Qianwen (Qwen) models that comply with the OpenAI API specification? The Qwen3-30B model received the chat content sent via AI Tools, but the client ultimately showed: Error: Cannot get response from AI assistant.

The relevant content of the netxmsd log is as follows:
2026.02.11 17:26:37.135 *D* [ai.assistant       ] Using provider "qwen3" for slot "interactive"
2026.02.11 17:26:37.135 *D* [llm.chat           ] Added message to chat: role="user", content="server uptime"
2026.02.11 17:26:37.136 *D* [client.session.0   ] Sending compressed message CMD_OBJECT_UPDATE (1920 bytes)
2026.02.11 17:26:37.137 *D* [ai.provider        ] Sending request to http://10.61.7.101:8008/v1/chat/completions: {"model": "qwen3-30b-a3b-instruct-2507", "stream": false, "messages": [{"role": "system", "content": "You are

Filipp Sudanov

It should be supported. Please try adding this to the main section of the server config:

DebugTags=ai.provider:8,ai.prov.openai:8,llm.chat:8,ai.assistant:8

and share the server log file.

Victor Kirhenshtein

I've found the issue. When no tool calls are needed, your model returns tool_calls as an empty array, but the server code only checked for a missing or null tool_calls field. I've just fixed that; the upcoming patch release should work as expected.

justrest

Wow, that's awesome, thank you so much!