TMSFNCCloudAI with Local LLM

LM Studio can load many LLMs locally and exposes an "OpenAI"-like API with a subset of the endpoints available.
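
For context, LM Studio's local server listens on http://localhost:1234/v1 by default and accepts the standard OpenAI chat-completions payload. Here is a minimal Delphi RTL sketch showing that (plain THTTPClient, no third-party units; the port and the "local-model" name are placeholders for whatever your LM Studio instance is configured with):

  program LMStudioChat;

  {$APPTYPE CONSOLE}

  uses
    System.SysUtils, System.Classes, System.Net.URLClient, System.Net.HttpClient;

  const
    // LM Studio's default local endpoint; change the port if you remapped it.
    URL = 'http://localhost:1234/v1/chat/completions';
    // "local-model" stands in for whichever model you loaded in LM Studio.
    Body = '{"model":"local-model","messages":[{"role":"user","content":"Hello"}]}';

  var
    Client: THTTPClient;
    Req: TStringStream;
    Resp: IHTTPResponse;
  begin
    Client := THTTPClient.Create;
    Req := TStringStream.Create(Body, TEncoding.UTF8);
    try
      // POST the chat request as JSON and print the raw JSON response.
      Resp := Client.Post(URL, Req, nil,
        [TNetHeader.Create('Content-Type', 'application/json')]);
      Writeln(Resp.ContentAsString(TEncoding.UTF8));
    finally
      Req.Free;
      Client.Free;
    end;
  end.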

I need to know whether TMSFNCCloudAI can connect to a local IP/localhost using the aiOpenAI service option. If not, I think you should really test their application with a view to supporting it; it's definitely great software to interface with...

Yes, it can.
The Ollama LLM service we support is based on a locally running Ollama server.
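
For anyone finding this thread later, here is roughly what the setup looks like against a locally running Ollama server. Treat it as a sketch: the Service/Context/Execute names follow the published TMS demos, but the unit name and the model property are assumptions, so verify them against the component version you have installed.

  uses
    VCL.TMSFNCCloudAI; // unit name is an assumption; check your install

  procedure TForm1.AskLocalModel;
  begin
    // aiOllama targets the Ollama server on localhost; no API key required.
    TMSFNCCloudAI1.Service := aiOllama;
    // Assumption: the model name matches one pulled via "ollama pull llama3".
    TMSFNCCloudAI1.Settings.Model := 'llama3';
    TMSFNCCloudAI1.Context.Text := 'Why is the sky blue?';
    // The answer is delivered asynchronously through the component's
    // completion event once the local server responds.
    TMSFNCCloudAI1.Execute;
  end;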

Thanks for the answer, I'm definitely going to test it now.