TTMSFNCCloudAI - Questions about ContextWindow and Streaming

We are currently investigating how to integrate AI into our applications in a way that is useful for our users. TTMSFNCCloudAI comes in handy here, as it is easy to use within our applications.

There are two questions that are not clear to me:

First, the context window / chat history. When I want the context of the chat to be available to the AI, for example using Microsoft Semantic Kernel, there are two functions for this:
ChatHistory.AddUserMessage, which adds the user prompt to the context window, and ChatHistory.AddAssistantMessage, which adds the LLM's answer to the context window.

Is this equivalent to TTMSFNCCloudAI.Context.Add and TTMSFNCCloudAI.AssistantRole.Add?

Second: is a streaming completion service planned, so that the user doesn't have to wait for the full answer to be generated before seeing a result?

Thank you!

With TTMSFNCCloudAI you can keep adding the history to TTMSFNCCloudAI.Context.
We are researching how to make this also work with streamed responses in a future update.
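
A minimal sketch of that pattern, assuming an Execute method with a callback and a result object exposing Success and Content. Only the Context property is confirmed in this thread; the other names (Execute, TTMSFNCCloudAIResult, the role prefixes, TMSFNCCloudAI1 and MemoAnswer as components on the form) are placeholders for illustration and may differ from the actual component API:

```pascal
// Sketch only: keeping the running chat history in TTMSFNCCloudAI.Context.
// The Execute call, its callback signature and AResult.Content are assumptions.
procedure TFormChat.AskQuestion(const AUserPrompt: string);
begin
  // Add the user prompt to the running context,
  // comparable to ChatHistory.AddUserMessage.
  TMSFNCCloudAI1.Context.Add('User: ' + AUserPrompt);

  TMSFNCCloudAI1.Execute(
    procedure(AResult: TTMSFNCCloudAIResult) // assumed callback shape
    begin
      if AResult.Success then
      begin
        // Feed the answer back into the context, comparable to
        // ChatHistory.AddAssistantMessage, so the next Execute call
        // sends the whole conversation.
        TMSFNCCloudAI1.Context.Add('Assistant: ' + AResult.Content);
        MemoAnswer.Lines.Add(AResult.Content);
      end;
    end);
end;
```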
