TMSFNCCloudAI Demo: Perplexity doesn't work

Hello,

I tried the TMSFNCCloudAI Demo, and when I chose Perplexity I got this error message:

{"error":{"message":"Invalid model 'llama-3.1-sonar-small-128k-online'. Permitted models can be found in the documentation at Overview - Perplexity":400}}

Then I noticed that the Settings/Perplexity model was set to "llama-3.1-sonar-small-128k-online", so I first set it to "sonar", then to "sonar-reasoning", but the error message stayed the same; it kept mentioning 'llama-3.1-sonar-small-128k-online'.

I searched for 'llama-3.1-sonar-small-128k-online' in the project and didn't find it. Maybe I am mistaken, but it seems that this model name is hard-wired into the software's source code.

Can you check it, please? Thank you very much!

I retested this here with the following code, using the model 'sonar-pro':

begin
  // set the Perplexity API key
  TMSFNCCloudAI1.APIKeys.Perplexity := KEY_PERPLEXITY;
  // select a currently supported Perplexity model
  TMSFNCCloudAI1.Settings.PerplexityModel := 'sonar-pro';
  // use the Perplexity service
  TMSFNCCloudAI1.Service := aiPerplexity;
  TMSFNCCloudAI1.Context.Text := 'list the last 5 presidents of the USA';
  TMSFNCCloudAI1.Execute('abc');
end;

and I could not see an issue.

Thank you, Bruno.

That indeed works, thank you. However, it still doesn't work in the shipped demo.
A lot of people (like me) start by exploring the demo, and if the demo gives bad results, they won't explore any further.
It never occurred to me that the demo itself wouldn't work for some reason, while creating a new project with the same component does.

We will ensure that with the next update the default Perplexity model is updated, including in the demos.
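
In the meantime, a possible workaround for the shipped demo is to override the outdated default model in code before a request is executed. This is just a sketch: the form and event names are placeholders, and it assumes the demo's component is reachable as TMSFNCCloudAI1 with the same property names as in the snippet above, and that 'sonar' is an accepted model name per the Perplexity documentation referenced in the error message.

// Sketch of an interim workaround: override the outdated default model at startup.
// TForm1/FormCreate are placeholder names; only the TMSFNCCloudAI1 property
// names already used in this thread are assumed.
procedure TForm1.FormCreate(Sender: TObject);
begin
  TMSFNCCloudAI1.Settings.PerplexityModel := 'sonar';
end;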
