Is there any way to use/chat with the "Projects" created with Claude Desktop?
When adding documents with TMSMCPCloudAI1.UploadFile(), I reach a limit that does not exist in Claude Projects.
At the moment there is no such built-in option, but we will investigate whether we can accomplish this via the REST API that TMS AI Studio uses.
So far, we have found no evidence that projects created with the Claude web interface can be accessed via the REST API.
The specs we found on Claude file usage are:
File Size and Storage Limits
- Individual file size: maximum 500 MB per file
- Total storage: 100 GB per organization
- Request size: maximum 32 MB for standard API endpoints (this applies to the entire request payload, including any content sent alongside files)
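As a rough illustration (not part of TMS AI Studio; the constants simply mirror the numbers above), a client-side check like this can tell whether a document fits the per-file limit and whether it would even be small enough to embed directly in a single request:

```pascal
uses
  System.SysUtils, System.IOUtils;

const
  // Limits quoted above from Anthropic's Files API documentation
  MaxFileBytes    = 500 * 1024 * 1024; // 500 MB per uploaded file
  MaxRequestBytes = 32 * 1024 * 1024;  // 32 MB per standard API request

// Returns True when the file is small enough to be uploaded via the Files API;
// CanInline indicates whether it could instead be embedded directly in a single
// request (base64 encoding inflates the payload by roughly a third).
function FitsClaudeFileLimits(const AFilePath: string; out CanInline: Boolean): Boolean;
var
  Size: Int64;
begin
  Size := TFile.GetSize(AFilePath);
  Result := Size <= MaxFileBytes;
  CanInline := (Size * 4 div 3) < MaxRequestBytes;
end;
```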
Supported File Types
The Files API supports different file types for different purposes:
| File Type | MIME Type | Content Block Type | Use Case |
|---|---|---|---|
| PDF | application/pdf | document | Text analysis, document processing |
| Plain Text | text/plain | document | Text analysis, processing |
| Images | image/jpeg, image/png, image/gif, image/webp | image | Image analysis, visual tasks |
| Datasets & others | Various | container_upload | Data analysis for code execution tool |
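For reference, uploading through the Files API boils down to a single multipart POST. The sketch below shows that call directly with Delphi's THTTPClient; it only illustrates the underlying REST call (the endpoint, header names, and the files-api beta flag are taken from Anthropic's public documentation as we understand it), it is not the TMS AI Studio implementation:

```pascal
uses
  System.SysUtils, System.Net.HttpClient, System.Net.Mime, System.Net.URLClient;

// Illustrative only: a bare multipart POST to the Files API.
// The returned JSON contains an "id" ("file_...") that later messages
// can reference instead of re-sending the document bytes.
function UploadToAnthropicFiles(const AApiKey, AFilePath: string): string;
var
  Client: THTTPClient;
  Form: TMultipartFormData;
  Headers: TNetHeaders;
  Resp: IHTTPResponse;
begin
  Client := THTTPClient.Create;
  Form := TMultipartFormData.Create;
  try
    Form.AddFile('file', AFilePath);
    Headers := [
      TNetHeader.Create('x-api-key', AApiKey),
      TNetHeader.Create('anthropic-version', '2023-06-01'),
      // the Files API currently sits behind a beta header (assumption from the docs)
      TNetHeader.Create('anthropic-beta', 'files-api-2025-04-14')];
    Resp := Client.Post('https://api.anthropic.com/v1/files', Form, nil, Headers);
    Result := Resp.ContentAsString;
  finally
    Form.Free;
    Client.Free;
  end;
end;
```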
What exact limit did you run into?
I have a Pro license with Claude.ai. Yesterday, every time I started a small chat about an already-uploaded document, I received a message that the token limit had been exceeded. This doesn't happen in the Claude desktop application within a project, so I suspect the document size is being counted against the chat; the chat alone can't be the cause.
When you use TTMSMCPCloudAI.AddFile(), the file content counts toward the token size of the conversation, but TTMSMCPCloudAI.UploadFile() should keep the document outside the token size calculation and only take a reference (ID) to the document into account.
So, if you use UploadFile() and a prompt, I'm not sure why you'd get a token size limit issue.
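To make the distinction concrete, here is a rough sketch of the two attachment styles at the REST level. The JSON shapes follow Anthropic's public Messages/Files API documentation as we read it, and the helper names are made up for illustration; this is not how the TMS components are implemented internally:

```pascal
uses
  System.SysUtils, System.IOUtils, System.NetEncoding, System.JSON;

// AddFile-style: the whole PDF is base64-encoded into the "document" content
// block, so it travels with every request and counts against the 32 MB
// request payload limit.
function InlineDocumentBlock(const AFilePath: string): TJSONObject;
var
  Data: string;
  Source: TJSONObject;
begin
  Data := TNetEncoding.Base64.EncodeBytesToString(TFile.ReadAllBytes(AFilePath));
  // TNetEncoding.Base64 wraps its output in lines; the API expects one unbroken string
  Data := StringReplace(Data, #13, '', [rfReplaceAll]);
  Data := StringReplace(Data, #10, '', [rfReplaceAll]);
  Source := TJSONObject.Create;
  Source.AddPair('type', 'base64');
  Source.AddPair('media_type', 'application/pdf');
  Source.AddPair('data', Data);
  Result := TJSONObject.Create;
  Result.AddPair('type', 'document');
  Result.AddPair('source', Source);
end;

// UploadFile-style: only the file_id returned by the Files API travels with
// the request; the document bytes stay on Anthropic's side.
function FileReferenceDocumentBlock(const AFileId: string): TJSONObject;
var
  Source: TJSONObject;
begin
  Source := TJSONObject.Create;
  Source.AddPair('type', 'file');
  Source.AddPair('file_id', AFileId);
  Result := TJSONObject.Create;
  Result.AddPair('type', 'document');
  Result.AddPair('source', Source);
end;
```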
Yes, that's the way I use it.
I'm not sure then why Claude would complain about a token limit.