I assume you are sending a TBlob value as a property of an object you are returning? In this case, yes, XData will base64-encode the full value. It's not suitable for big chunks of data, only small.
If you want to return big data, just create a service operation that returns a TStream.
The multipart reader can be used in XData; you can also just create a service operation that receives a TStream.
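For reference, a minimal sketch of such a service contract (IBackupService, DownloadBackup and UploadBackup are made-up names; the [ServiceContract] attribute and unit names follow TMS XData conventions, so check against your XData version):

```pascal
uses
  System.Classes, XData.Service.Common;

type
  [ServiceContract]
  IBackupService = interface(IInvokable)
    ['{8E8B2A50-6A41-4A7D-9B2C-3F0D1E5C7A21}']
    // Returned as a raw binary response body instead of a
    // base64-encoded JSON property.
    function DownloadBackup(const FileName: string): TStream;
    // Receives the raw request body as a stream.
    procedure UploadBackup(Content: TStream);
  end;
```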
No, my Result was of type TBlob. (But in other endpoints they're properties of objects.)
My exact case is creating and restoring Postgres backups.
My client sends a REST command so that the XData server creates a backup of the chosen databases and sends back one file (in the last step, I iterate through all Postgres backup files and put them into a 7z archive).
So I have several endpoints handling these big files. I have now changed all of them to TStream (creating a TMemoryStream in my implementing classes).
But now the XData client raises an error that a JSON converter could not be found for Pointer (in case the TStream is empty).
If the TStream is not created at all, it seems to work fine.
But these endpoints could not be used from Swagger then, could they?
How would CRUD endpoints behave in case of large TBlob contents? For example, storing big images in the database...
You are probably not declaring the methods correctly. Please provide more info so we can better understand what is happening.
How would CRUD Endpoints behave in case of large TBlob contents?
CRUD Endpoints are for records in databases. Do you have huge blob content in your database?
I mean, you have a record where a blob content is 950 MB?
That is a very unusual scenario. I would recommend you create specific service operation endpoints returning TStream.
Probably you are declaring the entity property as TStream instead of TBlob; that doesn't work.
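A minimal sketch of an entity declared the TBlob way (TPicture and its members are made-up names; the attributes and units follow TMS Aurelius conventions, so verify against your version):

```pascal
uses
  Aurelius.Mapping.Attributes, Aurelius.Types.Blob;

type
  [Entity, Automapping]
  [Id('FId', TIdGenerator.IdentityOrSequence)]
  TPicture = class
  private
    FId: Integer;
    FImage: TBlob;  // TBlob, not TStream, for entity properties
  public
    property Id: Integer read FId write FId;
    // A [Column(..., [TColumnProp.Lazy])] mapping could defer
    // loading the blob until it is actually accessed.
    property Image: TBlob read FImage write FImage;
  end;
```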
My endpoint has a function returning TStream...
The constructor creates a TMemoryStream,
the destructor frees it.
When this endpoint is called, the stream is copied into FStream of my implementing class.
But maybe no stream is fetched at all, so nothing is copied, and this leads to the JSON converter exception.
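One way to avoid that, sketched under the assumption that XData takes ownership of the returned stream (TBackupService and DownloadBackup are placeholder names matching the contract above, not the poster's actual code):

```pascal
function TBackupService.DownloadBackup(const FileName: string): TStream;
begin
  // Always return an assigned stream: an empty (but created) stream
  // yields an empty response body instead of a "converter not found
  // for Pointer" error on a nil result.
  Result := TMemoryStream.Create;
  try
    if FileExists(FileName) then
      TMemoryStream(Result).LoadFromFile(FileName);
  except
    Result.Free;
    raise;
  end;
end;
```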
Not now, and I don't want to, but it could happen in the far future; I just want to be prepared for what would happen.
That's what I thought. But it would be nice to be able to prohibit creating Base64 of my blob, because the data could get mixed up...
And for sure, Base64 is a lot easier for other developers to implement than chunks...
So maybe you could check whether the blob size is too big for Base64 before converting it, and send a default text like "Blob too large" in this property instead of raising an error.
```pascal
IBackupRestore = interface(IInvokable)
  function RestoreBackup(aBackup: TRestoreConfigList): TGUID;
end;
```
This is my class:
```pascal
TRestoreConfigList = class
private
  FBackupFile: TStream;
  function GetBackupFile: TStream;
public
  destructor Destroy; override;
  property BackupFile: TStream read GetBackupFile;
end;
```
The error occurs if the constructor creates FBackupFile: TStream, which might then stay empty.
As you can see from my definition, right now FBackupFile is created within GetBackupFile if it is not yet assigned.
This works fine, but I think it should also work if the TStream stays empty.
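A lazy-initializing getter matching that description might look like this (a sketch based on the class declaration above, not the poster's actual code):

```pascal
function TRestoreConfigList.GetBackupFile: TStream;
begin
  // Create on first access so the property never returns nil.
  if not Assigned(FBackupFile) then
    FBackupFile := TMemoryStream.Create;
  Result := FBackupFile;
end;

destructor TRestoreConfigList.Destroy;
begin
  FBackupFile.Free;  // Free is nil-safe in Delphi
  inherited;
end;
```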
TRestoreConfigList is not an entity (if you mean an Aurelius entity corresponding to a database table).
It is just a class holding some more information; I only posted part of it.
As you can imagine from its name, there is also a list holding some config information (at least switches for pg_restore for each database).