Large file upload over 2 GB

Hi,
Currently I use FTP to allow users to upload large files to the server, but this requires extra management and is not well integrated with the XData server. Can XData handle a very large file in a single chunk?

Thank you in advance for your suggestions on this matter.

I'd have thought it could, but it would probably depend on the memory available on the server.

I have found some JS libraries that break a large file into chunks and upload them, but this again needs some server-side changes. I also believe the file will be enlarged in transit due to encoding.
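The encoding overhead mentioned above is easy to quantify: base64, which JSON payloads typically use for binary data, encodes every 3 input bytes as 4 output characters, so the transferred size grows by roughly a third. A quick check in plain Node.js (sizes illustrative):

```javascript
// Base64 maps each 3-byte group to 4 characters, so the encoded size is
// ceil(n / 3) * 4 -- about 33% larger than the raw bytes.
const raw = Buffer.alloc(3 * 1024 * 1024);   // 3 MiB of zeroes
const encoded = raw.toString('base64');
console.log(encoded.length / raw.length);    // → 1.3333333333333333
```

Sending raw chunk bodies (or multipart/form-data) avoids that overhead entirely.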

I like this approach: break the file into chunks and have the XData server reassemble them. One benefit is that you then have the opportunity to provide a better progress indicator as you upload the chunks, and you can also add some fault tolerance or pause/resume/restart functionality.
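The chunking approach above can be sketched in a few lines of JavaScript; `chunkRanges` and the 15 MB chunk size are illustrative names, not part of any library:

```javascript
// Split a file of totalSize bytes into fixed-size byte ranges so each
// chunk can be uploaded, retried, or resumed independently.
function chunkRanges(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    ranges.push({ start, end: Math.min(start + chunkSize, totalSize) });
  }
  return ranges;
}

// A 2 GB file in 15 MB chunks yields 137 ranges; in the browser each
// range maps to file.slice(start, end), and a progress bar can advance
// one notch per completed chunk.
```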

I found this: GitHub - simple-uploader/Uploader, a JavaScript library providing multiple simultaneous, stable, fault-tolerant and resumable/restartable file uploads via the HTML5 File API.
I will implement the server side and share the results.

Another option is to use the standard multipart/form-data MIME type. XData (more specifically, Sparkle) can receive requests with that content type, and it processes them in chunks to avoid using too much server memory:

https://doc.tmssoftware.com/biz/sparkle/guide/server.html#handling-multipart-content
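On the client side, sending a file with that content type needs no extra library; the `/upload` endpoint name below is an assumption for illustration:

```javascript
// Post a file as multipart/form-data. FormData is a standard browser
// (and Node 18+) global; the field name defaults to 'file'.
function buildForm(file, fieldName) {
  const form = new FormData();
  form.append(fieldName || 'file', file);
  return form;
}

// Usage in the browser (endpoint name is hypothetical); the browser sets
// the Content-Type header, including the multipart boundary, by itself:
// fetch('/upload', { method: 'POST', body: buildForm(input.files[0]) });
```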

Thanks for the Sparkle multipart link.

I found a more popular resumable, multipart uploader:

Attached is ResumableUpload.pas, a Sparkle unit that works with the resumable.js library.
This is how I use it from my XData service:

  [ServiceContract]
  IResumableUploadService = interface(IInvokable)
    ['{BE031A10-F2CD-459C-A977-DC56353CCC50}']

    //[Authorize]
    [HttpGet]
    function TestChunk(ResumableIdentifier, ResumableFilename: string; ResumableChunkNumber: Integer): string;

    //[Authorize]
    function UploadChunk(Value: TStream): string;
  end;

function TResumableUploadService.TestChunk(ResumableIdentifier, ResumableFilename: string; ResumableChunkNumber: Integer): string;
var
  UploadStatus: TResumableUploadStatus;
begin
  UploadStatus := ResumableUpload.TestChunk(ResumableIdentifier, ResumableFilename, ResumableChunkNumber);
  if UploadStatus.StatusCode <> 200 then
    raise EXDataHttpException.Create(UploadStatus.StatusCode, UploadStatus.StatusReason);
  Result := '';  // success: resumable.js only looks at the 200 status code
end;

function TResumableUploadService.UploadChunk(Value: TStream): string;
var
  ContentType: string;
  UploadStatus: TResumableUploadStatus;
begin
  ContentType := Context.Request.Headers.Get('content-type');
  UploadStatus := ResumableUpload.UploadChunk(Value, ContentType, 'C:\Users\test\Downloads', True);
  if UploadStatus.StatusCode <> 200 then
    raise EXDataHttpException.Create(UploadStatus.StatusCode, UploadStatus.StatusReason);
  Result := '';  // success: resumable.js only looks at the 200 status code
end;
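For reference, when `testChunks` is enabled resumable.js probes each chunk with a GET request whose query-string parameter names match the `TestChunk` parameters above, which is how XData binds them. A sketch of such a URL (the helper name is illustrative, and resumable.js sends a few more parameters than shown):

```javascript
// Build the kind of test URL resumable.js issues before (re)sending a chunk.
function testChunkUrl(base, chunk) {
  const params = new URLSearchParams({
    resumableIdentifier: chunk.identifier,
    resumableFilename: chunk.filename,
    resumableChunkNumber: String(chunk.number)
  });
  return base + '?' + params.toString();
}

// testChunkUrl('/TestChunk', { identifier: 'abc', filename: 'big.iso', number: 3 })
// → '/TestChunk?resumableIdentifier=abc&resumableFilename=big.iso&resumableChunkNumber=3'
```

A 200 response tells the client the chunk is already on the server and can be skipped; any other status makes it upload the chunk.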

This is my test.html, modified from the resumable.js example:

<script src="resumable.js"></script>
<script>
var r = new Resumable({
  chunkSize: 15*1024*1024,      // 15 MB chunks
  target: '/UploadChunk',        // XData UploadChunk endpoint
  testTarget: '/TestChunk',      // XData TestChunk endpoint
  simultaneousUploads: 4
});
// Wire the uploader to the page and start on file selection;
// 'browseButton' is whatever element your page uses for picking files.
r.assignBrowse(document.getElementById('browseButton'));
r.on('fileAdded', function(file) { r.upload(); });
</script>

ResumableUpload.pas (5.8 KB)
