In TMyEntity, I have a field Document of type TBlob. I am displaying TMyEntity objects in a grid. Via a button I want to display/load the blob content (a PDF file) in a dialog box. The dialog box contains a pdf viewer (pdfvDocument).
When I execute the following code:
procedure TfrmExpenses.expensesDocumentExecute(Sender: TObject);
var
  LMyEntity: TMyEntity;
  dlgDoc: TdlgDocument;
  msPdf, msSave: TMemoryStream;
begin
  inherited;
  dlgDoc := TdlgDocument.Create(Self);
  msPdf := TMemoryStream.Create;
  try
    LMyEntity := adsMyEntities.Current<TMyEntity>; // adsMyEntities is a TAureliusDataset; Current<T> already returns TMyEntity
    LMyEntity.Document.SaveToStream(msPdf); // THIS RAISES THE EXCEPTION
    // adsMyEntitiesDocument.SaveToStream(msPdf);
    msPdf.Position := 0; // rewind so the viewer reads from the start
    dlgDoc.pdfvDocument.LoadFromStream(msPdf);
    if dlgDoc.ShowModal = mrOK then
    begin
      msSave := TMemoryStream.Create;
      try
        dlgDoc.pdfvDocument.Document.SaveToStream(msSave);
        msSave.Position := 0; // rewind before loading back into the blob
        LMyEntity.Document.LoadFromStream(msSave);
        // adsMyEntitiesDocument.LoadFromStream(msSave);
      finally
        msSave.Free;
      end;
    end;
  finally
    msPdf.Free;
    dlgDoc.Free;
  end;
end;
I get the error message "Could not retrieve aliased column name in criteria".
I am puzzled by this, as I don't understand which criteria it refers to. The TAureliusDataset is simply linked to a TDataSetField of another TAureliusDataset (i.e. it's a detail/child table).
What am I overlooking or doing wrong? How do I save to / load from the viewer via a TStream?
I believe the error happens because the blob is lazy-loaded; thus it is the SQL statement that tries to retrieve the blob content which fails.
If you have a small project reproducing the issue, please send it to us so we can run it and debug at our side.
Alternatively, you can try to use the OnSqlExecuting event to get the exact SQL statement that Aurelius is trying to execute, so we can see the SQL and guess what might be wrong with it.
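A minimal logging hook could look like the sketch below. Note that the type and member names (`TMappingExplorer.Default.Events`, `OnSqlExecuting`, `TSqlExecutingArgs`) are assumptions based on my recollection of the Aurelius events API; please verify them against the documentation for your version.

```pascal
uses
  Winapi.Windows, Aurelius.Mapping.Explorer;

procedure HookSqlLogging;
begin
  // Subscribe to the "SQL executing" event so every statement Aurelius
  // is about to run is written to the debug output before execution.
  TMappingExplorer.Default.Events.OnSqlExecuting.Subscribe(
    procedure(Args: TSqlExecutingArgs)
    begin
      OutputDebugString(PChar(Args.Sql));
    end);
end;
```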
Allow me some time to build a small example project.
I have tried to log the SQL statements, but they all appear OK. As expected, none of the statements contain 'Document', so the BLOB field is indeed lazy-loaded. However, when executing the code above (streaming the blob to the PDF viewer, i.e. attempting to read the blob), the exception is raised but no SQL is logged; it does not seem to get to the point where it would load the blob content.
This application is built with Interbase 2020. Can the example project be with Interbase (I'll include DML statements) or do you prefer SQLite?
I built a test application (attached) and, sure enough... it works.
I've attached the project and sample DB (with some data) for you to review (FWIW). In my real application, the exception is raised when attempting to write to the stream.
Any suggestions as to what may cause the "cryptic error message" (some hints on where to look first)?
Otherwise I'll start redefining the dataset fields, perhaps something went wrong there....
Before I check your project, can you please send the call stack at the time of the exception you are getting? I'm suspecting this might not be coming from the blob loading itself.
For the record: the sample project does NOT raise an exception when I write to the stream. My real application (using InterBase 2020) raises an exception as soon as I "touch" the blob field. At first I thought this happened only when attempting to write to a stream, but even something as simple as if MyEntity.BlobField.IsNull raises the exception as well...
Two screenshots attached: one of the call stack and one of the procedure that raises the exception (an assert failure).
I just noticed something. In my entity I modified (via data modeler script) some of the property getters and setters. The entity is also used with a filter active. See attached definition of the entity (txt file). Filter 'BookingExpenses' is active when the problem code is executed. Also see 7th line from top in call stack....
Associations are not supported in global filters. Does the error still happen when the filter is disabled? If not, then the problem is that you are using the unsupported Ledger.Ledger construction.
I cannot test until tomorrow, but will report back ASAP. Please note that I have run the application (personal use only) without any issue using the FilterDef "as is". The problem only started when touching the blob field (while adding a feature).
Not sure how to use the {$alias...} construct. Can you give a slightly more detailed example so I can try that as well? TIA!
The {$alias} macro just adds the table alias used by Aurelius in the SQL statement. The code snippet that you pass in the FilterDef attribute is raw SQL that is added to the SQL statement generated by Aurelius. If you only provide a plain condition like this:
[FilterDef('BookingRevenue', 'LedgerId < 1000')]
you might have problems if joined tables also have a LedgerId column: the RDBMS will complain that the column reference is ambiguous.
Thus, you use the {$alias} macro to tell Aurelius that it should prefix the column with the alias of the table associated with the entity being filtered:
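Applied to the earlier snippet, the filter definition would become:

```pascal
[FilterDef('BookingRevenue', '{$alias}.LedgerId < 1000')]
```

At statement-generation time, {$alias} is replaced by whatever alias Aurelius assigned to the entity's table (e.g. A), so the condition is emitted as A.LedgerId < 1000 and is no longer ambiguous.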
I get the exception 'Column unknown. A.LEDGER'. Perhaps I need to place the { and/or } differently? Please add something to the online documentation, as to me this was a pretty obscure item; I couldn't find anything other than the "What's New" entry...
In the meantime I thought of a workaround for my streaming problem. I am not showing anything of the blob field (storing a PDF) in my form, it simply has to be streamed to a modal dialog.
So I get the ID of the current entity in the dataset (grid) and create a new ObjectManager with no filter enabled. With this ObjectManager, I retrieve the entity for that ID, stream the blob, and flush the entity when done. The ObjectManager is apparently lightweight enough to do this without a noticeable delay.
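A sketch of that workaround, assuming an IDBConnection instance named Connection and an integer primary key (both are placeholders for your actual setup):

```pascal
procedure LoadDocumentToStream(Connection: IDBConnection; AId: Integer;
  AStream: TStream);
var
  LManager: TObjectManager;
  LEntity: TMyEntity;
begin
  // Temporary manager: no global filter is enabled on it, so the blob
  // can be lazy-loaded without involving the filter's SQL condition.
  LManager := TObjectManager.Create(Connection);
  try
    LEntity := LManager.Find<TMyEntity>(AId);
    if (LEntity <> nil) and not LEntity.Document.IsNull then
    begin
      LEntity.Document.SaveToStream(AStream);
      AStream.Position := 0; // rewind for the caller
    end;
  finally
    LManager.Free; // also frees the entities it owns
  end;
end;
```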
Now on to solving issues streaming to/from the (DevExpress) PDFViewer.... :-(
My bad. I was mixing up entity and table. The Booking entity has a TLedger property, whilst this is referenced by the LedgerId field in the Booking table. Using LedgerId indeed works without error.
However, this does not work for me, as I don't need filtering on the LedgerId field (the record ID) but on the Ledger field. The Ledger field holds the name (or code) of the ledger as displayed to the user. The table looks something like:
ID  Ledger  Description
1   1000    Some ledger
2   1012    Some other ledger
I hope this makes sense. I now understand the issue and will continue using the workaround with the temp Object Manager. It's clean, simple and it works.