Hi,
I use a TAureliusDataset with fields created at run time because I need a few calculated fields. When I create the fields in the TAureliusDataset.BeforeOpen event, a SetSourceCriteria call made earlier no longer works for this dataset: even though the criteria returns a number of objects (records), the dataset remains empty when opened.
A workaround for this is to use SetSourceList instead of SetSourceCriteria (with the same criteria).
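For reference, the workaround in code (a sketch only: TCustomer is a hypothetical entity, Manager a hypothetical active TObjectManager, and the Linq filter a placeholder):

```pascal
// Instead of handing the criteria itself to the dataset via
// SetSourceCriteria, execute the same criteria first and hand
// over the resulting object list:
oAureliusDataset.Close;
oAureliusDataset.SetSourceList(
  Manager.Find<TCustomer>
    .Where(Linq['Name'] = 'Smith')  // any criteria with a result will do
    .List);                         // the query runs here, before Open,
                                    // so closing inside BeforeOpen cannot
                                    // discard a pending criteria
oAureliusDataset.Open;
```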
Can someone confirm this is a bug in Aurelius?
Logic flow:
oAureliusDataset.Active := false;
oAureliusDataset.SetSourceCriteria(..any criteria with a result will do..);
oAureliusDataset.Active := true;
Now, before the dataset is active the BeforeOpen event fires:
// Create the data fields.
oAureliusDataset.FieldDefs.Update;
TUnprotectedDataset(oAureliusDataset).CreateFields;
// Create a calculated field.
oField := TStringField.Create(oAureliusDataset);
oField.Name := 'fldMyFieldName';
oField.FieldName := 'MyFieldName';
oField.FieldKind := fkCalculated; // without this, the field defaults to fkData
oField.DataSet := oAureliusDataset;
After this the dataset is open but empty (RecordCount = 0, the fields are valid).
If using SetSourceList instead of SetSourceCriteria the dataset contains data.
Scott
Calling FieldDefs.Update will in turn call Close, which destroys the internally kept SourceCriteria. That is how it works: keep in mind that closing the dataset destroys the criteria; it is not persistent.
This design implies that the following combination is not possible:
- TAureliusDataset
- Data fields combined with calculated fields.
- A large result set that needs to be limited by a Criteria (with a given maximum)
It's true that FieldDefs.Update will eventually call Close, but if you look closely at my example you'll see that Update is called from the BeforeOpen event. That means the dataset is not yet open, and thus not really being closed either. Therefore TAureliusDataset destroys the criteria for no reason.
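Until that changes, another application-side option would be to re-apply the criteria at the end of BeforeOpen itself, after FieldDefs.Update has triggered the internal Close. This is an untested sketch: it assumes SetSourceCriteria may safely be called from within BeforeOpen, and BuildCriteria is a hypothetical function that recreates the criteria from scratch:

```pascal
procedure TMyForm.AureliusDatasetBeforeOpen(DataSet: TDataSet);
begin
  // FieldDefs.Update triggers an internal Close, which frees the
  // criteria that was set before Open was called.
  oAureliusDataset.FieldDefs.Update;
  TUnprotectedDataset(oAureliusDataset).CreateFields;
  // ... create the calculated fields here as before ...

  // Re-apply a freshly built criteria now that the internal Close has run.
  oAureliusDataset.SetSourceCriteria(BuildCriteria);
end;
```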
This change to TAureliusDataset fixes the problem (the criteria is now only freed when the dataset was actually open):
procedure TBaseAureliusDataset.InternalClose;
begin
  if FIsOpen then
  begin
    FIsOpen := false;
    FSourceCursor := nil;
    // Only discard the criteria if the dataset really was open.
    if FSourceCriteria <> nil then
    begin
      FSourceCriteria.Free;
      FSourceCriteria := nil;
    end;
    BindFields(false);
    {$IFDEF DELPHIXE6_LVL}
    DestroyFields;
    {$ELSE}
    if DefaultFields then
      DestroyFields;
    {$ENDIF}
    // Free internal buffers
    FCurrent := -1;
  end;
end;
TAureliusDataset.Destroy should also be updated to prevent memory leaks:
if FSourceCriteria <> nil then
begin
  FSourceCriteria.Free;
  FSourceCriteria := nil;
end;
or the shorter version:
FreeAndNil(FSourceCriteria);
Scott
Correction: TBaseAureliusDataset.Destroy
Thanks Wagner, but no need for a patch here, the current workaround is okay for now.
I agree that the best solution would be to keep the criteria persistent between dataset close and open actions. That's what I expected in the first place. Compare it with a TQuery: the SQL statement is never cleared when the query is closed, and the SQL is re-executed when the query is opened again.
Will this be updated in a future version, or would it break current implementations?
A "PersistentCriteria" property could be a solution for this.
The main difference is that SourceCriteria cannot be serialized (for now), and it's not a "representation" (like SQL) but an object with state, pointers, etc., so it could be more dangerous to keep it alive between dataset open/close.
Theoretically more dangerous, yes. Using interface pointers instead of object pointers would eliminate this risk, but that is hard to do if you depend on generics. The main problem is the ObjectManager being destroyed while the criteria is still alive, I guess, but that is also a risk with active Aurelius datasets in the current implementation.
A notifier pattern for the ObjectManager could be used to signal dependents (like TCriteria) that it is being destroyed.
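A minimal sketch of such a notifier (all names hypothetical, not part of Aurelius; Delphi's TComponent.FreeNotification works on the same principle):

```pascal
type
  // Hypothetical listener interface: anything holding a reference to the
  // manager implements this and nils its reference when signalled.
  IManagerDestroyListener = interface
    procedure ManagerDestroying;
  end;

  // Hypothetical manager that tracks its dependents.
  TNotifyingObjectManager = class
  private
    FListeners: TList<IManagerDestroyListener>;
  public
    constructor Create;
    destructor Destroy; override;
    procedure Subscribe(const AListener: IManagerDestroyListener);
  end;

destructor TNotifyingObjectManager.Destroy;
var
  Listener: IManagerDestroyListener;
begin
  // Signal every dependent (e.g. a live TCriteria) that the manager is
  // going away, so it can nil its reference instead of dangling.
  for Listener in FListeners do
    Listener.ManagerDestroying;
  FListeners.Free;
  inherited;
end;
```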
My vote for serializing TCriteria.