I'm using a nested TAureliusDataset with ParentManager = false.
The master dataset is "loaded" via SetSourceCriteria.
After I call master.Edit, I insert/post some records in the detail dataset. If the user wants to cancel, my code calls master.Cancel. No objects are persisted to the details table and no master object changes are persisted, which is fine.
But if I access the same master entity object again, the previously inserted (and supposedly cancelled) details are there!
Must I manually delete and/or evict those unwanted details? I thought the nested structure would take care of this. Is it by design?
What am I doing wrong? Thanks.
It is by design: the detail dataset adds/modifies objects directly in the detail list, not in a temporary buffer. A Cancel in the master dataset only discards changes held in the dataset's own temporary buffers.
You will indeed have to revert the changes manually. One option is to refresh or reload the object so that its data is read again from the database.
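One way to do that reload, sketched in Delphi. This is an illustration, not code from the thread: TCustomer, FCustomer and FCustomerId are hypothetical names, and you should check that Evict and Find<T> behave this way in your Aurelius version.

```delphi
// After master.Cancel, the cancelled detail objects are still attached
// to the master entity held in the manager's cache. One option is to
// detach the stale instance and fetch a fresh copy from the database.
MasterDataset.Close;
Manager.Evict(FCustomer);                          // detach stale instance from the manager
FCustomer := Manager.Find<TCustomer>(FCustomerId); // reload current state from the database
```

Note that Evict only detaches the object from the manager; it does not destroy it, so the unsaved detail instances still need their lifetime handled somewhere.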
I close the master dataset; then when I run SetSourceCriteria searching for the same entity ID and reopen it, all the details are there: the ones previously saved, plus the new ones I thought were discarded. It seems they are held by the manager and stay associated with the master object. I can even Post in the master dataset, and all these details get persisted.
I'll try using .Refreshing in this source criteria. It may solve it, but I wonder whether those detail objects could remain lost/orphaned in the manager.
Yes, you have to use .Refreshing to make sure everything is updated; otherwise the data in the manager cache is kept. But note that Refreshing also has limitations when it comes to lists.
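A sketch of reopening the master with a refreshing criteria. Again illustrative: TCustomer, the 'Id' property name and FCustomerId are assumptions, and Linq comes from the Aurelius.Criteria.Linq unit.

```delphi
MasterDataset.Close;
MasterDataset.SetSourceCriteria(
  Manager.Find<TCustomer>
    .Refreshing                        // force cached objects to be re-read from the database
    .Where(Linq['Id'] = FCustomerId));
MasterDataset.Open;
```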
In this specific case it's safer to handle everything manually. The objects that were created but never persisted should also be destroyed. You can call Manager.AddOwnership when you create an object to make sure the manager will eventually destroy it, even if it is never persisted.
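A minimal sketch of that pattern, assuming a hypothetical TInvoiceItem detail class and an Invoices list on the master entity (names are illustrative, not from the thread):

```delphi
// Create the detail object and immediately hand its lifetime to the
// manager, so it is destroyed when the manager is, even if never saved.
Detail := TInvoiceItem.Create;
Manager.AddOwnership(Detail);
FCustomer.Invoices.Add(Detail);

// If the user cancels, revert the list manually; the object itself
// will still be freed by the manager thanks to AddOwnership.
FCustomer.Invoices.Remove(Detail);
```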