It would work for me if I could set a list of schemas in TDatabaseManager that should be checked.
TDatabaseManager would then only check objects that are declared in the given schemas.
For sure, I could.
But there could be tables that I don't know about and do not work with: plugins could create database objects within several schemas.
I will not have any classes for those tables, yet I would still like to set AllowDestructiveCommands to True.
That's why I could not accomplish this case with multi-model.
Sorry, I don't understand why multi-model won't help. Can you give a concrete example? You would just flag the models of each entity class, and then when using TDatabaseManager you tell it which models should be considered when updating the database. That's exactly what you are asking for.
The big difference is that I use AllowDestructiveCommands.
I'm already working with multi-model; I also do this by using my own MappingSetups.
While validating my database structure, I put all entities into a new "big" MappingExplorer, so that every entity is kept.
Otherwise, AllowDestructiveCommands would, for sure, remove objects that are not in the current MappingExplorer.
for hClass in hMapExt.Get('DummyTables').Hierarchy.Classes do
  hBigMapSetup.RegisterClass(hClass);
for hClass in hMapExt.Get('ExampleTables').Hierarchy.Classes do
  hBigMapSetup.RegisterClass(hClass);

hDbMan := nil;
hBigMapExp := TMappingExplorer.Create(hBigMapSetup);
try
  hDbMan := TDatabaseManager.Create(conn, hBigMapExp);
  // so that removed tables, columns and everything else are also dropped in the database
  hDbMan.AllowDestructiveCommands := True;
  hDbMan.UpdateDatabase;
finally
  hDbMan.Free;
  hBigMapExp.Free;
end;
My Database could contain:
ExampleTables.Example
DummyTables.Dummy
DummyTables.Dummy2
ThirdPartyPlugin.UnknownTable
ThirdPartyPlugin.UnknownTable2
Objects that are part of ThirdPartyPlugin should not be removed, but in the end, they do not come from our code or mapping at all...
That's the problem with AllowDestructiveCommands: if there is something in the database that is not in your mapping, it will be destroyed.
You should manage the destructive commands yourself. Aurelius can actually give you some help with that: for example, you can iterate through the destructive drop-table commands, check whether the table being dropped belongs to one of the schemas you want to deal with, and only execute it if it does.
This fits better with a more complex migration system than the feature Aurelius currently provides: one that does not stop at destructive commands, and where you might need data migration, field type changes, etc.
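A minimal sketch of that filtering idea. Note that the member names used here (a `Commands` list with `IsDestructive`, `Schema` and `SQL`, plus an `ExecuteStatement` helper) are placeholders for illustration, not the verified Aurelius API; check the actual TDatabaseManager interface for the real names.

```pascal
// Hypothetical sketch: execute only the destructive commands that target
// schemas we own; MatchText comes from System.StrUtils.
hDbMan.ValidateDatabase; // collect the statements without executing them
for hCommand in hDbMan.Commands do
begin
  if hCommand.IsDestructive
    and not MatchText(hCommand.Schema, ['DummyTables', 'ExampleTables']) then
    Continue; // leave third-party objects (e.g. ThirdPartyPlugin.*) untouched
  ExecuteStatement(hCommand.SQL); // hypothetical helper running SQL via conn
end;
```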
Ok, I got your point.
So I will keep using TDatabaseManager for validation and for getting the statements, but I will execute them myself.
For this it would be helpful if all the methods starting with "Action" were protected and virtual instead of private.
Then I could check against metadata instead of parsing strings...
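If those ActionXxx methods were virtual, a subclass could filter on metadata along these lines. This is only a sketch: `TSchemaAwareDatabaseManager` is hypothetical, and the exact Action method signatures and the `TTableMetadata.Schema` member would need to be checked against the Aurelius source.

```pascal
type
  // Hypothetical subclass, assuming the ActionXxx methods were made
  // protected and virtual as requested above.
  TSchemaAwareDatabaseManager = class(TDatabaseManager)
  protected
    procedure ActionDropTable(ATable: TTableMetadata); override;
  end;

procedure TSchemaAwareDatabaseManager.ActionDropTable(ATable: TTableMetadata);
begin
  // Only drop tables in the schemas we own (MatchText is in System.StrUtils);
  // objects from third-party plugins are simply skipped.
  if MatchText(ATable.Schema, ['DummyTables', 'ExampleTables']) then
    inherited;
end;
```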
While implementing this, I noticed that SequenceMetadata does not know the schema it belongs to...
So the executed sequence statements do not use any schema at all, which makes it hard to check whether they belong to my schema or model...
Is this a big change for you that needs a feature request, or could it also be done in this ticket?