Regeneration touching all files, very large project getting slower

happyfirst
User
Posts: 215
Joined: 28-Nov-2008
# Posted on: 07-Aug-2017 15:17:51   

LLBL 4.2 - adapter - sql server - Database 1st

My project has gotten very large: 828 entities and 501 typed views (that I don't want). I've noticed that after a catalog refresh and code regeneration (done from a command-line batch file), my Visual Studio instance (which is open when I do this) bogs down my PC (pegged at 100%) for several minutes, and I have a very fast overclocked machine, SSDs, etc.

I have 'emit timestamps' turned off, yet I notice the timestamp on ALL the files changes even though the contents themselves did not change and version control doesn't see a change.

My guess is VS is detecting a file change and reprocessing all the files behind the scenes. I can't tell if it's bogged down because it's busy checking all the files, or just because a few key but very large files (like FieldCreationClasses, FieldInfoProvider, etc.) have changed.

1) Is there any option I can turn on to have it be smarter and not keep changing those files timestamps even though they haven't changed?

2) Any options to get those very large files to be broken up into individual files?

3) I am using the command line to regenerate, and I want it to take everything from my db. If I add a new table, view, etc., it is auto-mapped, which is what I want. But I've also ended up with a bunch of typed views I don't want (they all end up with a 1 on the end of their name). Is there any way to get it to auto-map new tables and views (both just as entities), but not have it also generate typed views?

Thanks!

Otis
LLBLGen Pro Team
Posts: 39590
Joined: 17-Aug-2003
# Posted on: 07-Aug-2017 17:39:54   

happyfirst wrote:

LLBL 4.2 - adapter - sql server - Database 1st

My project has gotten very large: 828 entities and 501 typed views (that I don't want). I've noticed that after a catalog refresh and code regeneration (done from a command-line batch file), my Visual Studio instance (which is open when I do this) bogs down my PC (pegged at 100%) for several minutes, and I have a very fast overclocked machine, SSDs, etc.

I have 'emit timestamps' turned off, yet I notice the timestamp on ALL the files changes even though the contents themselves did not change and version control doesn't see a change.

My guess is VS is detecting a file change and reprocessing all the files behind the scenes. I can't tell if it's bogged down because it's busy checking all the files, or just because a few key but very large files (like FieldCreationClasses, FieldInfoProvider, etc.) have changed.

1) Is there any option I can turn on to have it be smarter and not keep changing those files timestamps even though they haven't changed?

It simply overwrites the files, so they appear newer (but the contents don't change). If you want the generator to ignore files which are already there, you can do that in the preset you're using: add

<parameter name="failWhenExistent" value="true" />

to the <parameters> element of the task which generates the files, e.g. the task which generates the entity classes.

This will make the code generator skip a file if it already exists. There's a catch: you shouldn't do this for all tasks, as some files do change when you e.g. add a new entity, like the PersistenceInfoProvider class.
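As an illustration only, such a preset fragment could look roughly like the sketch below. The task name and the surrounding element structure are assumptions (presets differ per version, so check your own preset file for the exact names); only the failWhenExistent parameter line comes from the answer above.

```xml
<!-- Hypothetical preset fragment: the task name here is an assumption,
     only the failWhenExistent parameter line is from the answer above. -->
<task name="SD.Tasks.Generic.EntityClassGenerator">
  <parameters>
    <!-- skip generating this file when it already exists on disk -->
    <parameter name="failWhenExistent" value="true" />
  </parameters>
</task>
```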

2) Any options to get those very large files to be broken up into individual files?

You mean the info provider classes? No, sorry. 800+ entities is rather large though, so expect some overhead from that. If, however, your model has several disconnected submodels (or submodels which use an entity E only in a read-only fashion, so you can copy it), you can consider cutting it up into groups in the designer and changing the project setting for group usage to 'As separate projects'. That will generate a separate project per group, and those projects are smaller. I doubt your 800+ entity model is one connected graph; that's almost never the case.

3) I am using the command line to regenerate, and I want it to take everything from my db. If I add a new table, view, etc., it is auto-mapped, which is what I want. But I've also ended up with a bunch of typed views I don't want (they all end up with a 1 on the end of their name). Is there any way to get it to auto-map new tables and views (both just as entities), but not have it also generate typed views? Thanks!

That option isn't present, sorry. We have the option 'Add new elements after relational model data sync' (sync is called refresh in v4.2), which adds new elements like tables and typed views as new model elements, and there's a setting 'Add new views as entities after relational model data sync', which partly covers what you want.

If you switch off the 'add new elements after...' setting and enable the 'add new views as entities...' setting, you partly get what you want, and you can add new tables in the designer as entities by right-clicking the database node in catalog explorer and selecting 'reverse engineer tables to entities'. The dialog that pops up splits tables which are already mapped from tables which aren't, in two different groups, so you can easily add the new ones without getting duplicates.

I know it's not on the command line, but IMHO it's easier than dealing with 500+ mapped elements you don't want.

Frans Bouma | Lead developer LLBLGen Pro
happyfirst
User
Posts: 215
Joined: 28-Nov-2008
# Posted on: 10-Aug-2017 16:28:49   

Thanks, that was what I was worried about.

I guess I have a few ideas I'll have to find time to experiment with. I really like how the command-line batch currently does everything I need, but unfortunately it gives me all that excess baggage with the typed views.

I had thought about the groups. It is not one massive connected graph. But even then, I would often still be regenerating multiple projects, and I don't want to have to think about which project or projects to regen.

I will have to experiment with your suggested settings changes. I may also write a simple program to remove the TypedViewDefinitions and mappings from the llblgen file before executing the code generator. It would be nice to be able to better control this in a future release.
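For what it's worth, the pre-generation cleanup idea above could be sketched roughly like this. The element name matched here ("TypedViewDefinitions") is an assumption about the .llblgenproj XML, not verified against the real schema, and a real project file may also use XML namespaces, so inspect your own file and adjust:

```python
# Sketch only: strip assumed "TypedViewDefinitions" elements from the
# project XML before running the code generator. Element names are
# guesses; a real .llblgenproj file may use namespaces and different tags.
import xml.etree.ElementTree as ET

def strip_elements(xml_text, names):
    """Return the XML with every element whose tag is in `names` removed."""
    root = ET.fromstring(xml_text)
    # Snapshot the tree first, then remove matching children per parent.
    for parent in list(root.iter()):
        for child in [c for c in parent if c.tag in names]:
            parent.remove(child)
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    sample = (
        "<Project>"
        "<EntityDefinitions><Entity name='Order'/></EntityDefinitions>"
        "<TypedViewDefinitions><TypedView name='InvoiceView1'/></TypedViewDefinitions>"
        "</Project>"
    )
    # Prints the project XML with the typed-view block removed.
    print(strip_elements(sample, {"TypedViewDefinitions"}))
```

In a real script you would read the .llblgenproj from disk, write the cleaned copy back (or to a temp file), and point the command-line generator at it.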

I want to try to figure out what is killing VS: all the timestamps changing, or just those few key but very large files?

Otis
LLBLGen Pro Team
Posts: 39590
Joined: 17-Aug-2003
# Posted on: 10-Aug-2017 17:22:28   

My guess would be the files which all changed: VS sees them as modified and tries to reload them all. Other than that, if you want to check it out, you'd need to run a profiler on DevEnv.exe, but that will likely take forever to complete...

Frans Bouma | Lead developer LLBLGen Pro