Copying an Entity

Carl
User
Posts: 7
Joined: 10-Mar-2006
# Posted on: 10-Mar-2006 17:29:01   

I am working for a client who uses LLBLGen Pro to create question-and-answer information for forms filled out online. If a user wants to go back and change the data in a form (after it has been validated and saved), then they must do this with a copy of the original.

There are a number of different forms (and therefore a number of different entities), of varying complexity. For example, a simple form has the (base) form table fields and the fields from another table related 1:1. A more complex form has, for example, 1:n applicants; each applicant has several addresses, each address relates to a country, and so on. Try to imagine the worst kinds of forms, with many questions inside many sections...

I need to be able to write a method that provides a duplicate of the original entity (form) for any entity. The new entity must create new records in the database (except where reference-data tables such as country are used).

I have seen previous posts about cloning fields, and field cloning works OK; setting Entity.IsNew = true then causes a save to perform a SQL INSERT. But I am struggling to find a suitable strategy for examining the relationships that exist, utilising the collection classes. I am also struggling with GetDependentRelatedEntities() and GetDependingRelatedEntities() (I think this may be due to the lazy loading, but I can't specify the prefetch parameters because I need to deal with a generic object).
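For reference, the field-cloning idea from those earlier posts looks roughly like this. This is only a sketch: FormEntity and formID are hypothetical, and the assumption that field values can be copied via CurrentValue may need adjusting to the actual API.

```csharp
// Sketch only: clone an entity's field values, then flag the copy as new so
// that Save() issues a SQL INSERT instead of an UPDATE.
// FormEntity/formID are hypothetical; adjust to your generated entities.
FormEntity original = new FormEntity(formID);
FormEntity copy = new FormEntity();
for (int i = 0; i < original.Fields.Count; i++)
{
    // Assumption: CurrentValue is writable; an identity PK value copied here
    // is overwritten by the database on insert.
    copy.Fields[i].CurrentValue = original.Fields[i].CurrentValue;
}
copy.IsNew = true;   // makes Save() perform an INSERT
copy.Save();         // the database generates a new PK
```

This handles a single entity; the open question in this thread is doing the same for the whole related graph generically.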

Essentially I need to clone an entity with all of its relationships but with new PKs in a generic way that avoids knowing about the specific structure of each entity.

Can anyone point me in the right direction please?

sparmar2000
Posts: 341
Joined: 30-Nov-2003
# Posted on: 11-Mar-2006 18:56:21   

Carl, what are you using: SelfServicing or Adapter?

Carl
User
Posts: 7
Joined: 10-Mar-2006
# Posted on: 13-Mar-2006 10:41:21   

Hi Sparmar

We are using SelfServicing.

sparmar2000 wrote:

Carl, what are you using: SelfServicing or Adapter?

Otis
LLBLGen Pro Team
Posts: 39588
Joined: 17-Aug-2003
# Posted on: 14-Mar-2006 11:17:55   

Carl wrote:

I am working for a client who uses LLBLGen Pro to create question-and-answer information for forms filled out online. If a user wants to go back and change the data in a form (after it has been validated and saved), then they must do this with a copy of the original.

I fail to see why they need a copy of the original. Can't they keep on working with the saved and validated version? that way, you can simply re-save the entity afterwards, which will result in an update of the already saved data.

There are a number of different forms (and therefore a number of different entities), of varying complexity. For example, a simple form has the (base) form table fields and the fields from another table related 1:1. A more complex form has, for example, 1:n applicants; each applicant has several addresses, each address relates to a country, and so on. Try to imagine the worst kinds of forms, with many questions inside many sections...

I need to be able to write a method that provides a duplicate of the original entity (form) for any entity. The new entity must create new records in the database (except where reference-data tables such as country are used).

I have seen previous posts about cloning fields, and field cloning works OK; setting Entity.IsNew = true then causes a save to perform a SQL INSERT. But I am struggling to find a suitable strategy for examining the relationships that exist, utilising the collection classes. I am also struggling with GetDependentRelatedEntities() and GetDependingRelatedEntities() (I think this may be due to the lazy loading, but I can't specify the prefetch parameters because I need to deal with a generic object).

Essentially I need to clone an entity with all of its relationships but with new PKs in a generic way that avoids knowing about the specific structure of each entity.

Can anyone point me in the right direction please?

Could you elaborate a bit on why you need to insert changes to the original data as new rows instead of performing updates?

Frans Bouma | Lead developer LLBLGen Pro
Carl
User
Posts: 7
Joined: 10-Mar-2006
# Posted on: 14-Mar-2006 12:29:49   

Hi Otis

It is a business requirement: quotations are made against the information supplied. If the data is simply updated, then there is no record of the details against which the quotation was made. I am sure that you could think of alternative ways to handle this, such as saving the details in some other form when a quotation is made; however, creating a copy of the data is the way that has been chosen by my client. I have encountered a similar requirement when working on a large UK Government form-filling project (although they were using plain DataSets and stored procedures).

Otis wrote:

Carl wrote:

I am working for a client who uses LLBLGen Pro to create question-and-answer information for forms filled out online. If a user wants to go back and change the data in a form (after it has been validated and saved), then they must do this with a copy of the original.

I fail to see why they need a copy of the original. Can't they keep on working with the saved and validated version? that way, you can simply re-save the entity afterwards, which will result in an update of the already saved data.

There are a number of different forms (and therefore a number of different entities), of varying complexity. For example, a simple form has the (base) form table fields and the fields from another table related 1:1. A more complex form has, for example, 1:n applicants; each applicant has several addresses, each address relates to a country, and so on. Try to imagine the worst kinds of forms, with many questions inside many sections...

I need to be able to write a method that provides a duplicate of the original entity (form) for any entity. The new entity must create new records in the database (except where reference-data tables such as country are used).

I have seen previous posts about cloning fields, and field cloning works OK; setting Entity.IsNew = true then causes a save to perform a SQL INSERT. But I am struggling to find a suitable strategy for examining the relationships that exist, utilising the collection classes. I am also struggling with GetDependentRelatedEntities() and GetDependingRelatedEntities() (I think this may be due to the lazy loading, but I can't specify the prefetch parameters because I need to deal with a generic object).

Essentially I need to clone an entity with all of its relationships but with new PKs in a generic way that avoids knowing about the specific structure of each entity.

Can anyone point me in the right direction please?

Could you elaborate a bit why you need to insert changes on the original data as new rows instead of updates ?

JSobell
User
Posts: 145
Joined: 07-Jan-2006
# Posted on: 14-Mar-2006 13:21:20   

It sounds as though he wants to save a copy for auditing purposes, so the new version never overwrites the previous.

Carl, this should be solved using your database design rather than by fiddling with object versions in memory. It is simple to copy your entire answer (form) hierarchy, although this is obviously wasteful if the user only modifies a single item.

One simple solution would be to use a stored procedure to create a copy of the specified 'parent' and all of the related child tables, then return the new ID and use that to retrieve and populate your editable entities, effectively creating a 'template' start entity that the user will edit.

Another method I've used is to write a trigger that moves old versions of edited records into a history table, and then create a view that does a UNION ALL to treat them as a single table for analysis purposes; this approach is very flexible when auditing is required. Here's a simple example that I use for teaching purposes:

Create a Users table:


CREATE TABLE [dbo].[Users](
    [UserID] [int] IDENTITY(1,1) NOT NULL,
    [UserName] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [FullName] [nchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Extension] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [tValidFrom] [datetime] NOT NULL CONSTRAINT [DF_Users_tValidFrom]  DEFAULT (getdate()),
    [tValidTo] [datetime] NOT NULL CONSTRAINT [DF_Users_tValidTo]  DEFAULT ('1 jan 2999'),
    [tVersion] [int] NOT NULL CONSTRAINT [DF_Users_tVersion]  DEFAULT ((0)),
 CONSTRAINT [PK_Users] PRIMARY KEY CLUSTERED 
(
    [UserID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

and a history table to hold older versions:


CREATE TABLE [dbo].[History_Users](
    [UserID] [int] NOT NULL,
    [UserName] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [FullName] [nchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Extension] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [tValidFrom] [datetime] NOT NULL,
    [tValidTo] [datetime] NOT NULL,
    [tVersion] [int] NOT NULL CONSTRAINT [DF_History_Users_tVersion]  DEFAULT ((0))
) ON [PRIMARY]

The Users table then has the following triggers added:


CREATE TRIGGER [Users_Update] 
   ON  [dbo].[Users] 
   FOR UPDATE
AS 
BEGIN
    SET NOCOUNT ON;

    DECLARE @now DATETIME
    SET @now = GETDATE()

    INSERT INTO History_Users
    SELECT DELETED.UserID, DELETED.UserName, DELETED.FullName, DELETED.Extension, DELETED.tValidFrom, @now, DELETED.tVersion
    FROM Users
    INNER JOIN DELETED ON Users.UserID = DELETED.UserID

    UPDATE Users SET tValidFrom = @now, Users.tVersion=Users.tVersion+1
    FROM Users
    INNER JOIN INSERTED ON Users.UserID = INSERTED.UserID

END


CREATE TRIGGER [Users_Delete] 
   ON  [dbo].[Users] 
   AFTER DELETE
AS 
BEGIN
    SET NOCOUNT ON;

    DECLARE @now DATETIME
    SET @now = GETDATE()

    INSERT INTO History_Users
    SELECT UserID, UserName, FullName, Extension, tValidFrom, @now, tVersion
    FROM DELETED

END

As you can see, it even keeps a current version number so you can track changes by date or version.

Unfortunately, as child elements are persisted before parents, it is not straightforward to identify which version of the child objects relates to which version of a parent (GetDate() may return slightly different values between updates), so this solution is best used for auditing and simple version tracking; it is, however, very useful for reducing code complexity and the need for stored procedures for inserts, updates and deletes. I suppose you could alter the update trigger for child tables to read the version from the parent element, but this could cause problems if you modify a child on its own, so for your requirements I suspect the stored proc would be best.

Cheers, Jason

Carl
User
Posts: 7
Joined: 10-Mar-2006
# Posted on: 14-Mar-2006 17:50:54   

Hi Jason

Thanks for your very useful suggestion; it will fit very well with another project that I am working on. I don't disagree with you about better solutions; however, my client (I am merely an on-site consultant) has already made some fundamental decisions, using LLBLGen and an OO approach to the data being among them. The requirement placed upon me is to produce a method that, given an LLBLGen EntityBase-derived object, can create a duplicate copy (including all of the EntityFields and EntityCollectionBase objects, recursively). We don't want to have to write new code every time that either (a) a new data object is born or (b) an existing one is modified.

All that I am really asking for is how to implement a full, deep Clone()-like function that clones the whole object graph, but where I can set IsNew everywhere so that new records are created when Save() is called. I have tried various ways: using reflection to examine the properties, using the Fields collection, using Fields.Clone, IsNew, IsDirty and so on. However, I have not been able to create a satisfactory solution. The closest that I got was a new object that, somewhere in the process, updated the original object so that related entities then referred to the new object.

I am sure that you are all overloaded just like me, and I don't want to burden anyone; I am just looking for ideas or previous experience with this. Otis, maybe you can describe how I can do it easily? Or, if you think it cannot be done, just tell me "No Way".

Many thanks

JSobell wrote:

It sounds as though he wants to save a copy for auditing purposes, so the new version never overwrites the previous.

Carl, this should be solved using your database design rather than by fiddling with object versions in memory. It is simple to copy your entire answer (form) hierarchy, although this is obviously wasteful if the user only modifies a single item.

One simple solution would be to use a stored procedure to create a copy of the specified 'parent' and all of the related child tables, then return the new ID and use that to retrieve and populate your editable entities, effectively creating a 'template' start entity that the user will edit.

Another method I've used is to write a trigger that moves old versions of edited records into a history table, and then create a view that does a UNION ALL to treat them as a single table for analysis purposes; this approach is very flexible when auditing is required. Here's a simple example that I use for teaching purposes:

Create a Users table:


CREATE TABLE [dbo].[Users](
    [UserID] [int] IDENTITY(1,1) NOT NULL,
    [UserName] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [FullName] [nchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Extension] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [tValidFrom] [datetime] NOT NULL CONSTRAINT [DF_Users_tValidFrom]  DEFAULT (getdate()),
    [tValidTo] [datetime] NOT NULL CONSTRAINT [DF_Users_tValidTo]  DEFAULT ('1 jan 2999'),
    [tVersion] [int] NOT NULL CONSTRAINT [DF_Users_tVersion]  DEFAULT ((0)),
 CONSTRAINT [PK_Users] PRIMARY KEY CLUSTERED 
(
    [UserID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

and a history table to hold older versions:


CREATE TABLE [dbo].[History_Users](
    [UserID] [int] NOT NULL,
    [UserName] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [FullName] [nchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [Extension] [nchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [tValidFrom] [datetime] NOT NULL,
    [tValidTo] [datetime] NOT NULL,
    [tVersion] [int] NOT NULL CONSTRAINT [DF_History_Users_tVersion]  DEFAULT ((0))
) ON [PRIMARY]

The Users table then has the following triggers added:


CREATE TRIGGER [Users_Update] 
   ON  [dbo].[Users] 
   FOR UPDATE
AS 
BEGIN
    SET NOCOUNT ON;

    DECLARE @now DATETIME
    SET @now = GETDATE()

    INSERT INTO History_Users
    SELECT DELETED.UserID, DELETED.UserName, DELETED.FullName, DELETED.Extension, DELETED.tValidFrom, @now, DELETED.tVersion
    FROM Users
    INNER JOIN DELETED ON Users.UserID = DELETED.UserID

    UPDATE Users SET tValidFrom = @now, Users.tVersion=Users.tVersion+1
    FROM Users
    INNER JOIN INSERTED ON Users.UserID = INSERTED.UserID

END


CREATE TRIGGER [Users_Delete] 
   ON  [dbo].[Users] 
   AFTER DELETE
AS 
BEGIN
    SET NOCOUNT ON;

    DECLARE @now DATETIME
    SET @now = GETDATE()

    INSERT INTO History_Users
    SELECT UserID, UserName, FullName, Extension, tValidFrom, @now, tVersion
    FROM DELETED

END

As you can see, it even keeps a current version number so you can track changes by date or version.

Unfortunately, as child elements are persisted before parents, it is not straightforward to identify which version of the child objects relates to which version of a parent (GetDate() may return slightly different values between updates), so this solution is best used for auditing and simple version tracking; it is, however, very useful for reducing code complexity and the need for stored procedures for inserts, updates and deletes. I suppose you could alter the update trigger for child tables to read the version from the parent element, but this could cause problems if you modify a child on its own, so for your requirements I suspect the stored proc would be best.

Cheers, Jason


WayneBrantley
Posts: 1251
Joined: 10-Mar-2006
# Posted on: 14-Mar-2006 20:17:51   

Based on some other threads I have been reading on a related topic:

How about if you serialize the object to memory, then read it back into a new object? You could then set IsNew on all the objects recursively.
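In code, that serialize/deserialize round-trip might look something like this. A sketch only: it assumes the SelfServicing entity graph is binary-serializable (LLBLGen entities are marked [Serializable]), and note that only the data already loaded into the graph is copied.

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using SD.LLBLGen.Pro.ORMSupportClasses;

public static class EntityCopier
{
    // Deep-copy an entity graph via an in-memory serialization round-trip.
    // Lazy-loaded relations that were never fetched are NOT included.
    public static EntityBase DeepCopy(EntityBase source)
    {
        BinaryFormatter formatter = new BinaryFormatter();
        using (MemoryStream stream = new MemoryStream())
        {
            formatter.Serialize(stream, source);
            stream.Position = 0;
            return (EntityBase)formatter.Deserialize(stream);
        }
    }
}
```

After the copy you would still walk the new graph setting IsNew, as suggested above.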

Wayne

Carl
User
Posts: 7
Joined: 10-Mar-2006
# Posted on: 15-Mar-2006 14:35:17   

Hi Wayne

The serialise/deserialise approach kind of worked, as did some of the other things that I tried, such as reflection. However, the problem here is still prefetch/lazy loading: the serialisation only captures what has been loaded.

So:

MyEntity entity = new MyEntity(entityID);

doesn't do enough if we have (and we do have) 1:n or n:m relationships.

I suppose a 'PrefetchAll' would be useful here. I have a colleague in another room who is having to write prefetch code that is specific to each entity so that we can duplicate entities and serialise them to XML. It isn't rocket science, but it is not what we wanted to do in our OO scheme of things; it would be nice if either a 'PrefetchAll' or the kind of clone/copy that we want were built into the entity base class. Having to write SQL etc. seems to go against the whole point of using an ORM.

OK - economics has taken priority over building a perfect world. I guess we have admitted defeat and are kludging together a solution; some people would say "that is real life".

Thanks to all of you for your suggestions and support.

Carl.

WayneBrantley wrote:

Based on some other threads I have been reading on a related topic:

How about if you serialize the object to memory, then read it back into a new object? You could then set IsNew on all the objects recursively.

Wayne

WayneBrantley
Posts: 1251
Joined: 10-Mar-2006
# Posted on: 15-Mar-2006 15:32:37   

I know you are defeated, but did you try this?

Given an entity object, use reflection to look at all the properties. Gather the ones that are of type IPrefetchPath, and then add those to a PrefetchPath to do the full load.

I would think that would work, and it ends up as a generic method that you could have the generator include in all entities. Of course, you could also extend the templates with something that looks through the prefetch items at build time and generates this method instead of using reflection.

After all of that you could finally add a method that copied an object with serialize/deserialize.
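A rough sketch of that reflection step follows. The static PrefetchPathXyz properties, their IPrefetchPathElement type and the PrefetchPath constructor are assumptions about the generated SelfServicing code; adjust to what the generator actually emits.

```csharp
using System.Reflection;
using SD.LLBLGen.Pro.ORMSupportClasses;

public static class PrefetchBuilder
{
    // Sketch only: build a "fetch every relation, one level deep" prefetch
    // path by reflecting over the static prefetch-path properties that the
    // generator emits on each entity class. Names/types are assumptions.
    public static IPrefetchPath BuildFullPath(EntityBase entity, int entityTypeValue)
    {
        IPrefetchPath path = new PrefetchPath(entityTypeValue);
        PropertyInfo[] properties = entity.GetType().GetProperties(
            BindingFlags.Public | BindingFlags.Static);
        foreach (PropertyInfo property in properties)
        {
            // One static property per relation, e.g. PrefetchPathAddresses.
            if (typeof(IPrefetchPathElement).IsAssignableFrom(property.PropertyType))
            {
                path.Add((IPrefetchPathElement)property.GetValue(null, null));
            }
        }
        return path;
    }
}
```

Fetching the whole tree would mean applying the same idea recursively to each sub-path, which is where the generic version gets harder.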

Just a thought....

Carl
User
Posts: 7
Joined: 10-Mar-2006
# Posted on: 15-Mar-2006 15:58:11   

Hi Wayne

Just when I was starting to relax (i.e. defeated), believing that it was going to be a problem for the other guy working on this...

I like what you are suggesting, because I really don't like the idea of hard-coding the prefetch paths: they will need maintaining every time that someone changes or adds to the database.

OK - I will give it a go and post my findings up here, although it may go quiet for a few days as I have another piece of work to do first.

Thanks

Carl.

WayneBrantley wrote:

I know you are defeated, but did you try this?

Given an entity object, use reflection to look at all the properties. Gather the ones that are of type IPrefetchPath, and then add those to a PrefetchPath to do the full load.

I would think that would work, and it ends up as a generic method that you could have the generator include in all entities. Of course, you could also extend the templates with something that looks through the prefetch items at build time and generates this method instead of using reflection.

After all of that you could finally add a method that copied an object with serialize/deserialize.

Just a thought....

Carl
User
Posts: 7
Joined: 10-Mar-2006
# Posted on: 06-Apr-2006 15:57:18   

Here is how it worked in the end...

We added a method to the entity that supplied a suitable prefetch - I still like Wayne's idea of using reflection but we didn't have enough time.

We created a custom attribute that we tagged onto every (related) entity that we did not want to duplicate. For example, we have a country entity that is related to an address entity; whilst we do want to duplicate the address, we don't want to duplicate the country.

We created a new top-level entity (using the prefetch method) by serialising and deserialising it to memory; this seemed to successfully 'divorce' its references to existing database records.

We then recursively visit all related entities available through GetMemberEntityCollections() and GetDependentRelatedEntities(). There are quite a few 'circular' references here, so we record the object IDs and only process/recurse into an object that was previously unrecorded. The processing is just setting IsDirty/IsNew at the entity level and IsChanged for non-null fields at the field level.
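As an illustration, the recursive marking step looks roughly like this. It is a sketch: the member names come from this thread and the SelfServicing API as I recall it, so treat the details as assumptions rather than exact signatures.

```csharp
using System.Collections;
using SD.LLBLGen.Pro.ORMSupportClasses;

public static class EntityDuplicator
{
    // Sketch only: walk the graph, breaking cycles with a visited set keyed
    // on ObjectID, and mark every entity as new/dirty so that a recursive
    // Save() inserts new rows for the whole graph.
    public static void MarkGraphAsNew(IEntity entity, Hashtable visited)
    {
        if (visited.ContainsKey(entity.ObjectID))
        {
            return; // already processed: breaks the circular references
        }
        visited[entity.ObjectID] = entity;

        entity.IsNew = true;
        entity.IsDirty = true;
        foreach (IEntityField field in entity.Fields)
        {
            if (field.CurrentValue != null)
            {
                field.IsChanged = true; // force the field into the INSERT
            }
        }

        foreach (IEntityCollection collection in entity.GetMemberEntityCollections())
        {
            foreach (IEntity related in collection)
            {
                MarkGraphAsNew(related, visited);
            }
        }
        foreach (IEntity related in entity.GetDependentRelatedEntities())
        {
            MarkGraphAsNew(related, visited);
        }
    }
}
```

The check against the custom "do not duplicate" attribute would go at the top of the method, skipping entities such as country.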

This gets us a new duplicate of an existing entity; the custom attribute allows us to specify which parts we do not wish to duplicate, and a bit of processing higher up in the business layer does things like resetting the creation date.

I hope that this gives some ideas to fellow developers faced with a similar challenge. I still think it would be handy to have this kind of functionality in the box, but I can see that it would have to be justified by demand. Finally, thank you to everyone who helped nudge me through this challenge.

Carl wrote:

Hi Wayne

Just when I was starting to relax (i.e. defeated), believing that it was going to be a problem for the other guy working on this...

I like what you are suggesting, because I really don't like the idea of hard-coding the prefetch paths: they will need maintaining every time that someone changes or adds to the database.

OK - I will give it a go and post my findings up here, although it may go quiet for a few days as I have another piece of work to do first.

Thanks

Carl.

WayneBrantley wrote:

I know you are defeated, but did you try this?

Given an entity object, use reflection to look at all the properties. Gather the ones that are of type IPrefetchPath, and then add those to a PrefetchPath to do the full load.

I would think that would work, and it ends up as a generic method that you could have the generator include in all entities. Of course, you could also extend the templates with something that looks through the prefetch items at build time and generates this method instead of using reflection.

After all of that you could finally add a method that copied an object with serialize/deserialize.

Just a thought....

swasical
User
Posts: 23
Joined: 03-Oct-2012
# Posted on: 28-Jan-2013 19:55:58   

Ouch. I agree that this feels like a lot of work: I also feel LLBLGen needs a "copy constructor" of some kind.

It would be great if you could add a few code snippets for the LLBLGen newbies, e.g. for "We created a new top level entity (using the prefetch method) by serialising and deserialising it to memory."

Cheers!

Walaa
Support Team
Posts: 14946
Joined: 21-Aug-2005
# Posted on: 28-Jan-2013 21:25:04   

As per the forum guidelines, could you please create a new thread for your question? I'll have to close this one while waiting for your new thread.

Thanks,