Common predicate for all entities

nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 16-Feb-2009 13:17:22   

All my entities have a field called status, which is the same type without exception. I am trying to apply a default predicate whenever I fetch anything from the db, e.g. WHERE status = 1.

What is the best way to do this?

Thanks

Walaa
Support Team
Posts: 14950
Joined: 21-Aug-2005
# Posted on: 16-Feb-2009 14:07:58   

Would this work?

public PredicateExpression GetMyDefaultPredicate(EntityBase entity)
{
 return new PredicateExpression(entity.Fields["Status"] == 1);
}
nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 16-Feb-2009 14:44:09   

Maybe I didn't explain correctly. I need it to be applied automatically to every table, at the lowest level.

I tried creating a custom DataAccessAdapter and overrode FetchEntity, FetchTypedList, FetchNewEntity, and FetchEntityCollection, but then found that LINQ queries with a number of joins call into FetchDataReader, so the other overrides were bypassed.

However, by the time FetchDataReader is called the SQL has already been constructed. Basically, given a query such as:

SELECT p.firstName, p.lastName, Address.buildingName, Address.address1
FROM Person AS p
INNER JOIN PersonAddress AS pa ON pa.personId = p.personId
INNER JOIN Address ON pa.addressId = Address.addressId

I need WHERE clauses to be added automatically, like this:

SELECT Person.firstName, Person.lastName, Address.buildingName, Address.address1, Address.validStatusId
FROM Person
INNER JOIN PersonAddress ON PersonAddress.personId = Person.personId
INNER JOIN Address ON PersonAddress.addressId = Address.addressId
WHERE (Person.status = 1) AND (Address.status = 1) AND (PersonAddress.status = 1)
Walaa
Support Team
Posts: 14950
Joined: 21-Aug-2005
# Posted on: 16-Feb-2009 15:01:55   
nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 16-Feb-2009 16:23:50   

Thanks, that helps. It's not too easy though - it will take some time to understand properly. I tried modifying the code; please see below. The FieldCompareValuePredicate line does not compile, because there is no overload that accepts f as an IEntityFieldCore. How would I resolve this?

            List<string> tables = new List<string>();

            for (int i = 0; i < selectList.Length; i++)
            {
                //only do each table once
                if (!tables.Contains(selectList[i].ContainingObjectName))
                {
                    tables.Add(selectList[i].ContainingObjectName);
                    foreach (IEntityFieldCore f in EntityFieldsFactory.CreateFields(selectList[i].ContainingObjectName))
                    {
                        if (f.Name.ToLower() == "status")
                        {
                            IPredicate ip = new FieldCompareValuePredicate(f,  ComparisonOperator.GreaterThan, 0);
                            if (selectFilter == null)
                            {
                                selectFilter = ip;
                            }
                            else
                            {
                                selectFilter = new PredicateExpression(selectFilter, PredicateExpressionOperator.And, ip);
                            }
                        }
                    }
                }
            }
nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 16-Feb-2009 16:28:44   

By the way, I forgot to say that I am using the adapter method, in case it makes any difference. Thanks

daelmo
Support Team
Posts: 8245
Joined: 28-Nov-2005
# Posted on: 17-Feb-2009 04:28:55   

Change this

 foreach (IEntityFieldCore f in EntityFieldsFactory.CreateFields(selectList[i].ContainingObjectName))

into this

 foreach (IEntityField2 f in EntityFieldsFactory.CreateFields(selectList[i].ContainingObjectName))

And this

IPredicate ip = new FieldCompareValuePredicate(f, ComparisonOperator.GreaterThan, 0);

into this

IPredicate ip = new FieldCompareValuePredicate(f, null,  ComparisonOperator.GreaterThan, 0);

BTW, what LLBLGen version are you using?

HTH

David Elizondo | LLBLGen Support Team
nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 17-Feb-2009 12:45:11   

I am using 2.6 Final and SQL Server 2005. Finding a solution to this is extremely important to me; it is currently the only stumbling block with LLBLGen in the project. I have looked all over the forums, and all I could find was self-servicing examples. The solutions I have tried are:

1) Create a partial class of the DataAccessAdapter and override CreateSelectDQ with the code above, including the changes you mentioned.

I got a NullReferenceException in EntityFieldsFactory.CreateFields(fieldsToFetch[i].ContainingObjectName), because fieldsToFetch[i].ContainingObjectName is an empty string for a number of the fields.

2) I also tried extending the DataAccessAdapter, which didn't seem to work.

I feel I am stabbing in the dark because of my lack of understanding of the internals of LLBLGen. Do you have any example code that would solve my problem (using adapter mode), or at least give me a good idea and put me on the right track?

Just to make it really clear: I need to set a filter on every select made by LLBLGen, at every level (including derived tables), to filter on my status field. This status field exists on every table in the system (we use it as a soft-delete mechanism).

Thanks in advance

MTrinder
User
Posts: 1461
Joined: 08-Oct-2008
# Posted on: 17-Feb-2009 22:09:44   

In item 1, which fields returned an empty string for ContainingObjectName? Could you check for these and skip over them...?

How much control do you have over the structure/design of the database? There are other approaches you could investigate:

a) construct a view per table to perform the filter for you - entities can be mapped onto views as well as tables, which makes the solution much more transparent to your application.

b) set triggers on the tables to move deleted entities to archive tables in the same/another database.
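
A sketch of the view-per-table approach Matt mentions (the table and column names are taken from the thread's earlier examples, and status = 1 meaning 'active' is an assumption):

```sql
-- One filtered view per table; entities are then mapped onto the view
-- instead of the base table, so every fetch is filtered automatically.
CREATE VIEW dbo.ActivePerson
AS
SELECT personId, firstName, lastName, status
FROM dbo.Person
WHERE status = 1;
```

Inserts, updates and deletes would still target the base table (or the view, where it remains updatable).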

Matt

nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 17-Feb-2009 22:13:32   

OK, I tried the following. I had to pass GetFieldPersistenceInfo(f) into the predicate, otherwise I got a NullReferenceException. It seemed to get to the query, but then bombed out with:

Msg 4104, Level 16, State 1, Line 1 The multi-part identifier "mydb.dbo.Assistant.status" could not be bound.
Msg 4104, Level 16, State 1, Line 1 The multi-part identifier "mydb.dbo.Person.status" could not be bound.

Debugging the code, I came across the following error when trying to view any entity field: InvalidOperationException: "This object was constructed using a non-selfservicing constructor. Can't retrieve an IEntityField after that."

   partial class DataAccessAdapter
    {
        protected override IRetrievalQuery CreateSelectDQ(IEntityFields2 fieldsToFetch, IFieldPersistenceInfo[] persistenceInfoObjects, IPredicateExpression filter, long maxNumberOfItemsToReturn, ISortExpression sortClauses, IRelationCollection relationsToWalk, bool allowDuplicates, IGroupByCollection groupByClause, int pageNumber, int pageSize)
        {
            List<string> tables = new List<string>();

            for (int i = 0; i < fieldsToFetch.Count; i++)
            {
                //only do each table once
                if (!tables.Contains(fieldsToFetch[i].ContainingObjectName))
                {
                    if (fieldsToFetch[i].ContainingObjectName != String.Empty)
                    {
                        tables.Add(fieldsToFetch[i].ContainingObjectName);
                        foreach (IEntityField2 f in EntityFieldsFactory.CreateFields(fieldsToFetch[i].ContainingObjectName))
                        {
                            if (f.Name.ToLower() == "status")
                            {
                                IPredicate ip = new FieldCompareValuePredicate(f, GetFieldPersistenceInfo(f), ComparisonOperator.Equal, ValidStatus.Active);
                                if (filter == null)
                                {
                                    // start a new filter if none was supplied
                                    filter = new PredicateExpression();
                                    filter.Add(ip);
                                }
                                else
                                {
                                    filter = new PredicateExpression(filter, PredicateExpressionOperator.And, ip);
                                }
                            }
                        }
                    }
                }
            }
            return base.CreateSelectDQ(fieldsToFetch, persistenceInfoObjects, filter, maxNumberOfItemsToReturn, sortClauses, relationsToWalk, allowDuplicates, groupByClause, pageNumber, pageSize);
        }
    }
nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 17-Feb-2009 22:19:38   

Matt,

I tried skipping the empty ContainingObjectName - see above.

Unfortunately not much. The database consists of 350+ tables, and every table has to handle soft deletes in this way. It would just be too much maintenance and overhead.

Is it not possible in LLBLGen to do what I need to do?

Thanks

MTrinder
User
Posts: 1461
Joined: 08-Oct-2008
# Posted on: 17-Feb-2009 22:32:25   

If you use SQL Profiler, can you see the query hitting the db? If so, can you post the query so that we can take a look at it...?

The "could not be bound" error in SQL generally means that the query references a field that does not exist on the table - which obviously shouldn't be the case...

Matt

Otis
LLBLGen Pro Team
Posts: 39617
Joined: 17-Aug-2003
# Posted on: 18-Feb-2009 10:23:16   

Soft deletes should be done with delete triggers and an archive database. The problem with soft deletes is that the overall data set grows and grows while the active working set (the rows which aren't 'deleted') becomes smaller and smaller relative to the full data set, so over time database performance degrades, probably significantly. Add to that the increased complexity of the code (which results in higher maintenance costs), and a different solution is often preferable.

A cleaner solution is to use an archive catalog, which mirrors the one the data is contained in, together with a set of delete triggers. The code in the application can then be written normally, and the delete triggers insert each deleted row into the archive catalog's tables. As 'deleted' data is only accessed in relatively rare cases (why otherwise access deleted data?), that specific code can be written against the archive database and can simply migrate the data back to the main database when needed.

Net result: higher performance, cleaner and simpler code, and the deleted data is still available for recovery/auditing.

Please consider the above when implementing soft deletes. The core reason soft deletes are required is in almost all cases the ability to look at older data, and that's perfectly possible with an archive catalog. Reverting deleted data is less common, as it's very cumbersome to implement: reverting on 1 table is doable, however tables are related to each other, so rolling back 1 row has implications for its related rows as well; what you want in these cases is to roll back a subgraph, which is very complex. Soft deletes won't help you there either, as the complexity of rolling back the data is the same.

Requiring a filter on the tables in every query is very complex to build in, because every join requires the filter in its ON clause as well. It's not something that's easily implemented and works 100% of the time. Additionally, using soft deletes has the drawback that you can't use unique constraints or foreign key constraints, as they might clash with rows which are 'deleted' but still there.
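
To illustrate the join problem: with an outer join, a soft-delete filter in the WHERE clause silently turns it into an inner join, so the filter has to be repeated in the ON clause. A sketch using the thread's tables (a direct addressId on Person is assumed here for brevity):

```sql
-- Filter in WHERE: persons whose address row is soft-deleted vanish entirely.
SELECT p.firstName, a.address1
FROM Person p
LEFT JOIN Address a ON a.addressId = p.addressId
WHERE p.status = 1 AND a.status = 1;

-- Filter in ON: those persons survive, with NULL address columns.
SELECT p.firstName, a.address1
FROM Person p
LEFT JOIN Address a ON a.addressId = p.addressId AND a.status = 1
WHERE p.status = 1;
```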

As your model is rather big and you need to implement soft deletes on all tables (which I doubt is necessary btw - who decided that, and why?), the amount of data your application has to wade through every day is IMHO increasing rapidly and will hurt performance pretty badly in the long run. Using an archive catalog with delete triggers is much simpler and cleaner and has the same effect: no data is lost after deletes, as the triggers always fire.
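
A minimal sketch of the trigger-plus-archive approach described above (the ArchiveDb catalog name and the Person column list are assumptions, not from the thread):

```sql
-- Copies each deleted row into the archive catalog before it is gone.
CREATE TRIGGER trg_Person_Archive
ON dbo.Person
AFTER DELETE
AS
BEGIN
    INSERT INTO ArchiveDb.dbo.Person (personId, firstName, lastName)
    SELECT personId, firstName, lastName
    FROM deleted;
END
```

With this in place the application issues plain DELETEs, and no status column or default filter is needed at all.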

Frans Bouma | Lead developer LLBLGen Pro
nabils
User
Posts: 46
Joined: 30-Nov-2008
# Posted on: 18-Feb-2009 11:13:30   

Thanks Frans. Your advice is well noted and makes perfect sense. I'll try to find a way round.