HELP!! General advice for performance problems...

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 03-Sep-2007 14:46:23   

Hi

We have been developing an application using LLBLGen for the past 12 months. The initial release of the software went to a customer with very small amounts of data.

However, we now need to scale up the data and have hit some major bottlenecks and deadlocks, and I wonder if anyone would have any advice on how we should be doing things.

My main problems are:

  1. When the system starts I need to update a number of records (currently 1000, but there will be many more). This takes 25 seconds, where I would expect it to take closer to 1 second!!

  2. The application is multi-threaded. Some of the threads keep the system and database data synchronised with data coming in from some hardware (so the synchronisation runs in a loop). The application then does the usual fetch, update, delete type of stuff to the data. The problem is that we frequently get deadlocks.

  3. (might be a continuation of 2) I have lots of code which only reads data. What is the best way to do this with a DataAccessAdapter?

Point 1, this is my code:

    DataAccessAdapter adapter = NewAdapter; // I explain further down what NewAdapter does.
    HandsetEntity hsEntity;

    foreach (Handset hs in Handsets)
    {
        hsEntity = new HandsetEntity(hs.Uid);
        if (adapter.FetchEntity(hsEntity))
        {
            if (hs.CurrentSmartcard != null)
                hsEntity.SmartcardId = hs.CurrentSmartcard.SmartcardId;
            else
                hsEntity.SmartcardId = null;
            adapter.SaveEntity(hsEntity, false, false);
        }
    }

If I comment out the SaveEntity, the code takes 0.5 secs instead of 25. Also, I have tried adding all the entities to an EntityCollection, then calling SaveEntityCollection (instead of individually saving each entity), but that takes just as long.

Point 2: I have 2 methods to create DataAccessAdapters (code below) - is this a good or a bad idea? What would be a better way of implementing this so as to avoid locks? I have tried using a global adapter for each thread, but the deadlocks were almost constant. I then tried creating and freeing one in each method, but I still had deadlocks. The best results I have had so far have been using NewReadAdapter when I know I will only be reading data, but as I say, I still get deadlocks.

The code for the methods is as follows:

    /// <summary>
    /// Returns a new DataAccessAdapter which is ready to use.
    /// </summary>
    internal DataAccessAdapter NewAdapter
    {
        get { return GetNewAdapter(false); }
    }

    /// <summary>
    /// Returns a new Read Only DataAccessAdapter which is ready to use.
    /// </summary>
    internal DataAccessAdapter NewReadAdapter
    {
        get { return GetNewAdapter(true); }
    }

    /// <summary>
    /// Returns a new DataAccessAdapter which is ready to use.
    /// </summary>
    /// <param name="readOnly">Determines if the data access will be read only.</param>
    /// <returns></returns>
    private DataAccessAdapter GetNewAdapter(bool readOnly)
    {
        DataAccessAdapter adapter = null;

        try
        {
            adapter = new DataAccessAdapter(ConnectionString);
            adapter.KeepConnectionOpen = true;

            // Open transaction for non-readOnly adapters.
            if (readOnly == false)
                adapter.StartTransaction(System.Data.IsolationLevel.ReadCommitted, _transactionName);
        }
        catch (Exception ex)
        {
            ExceptionManager.HandleException(this, ex);
            throw;
        }

        return adapter;
    }

Any help or advice would be appreciated. Code examples even more appreciated!!

Will

psandler
User
Posts: 540
Joined: 22-Feb-2005
# Posted on: 03-Sep-2007 16:41:15   

buckbuchanan wrote:

foreach (Handset hs in Handsets)
{
  hsEntity = new HandsetEntity(hs.Uid);
  if (adapter.FetchEntity(hsEntity))
  {
    if (hs.CurrentSmartcard != null)
      hsEntity.SmartcardId = hs.CurrentSmartcard.SmartcardId;
    else
      hsEntity.SmartcardId = null;
    adapter.SaveEntity(hsEntity, false, false);
  }
}

I'm not completely sure I'm understanding correctly, but I think you could rewrite this code to use adapter.UpdateEntitiesDirectly. It looks like you might have to do two update statements rather than one.

This thread talks about UpdateEntitiesDirectly with a field from a related table: http://www.llblgen.com/tinyforum/Messages.aspx?ThreadID=9574&HighLight=1
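
For the set-based part, something like this could handle the 'no smartcard' case in one statement (untested sketch; HandsetFields.Uid and the hsWithoutSmartcardIds list are assumptions based on the code above):

    // One UPDATE: SET SmartcardId = NULL for every handset whose PK
    // is in hsWithoutSmartcardIds (a List<int> built while looping).
    RelationPredicateBucket bucket = new RelationPredicateBucket();
    bucket.PredicateExpression.Add(HandsetFields.Uid == hsWithoutSmartcardIds);

    HandsetEntity updateValues = new HandsetEntity();
    updateValues.SmartcardId = null;

    using (DataAccessAdapter adapter = new DataAccessAdapter())
    {
        int rowsAffected = adapter.UpdateEntitiesDirectly(updateValues, bucket);
    }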

I'm not sure this is good advice, though, because it looks like hs is an object you only have in memory? In that case, you obviously have to set the values in the application as you are currently doing.

buckbuchanan wrote:

        /// <summary>
        /// Returns a new DataAccessAdapter which is ready to use.
        /// </summary>
        /// <param name="readOnly">Determines if the data access will be read only.</param>
        /// <returns></returns>
        private DataAccessAdapter GetNewAdapter(bool readOnly)
        {
            DataAccessAdapter adapter = null;

            try
            {
                adapter = new DataAccessAdapter(ConnectionString);
                adapter.KeepConnectionOpen = true;

                // Open transaction for non-readOnly adapters.
                if (readOnly == false)
                    adapter.StartTransaction(System.Data.IsolationLevel.ReadCommitted, _transactionName);
            }
            catch (Exception ex)
            {
                ExceptionManager.HandleException(this, ex);
                throw;
            }

            return adapter;
        }

I think this might be the root of your problem (again, if I am understanding correctly). If you are using this overload to commit the update code above, your code is structured to:

  1. Start a transaction
  2. Fetch an entity
  3. Modify an entity
  4. Save an entity
  5. Repeat 2-4 until all entities are processed
  6. Commit the transaction

You are doing database operations inside a transaction that really don't require a transaction (like fetching). In addition, you are holding that transaction open while you are doing application work that has nothing to do with the database (like looping).

As a guiding principle (there are always exceptions, of course), you don't want to read records within a transaction--reserve transactions for writing data. Also, you want to start a transaction as late as possible and commit it as early as possible.

So try to restructure your code so that you:

  1. Fetch and modify your entities
  2. Start your transaction
  3. Save your entities
  4. Commit your transaction

I would use an EntityCollection save, as it prevents you from having to loop through a bunch of single-entity saves (it might not make a performance difference, but it's less code).
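
Something along these lines (untested sketch based on the code you posted; EntityCollection is the generated adapter collection type):

    // 1. Fetch and modify OUTSIDE any transaction.
    EntityCollection<HandsetEntity> toSave = new EntityCollection<HandsetEntity>();
    using (DataAccessAdapter fetchAdapter = new DataAccessAdapter(ConnectionString))
    {
        foreach (Handset hs in Handsets)
        {
            HandsetEntity hsEntity = new HandsetEntity(hs.Uid);
            if (fetchAdapter.FetchEntity(hsEntity))
            {
                if (hs.CurrentSmartcard != null)
                    hsEntity.SmartcardId = hs.CurrentSmartcard.SmartcardId;
                else
                    hsEntity.SmartcardId = null;
                toSave.Add(hsEntity);
            }
        }
    }

    // 2-4. Start the transaction as late as possible, save, commit immediately.
    DataAccessAdapter saveAdapter = new DataAccessAdapter(ConnectionString);
    try
    {
        saveAdapter.StartTransaction(System.Data.IsolationLevel.ReadCommitted, "SaveHandsets");
        saveAdapter.SaveEntityCollection(toSave);
        saveAdapter.Commit();
    }
    catch
    {
        saveAdapter.Rollback();
        throw;
    }
    finally
    {
        saveAdapter.Dispose();
    }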

HTH,

Phil

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 03-Sep-2007 17:40:49   

Many thanks for the reply Phil.

I think you have made some very good points there and I will make the changes required. However, just to clarify the situation.....

  1. I don't think UpdateEntitiesDirectly is what I need. I am updating the entities with data from a hardware object and not from another table.
  2. I have tried an EntityCollection save and as you say it made no difference.

I do think you may be onto something with your advice on the timing of my transaction, which is something I am going to look at (although it is of course more difficult than in my simplified example!!).

Thanks again for the advice.

I have run the update using a SqlCommand and it took less than 1 second... this is why I am puzzled.

goose
User
Posts: 392
Joined: 06-Aug-2007
# Posted on: 03-Sep-2007 17:58:42   

buckbuchanan wrote:

Many thanks for the reply Phil.

  1. I don't think UpdateEntitiesDirectly is what I need. I am updating the entities with data from a hardware object and not from another table. ... I have run the update using a SqlCommand and it took less than 1 second... this is why I am puzzled.

I think UpdateEntitiesDirectly will indeed help you here, as it would issue one UPDATE SQL statement instead of 1000 (or more). What is the update query you are running via SqlCommand?

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 03-Sep-2007 18:11:02   

Hi Goose - thanks for the reply. This is interesting and shows that I misunderstood how UpdateEntitiesDirectly works.... From what I could make out, it updates the records from linked tables automatically (obviously wrong). How would I use it in my case? I don't really understand how I would code it. I am looping with foreach (Handset hs in Handsets) - Handsets is a List<Handset>, which is a hardware object.

Using a SqlCommand directly, I used the following code:

    IDbConnection conn = new SqlConnection("Data Source=.;Initial Catalog=HostDB;Integrated Security=True");
    conn.Open();
    IDbCommand cmd = conn.CreateCommand();
    cmd.CommandText = "UPDATE [device].[Handset] SET [SmartcardId]=@SmartcardId WHERE ( [device].[Handset].[HandsetId] = @HandsetId)";
    IDbDataParameter param = cmd.CreateParameter();
    param.DbType = DbType.Int32;
    param.ParameterName = "@HandsetId";
    param.SourceColumn = "HandsetId";
    cmd.Parameters.Add(param);
    param = cmd.CreateParameter();
    param.DbType = DbType.Int32;
    param.ParameterName = "@SmartcardId";
    param.SourceColumn = "SmartcardId";
    cmd.Parameters.Add(param);

    foreach (Handset hs in hsc.Handsets)
    {
        ((IDbDataParameter)cmd.Parameters["@HandsetId"]).Value = hs.Uid;

        // DBNull.Value writes SQL NULL when the handset has no smartcard.
        if (hs.CurrentSmartcard != null)
            ((IDbDataParameter)cmd.Parameters["@SmartcardId"]).Value = hs.CurrentSmartcard.SmartcardId;
        else
            ((IDbDataParameter)cmd.Parameters["@SmartcardId"]).Value = DBNull.Value;

        cmd.ExecuteNonQuery();
    }
    conn.Close();

Rogelio
User
Posts: 221
Joined: 29-Mar-2005
# Posted on: 04-Sep-2007 03:21:42   

Hi,

You have multiple threads and you are getting deadlocks: try sorting the hs in Handsets by hs.Uid before the foreach ...
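
For example, something like this before the loop (assuming Handsets is a List<Handset> and Uid is an int):

    // Take a snapshot sorted by Uid so every thread acquires its row
    // locks in the same order; consistent lock ordering prevents this
    // kind of deadlock.
    List<Handset> sorted = new List<Handset>(Handsets);
    sorted.Sort(delegate(Handset a, Handset b) { return a.Uid.CompareTo(b.Uid); });

    foreach (Handset hs in sorted)
    {
        // ... fetch / modify / save as before ...
    }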

Use ReadCommitted transactions and be sure that the SQL Server's default isolation level is ReadCommitted.

Hope this helps.

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 04-Sep-2007 11:42:19   

Rogelio wrote:

Hi,

You have multiple threads and you are getting deadlocks: try sorting the hs in Handsets by hs.Uid before the foreach ...

Use ReadCommitted transactions and be sure that the SQL Server's default isolation level is ReadCommitted.

Hope this helps.

Hi Rogelio, thanks for the reply.

Just so that I understand, can you tell me why sorting the handsets would help? The Uid is the primary key (which I am sure is why you suggest I sort on this). My issue is that I could have many thousands of handsets and sorting this list 3 times per second could add some overhead.

Also, how do I set the Default Isolation Level?

Thanks

Walaa
Support Team
Posts: 14946
Joined: 21-Aug-2005
# Posted on: 04-Sep-2007 12:44:18   

First of all, where do you commit the transaction and where do you close the connection?

I think you should use UpdateEntitiesDirectly for the entities that need SmartcardId = null.

Please check the following code (untested):

HandsetEntity hsEntity;
List<int> hsWithoutSmartcardIds = new List<int>();
EntityCollection<HandsetEntity> handsetsWithSmartcards = new EntityCollection<HandsetEntity>();

DataAccessAdapter readerAdapter = NewReadAdapter;

foreach (Handset hs in Handsets)
{
    hsEntity = new HandsetEntity(hs.Uid);
    if (readerAdapter.FetchEntity(hsEntity))
    {
        if (hs.CurrentSmartcard != null)
        {
            hsEntity.SmartcardId = hs.CurrentSmartcard.SmartcardId;
            handsetsWithSmartcards.Add(hsEntity);
        }
        else
        {
            hsWithoutSmartcardIds.Add(hs.Uid);
        }
    }
}
readerAdapter.CloseConnection();
readerAdapter.Dispose();

RelationPredicateBucket bucket = new RelationPredicateBucket();
bucket.PredicateExpression.Add(HandsetFields.Uid == hsWithoutSmartcardIds);

HandsetEntity hsWithoutSmartcard = new HandsetEntity();
hsWithoutSmartcard.SmartcardId = null;

// NewAdapter already starts the transaction; declare it outside the try
// block so the catch/finally blocks can see it.
DataAccessAdapter adapter = NewAdapter;
try
{
    adapter.SaveEntityCollection(handsetsWithSmartcards);
    adapter.UpdateEntitiesDirectly(hsWithoutSmartcard, bucket);

    adapter.Commit();
}
catch
{
    adapter.Rollback();
    throw;
}
finally
{
    adapter.CloseConnection();
    adapter.Dispose();
}

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 04-Sep-2007 12:55:18   

Hi Walaa

Many thanks for taking the time to write such a detailed reply - this is a big help and I can see how that would work.

I am actually committing the transaction in a similar way to your posting - I just hadn't included it. I am not, however, closing the connection. I thought it would be best to leave the connection open since it is in constant use.

Would you have any other suggestions based on this?

Thanks.

Rogelio
User
Posts: 221
Joined: 29-Mar-2005
# Posted on: 04-Sep-2007 13:35:49   

buckbuchanan wrote:

Hi Rogelio, thanks for the reply.

Just so that I understand, can you tell me why sorting the handsets would help? The Uid is the primary key (which I am sure is why you suggest I sort on this). My issue is that I could have many thousands of handsets and sorting this list 3 times per second could add some overhead.

Also, how do I set the Default Isolation Level?

Thanks

Sorting is a technique to avoid the deadlock.

For example:
Thread 1 is processing the UIds in this order: 20, 25, 15, 40, 50.
Thread 2 is processing the UIds in this order: 50, 15, 40, 60, 35.

If thread 1 takes the lock on UId 15 at the same time that thread 2 takes the lock on UId 50, this will cause a deadlock: when thread 1 later tries to save UId 50 it will be blocked by thread 2, and when thread 2 tries to save UId 15 it will be blocked by thread 1 - a deadlock.

Remember that the first thread to save a record takes the lock on that record (until the commit or rollback). What happens when the UIds are sorted:

Thread 1 is processing the UIds in this order: 15, 20, 25, 40, 50.
Thread 2 is processing the UIds in this order: 15, 35, 40, 50, 60.

The lock on UId 15 will be taken first by one of the threads; when the other thread tries to take that lock, it will wait until the first thread releases it, and the deadlock will not occur.

The idea of the sort is to be sure that when a thread takes a lock on a record, the other records it will save in the same transaction are not yet locked by another thread.

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 04-Sep-2007 13:50:15   

Hi Rogelio, thanks again for the reply.

Currently the handsets are assigned an index number, and that is the order they are sorted in the list, so I don't think the scenario you posted would be possible (please correct me if I am wrong). E.g. if the list contents are 15,35,40,50,60, then thread 1's processing order will be 15,35,40,50,60, and thread 2's processing order will also be 15,35,40,50,60. This list never changes its sort order for any reason, and the handsets never leave the list - new handsets which come online are added to the end of the list.

Does this sound like a scenario which might cause a deadlock?

Rogelio
User
Posts: 221
Joined: 29-Mar-2005
# Posted on: 04-Sep-2007 14:25:25   

buckbuchanan wrote:

Hi Rogelio, thanks again for the reply.

Currently the handsets are assigned an index number, and that is the order they are sorted in the list, so I don't think the scenario you posted would be possible (please correct me if I am wrong). E.g. if the list contents are 15,35,40,50,60, then thread 1's processing order will be 15,35,40,50,60, and thread 2's processing order will also be 15,35,40,50,60. This list never changes its sort order for any reason, and the handsets never leave the list - new handsets which come online are added to the end of the list.

Does this sound like a scenario which might cause a deadlock?

I would like to know which application is processing the handsets - is it the same application the users use? Your best option is to create a Windows service application to process the handsets; this service would monitor the handsets queue. SQL transactions are slow when you include a lot of records in them. It is better to process one handset per transaction; you only need to mark the handset as processed.

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 04-Sep-2007 14:41:27   

Hi, and thanks again for the reply.

The application is actually a WinForms application which runs a WSE server application, so it has very little user interaction. The users use another application which talks to the WSE server and also accesses the database directly.

I am happy to have one transaction per handset except in some cases. For example, when the client application starts I need to initialise all the handset records. If there are 10000 handsets then I fear it might be quite intensive to have a transaction per handset. Is this the case or have I misunderstood?

jmeckley
User
Posts: 403
Joined: 05-Jul-2006
# Posted on: 04-Sep-2007 15:30:56   

another thing to check is the db indexes. if the db isn't indexed, or not indexed effectively, then no amount of code tweaking will fix the root cause.

indexing large dbs is an art in itself, which is well beyond my skills. I have found that the following are good practices though. These only apply to MS SQL Server; I haven't used other large-scale dbs.

PKs are automatically indexed, so just apply a PK to the table. FKs are not indexed, so for every foreign key apply an index on the foreign key column. If there is a column which is searched on frequently, index that column.

I know there are more detailed methodologies, however those are beyond my current skills. sql server also has a built-in index tuning wizard. I looked into this once before, but never fully utilized it. I believe you pass the wizard a series of queries and the tool will determine the best indexing schema.

Rogelio
User
Posts: 221
Joined: 29-Mar-2005
# Posted on: 04-Sep-2007 17:29:06   

buckbuchanan wrote:

Hi, and thanks again for the reply.

The application is actually a WinForms application which runs a WSE server application, so it has very little user interaction. The users use another application which talks to the WSE server and also accesses the database directly.

I am happy to have one transaction per handset except in some cases. For example, when the client application starts I need to initialise all the handset records. If there are 10000 handsets then I fear it might be quite intensive to have a transaction per handset. Is this the case or have I misunderstood?

As far as I know, to process a lot of records (10000) it is faster to commit one record at a time than to commit all the records together. Do the test.
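
A rough sketch of the per-record version to time against your current all-in-one-transaction code (untested; toSave is assumed to be your collection of modified HandsetEntity objects):

    // using System.Diagnostics; for Stopwatch
    Stopwatch watch = Stopwatch.StartNew();

    foreach (HandsetEntity hsEntity in toSave)
    {
        // One short transaction per record: the lock on each row is
        // released as soon as that row's commit completes.
        DataAccessAdapter adapter = new DataAccessAdapter(ConnectionString);
        try
        {
            adapter.StartTransaction(System.Data.IsolationLevel.ReadCommitted, "PerRecord");
            adapter.SaveEntity(hsEntity, false, false);
            adapter.Commit();
        }
        catch
        {
            adapter.Rollback();
            throw;
        }
        finally
        {
            adapter.Dispose();
        }
    }

    watch.Stop();
    Console.WriteLine("Per-record commits: " + watch.ElapsedMilliseconds + " ms");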

buckbuchanan
User
Posts: 20
Joined: 03-Sep-2007
# Posted on: 04-Sep-2007 17:30:28   

Hi Rogelio

Thanks.... I will give that a go.

Thanks for your help!