Poor Microsoft Access Performance???

SolidPig
User
Posts: 9
Joined: 16-Nov-2004
# Posted on: 27-Jul-2005 11:18:32   

Hi guys, I'm finding serious performance problems trying to run our application against the MS Access drivers.

Our application isn't exactly DB intensive and only reads a dozen or so rows per query. We began development using MSDE, and the DB reads were instantaneous, but my boss insists on using MS Access "for historical reasons". We regenerated the data access layer for MS Access, but the performance is terrible, and it seems to be inconsistent: some DB reads are instantaneous whilst others take over 30 seconds.

Does anyone have any tips or tricks to improve performance?

I'm starting to think that there's something really simple that I'm missing here.

Ta

SP.

Marcus avatar
Marcus
User
Posts: 747
Joined: 23-Apr-2004
# Posted on: 27-Jul-2005 11:58:49   

From my MS Access days... There are a couple of big differences between MSDE and Access from a performance point of view.

I'm not sure how the LLBLGen driver works internally, but my memory of Access is that queries are performed client side, not server side. It doesn't matter how many rows are actually returned by the query: the entire table (or tables) must be read over the network into the client in order for the query to be processed. How big are the tables involved in your queries, and how many?

Throw in a little concurrency, and locks can cause further delays again...

Your boss had better have really good historical reasons for this kind of decision.

Otis avatar
Otis
LLBLGen Pro Team
Posts: 39588
Joined: 17-Aug-2003
# Posted on: 27-Jul-2005 12:13:14   

As Marcus explains, most slowdowns on Access are caused by large pulls of data over the network and by multiple users fiddling with the same data, which causes extensive locks. The locking system Access uses isn't really meant for multi-user environments (if I'm not mistaken, it uses table locks, not row locks).

The code uses the same constructs on Access as it does on SQL Server or Oracle, so it's not as if the code is slower on Access just because it has to do a lot of different things. What you should look into, though, is how you're reading the data, e.g. with lazy loading or with prefetch paths, and whether the reads are from tables with BLOBs.

When you switch on tracing for the Access DQE, you can see the queries executed, which might give you better insight into what's going on.
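
If I remember correctly, you switch that on with a trace switch in the application's .config file. The switch name and level below are from memory, so check the documentation for your version:

```xml
<configuration>
  <system.diagnostics>
    <switches>
      <!-- 4 = verbose: emits the generated SQL for each query -->
      <add name="AccessDQE" value="4" />
    </switches>
  </system.diagnostics>
</configuration>
```

The queries then show up in the trace output (e.g. the VS.NET Output window when debugging), so you can see exactly which fetch is the slow one.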

Frans Bouma | Lead developer LLBLGen Pro
SolidPig
User
Posts: 9
Joined: 16-Nov-2004
# Posted on: 27-Jul-2005 12:14:28   

Marcus wrote: "How big are the tables involved in your queries, and how many?"

Our tables typically have a few dozen rows each and our most complex query is a three table join. It really is very simple.

Marcus wrote: "Throw in a little concurrency, and locks can cause further delays again... Your boss had better have really good historical reasons for this kind of decision."

I'm trying to convince him to move from a VB6 ADO 'architecture' to VB.NET and MSDE. We've rewritten the admin module in VB.NET, but it's still using MS Access until we rewrite the main application's Client and Server modules.

Given the performance we're seeing now, however, my boss is very skeptical about moving forward at all.

SP.

Otis avatar
Otis
LLBLGen Pro Team
Posts: 39588
Joined: 17-Aug-2003
# Posted on: 27-Jul-2005 12:59:07   

If it's that small, it should be instantaneous. My fetch unit tests on Access' Northwind are on par with SQL Server's.

So please post some code which is slow. It's key to post complete methods, or at least, if you're using adapter, everything from the line where you create the adapter object.

Frans Bouma | Lead developer LLBLGen Pro
SolidPig
User
Posts: 9
Joined: 16-Nov-2004
# Posted on: 29-Sep-2005 16:16:16   

Hi, apologies for the delay in posting this reply. We've been firefighting for the past few weeks, but now that's over, performance has been pushed back to the top of my boss's priority list.

As you can see from the methods below, we typically have one parent business object consisting of two entities, with several lists of child business objects hanging off it. Most of these child objects map to single entities.

I'm slightly confused as to whether we should be passing a single data adapter as a parameter to all methods called from the Load() method, or whether it's fine to just declare a new adapter in each FetchXXXList() method?

I realise now from reading the forum further that we should be using prefetch paths to improve performance. But as it stands, this code runs instantaneously on SQL Server but is helluva slow (10 to 120 second delays) on MS Access.

We're unfortunately stuck using MS Access just now. Our boss is a self-taught VB6 programmer who's been using Access for years. He's very sceptical about moving to .NET and SQL Server and doesn't understand why our .NET app is running slower than his VB6 ADO app. Unless we can demonstrate like-for-like performance, he is likely to tell us to bin the entire .NET migration and LLBLGen.

Based on our code below, what would be your suggested first step to improving performance?

    /// <summary>
    /// Load as part of a larger transaction
    /// </summary>
    /// <param name="objDataAccessAdapter"></param>
    /// <param name="intID"></param>
    public override void Load(DataAccessAdapter objDataAccessAdapter, int intID)
    {
        IHasSerialPorts  iSerialDevice;
        IVideoConnections   iVideoDevice;
        IBinaryIO          iBinaryIODevice;

        try
        {
            // All codecs are stored in the DB as a combination of a row in tblHardware and a row in tblCodec
            // Load Hardware entity
            base.Load(objDataAccessAdapter, intID); 

            // Load the Codec Entity
            m_CodecEntity = new CodecEntity(intID);         
            objDataAccessAdapter.FetchEntity(m_CodecEntity);

            if (m_CodecEntity.IsNew == false)
            {
                // Set the IP address from the Codec Entity
                this.SetIPAddress(m_CodecEntity.IP);

                // Set the Protocol from the Codec Entity
                this.m_eProtocol = (enumCodecProtocol)m_CodecEntity.Protocol;

                // Set the Resolution from the Codec Entity
                this.m_eResolution = (enumCodecResolution)m_CodecEntity.Resolution;

                // Load the Serial Port List
                iSerialDevice = (IHasSerialPorts)this;
                iSerialDevice.FetchSerialPortList(this);

                // Load the Video Connections List
                iVideoDevice = (IVideoConnections)this;
                iVideoDevice.FetchVideoRouteConnectionsList(this);

                // Load the Binary IO Pins List
                iBinaryIODevice = (IBinaryIO)this;
                iBinaryIODevice.FetchBinaryIOPinList(this);
            }
            else
            {
                Exception expBaseCodecLoad = new Exception("BaseCodec.Load() - Could not load Codec with ID=" + intID);
                throw expBaseCodecLoad;         
            }
        }
        catch (Exception expCaught)
        {
            Exception expBaseCodecLoad = new Exception("Exception caught in BaseCodec.Load", expCaught);
            throw expBaseCodecLoad;         
        }
    } // End BaseCodec::Load()      


    /// <summary>
    /// Retrieve a list of all serial ports which this device controls
    /// </summary>
    public void FetchSerialPortList(BaseDevice objThisDevice)
    {
        DataAccessAdapter      objDataAcessAdapter;
        EntityCollection        collSerialPorts;
        RelationPredicateBucket objWhereHardwareIDClause;
        ISortExpression      objOrderByClause; 
        int                  intSerialPortLoop;
        SerialPort            objNewSerialPortBusObj;
        SerialPortEntity        objSerialPortEntity;

        // Remove all SerialPorts currently in the list
        m_SerialPortList.Clear();

        // Create a DataAcessAdapter
        objDataAcessAdapter   = new DataAccessAdapter(DBConnectionStrings.GetDevConfigDBConnectionString());

        // Create the WHERE clause: tblSerialPort.Hardware_ID = this device's ID
        objWhereHardwareIDClause                        = new RelationPredicateBucket();                                                    
        FieldCompareValuePredicate objHardwareIDFilter  = PredicateFactory.CompareValue(
                                                            SerialPortFieldIndex.Hardware_ID,
                                                            ComparisonOperator.Equal, 
                                                            objThisDevice.GetID());
        objWhereHardwareIDClause.PredicateExpression.Add(objHardwareIDFilter);


        objOrderByClause = new SortExpression(SortClauseFactory.Create(SerialPortFieldIndex.Port_number, 
                                              SortOperator.Ascending));

        // Declare a collection of SerialPort entities
        collSerialPorts = new EntityCollection(new SerialPortEntityFactory());  

        // SELECT * FROM tblSerialPort WHERE Hardware_ID = this.GetID() ORDER BY Port_number
        objDataAcessAdapter.FetchEntityCollection(collSerialPorts, objWhereHardwareIDClause, 0, objOrderByClause);          


        // Now iterate through the collection and construct a SerialPort business object for each SerialPortEntity
        for (intSerialPortLoop = 0; intSerialPortLoop < collSerialPorts.Count; intSerialPortLoop++)
        {
            // Create a SerialPort business object from the SerialPortEntity            
            objSerialPortEntity = (SerialPortEntity)collSerialPorts[intSerialPortLoop];
            objNewSerialPortBusObj = new SerialPort(objSerialPortEntity);

            // Add it to the SerialPortList
            AddSerialPort(objThisDevice, objNewSerialPortBusObj);

        } // Endfor SerialPortLoop

    } // End FetchSerialPortList()

Thanks

SolidPig

Walaa avatar
Walaa
Support Team
Posts: 14946
Joined: 21-Aug-2005
# Posted on: 30-Sep-2005 16:02:46   

Reading your first comment in the code: "/// Load as part of a larger transaction"

The following is from the LLBLGen Pro documentation, "Using the generated code -> Adapter -> DataAccessAdapter Functionality", under the heading "Connection control":

"It can be useful to open a connection and keep it open for multiple actions and then close it. This can give extra performance, especially in code where multiple database fetches are used in one routine. The property KeepConnectionOpen is used to set this behaviour."

So answering your question: "I'm slightly confused as to whether we should be passing a single data adapter as a parameter to all methods called from the Load() method - or is it fine to just declare a new adapter in each FetchXXXList() method ?"

I think it's far better to pass the same adapter to all the FetchXXXList() methods.

Furthermore, you should keep the connection open through all those calls and close it at the end. This is the major trick.
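
In code, the idea looks something like this. This is only a sketch against the generated DataAccessAdapter and the BaseCodec code posted above; 'codec' and 'codecID' are placeholder names:

```csharp
// Sketch: one adapter for the whole Load(), with the connection held open.
// DataAccessAdapter and DBConnectionStrings are from the code posted above.
DataAccessAdapter adapter = new DataAccessAdapter(
    DBConnectionStrings.GetDevConfigDBConnectionString());
try
{
    adapter.KeepConnectionOpen = true;  // don't close after each fetch
    adapter.OpenConnection();           // open once, up front

    // Pass this same adapter down into every FetchXXXList() call,
    // instead of newing one up per method.
    codec.Load(adapter, codecID);
}
finally
{
    adapter.CloseConnection();          // close exactly once, at the end
    adapter.Dispose();
}
```

With this shape, the FetchXXXList() methods take the adapter as a parameter and never construct their own, so all the fetches share one open connection.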

wayne avatar
wayne
User
Posts: 611
Joined: 07-Apr-2004
# Posted on: 30-Sep-2005 17:57:13   

You must remember that Access is a desktop database. It will not perform as well as real client/server SQL databases, and it will be really pathetic over a LAN.

The reason is that its SQL engine sits on the client.

A few years ago I worked with Delphi and Paradox 7. Paradox is an ISAM desktop database similar to Access, and although the BDE (Borland Database Engine) gave us network ability, it was very, very slow, because the SQL engine was sitting client side. If your select condition should only bring back 5 records out of a possible 50,000, the SQL engine would have to pull all 50,000 records over the LAN to be able to run the SQL and give you your 5 rows. Now imagine you had joins on top of that!

Just thought I'd give you a little info, as I have no idea how to help you otherwise.

Otis avatar
Otis
LLBLGen Pro Team
Posts: 39588
Joined: 17-Aug-2003
# Posted on: 03-Oct-2005 11:54:51   

As Walaa explained, the major issue is the connection handling:

- open the connection manually by calling adapter.OpenConnection()
- be sure to specify that the connection has to be kept open
- be sure to use the latest runtimes, as there was a bug where in some scenarios the connection was closed even though it should be kept open
- use one adapter for a series of fetches, so you can utilize an already open connection

And... filter early, which means: it's better to specify an additional filter in your query than to filter manually afterwards.

Frans Bouma | Lead developer LLBLGen Pro
SolidPig
User
Posts: 9
Joined: 16-Nov-2004
# Posted on: 04-Oct-2005 13:36:03   

Thanks guys

Ghostlight
User
Posts: 2
Joined: 04-Oct-2005
# Posted on: 04-Oct-2005 17:22:18   

Otis wrote: "use one adapter for a series of fetches, so you can utilize an already open connection."

Is it viable to take this to the extreme in a single-threaded app and use a single global adapter, whose connection is opened once when the app first runs and only closed when the app exits? This adapter would be passed into every DB-related method as a parameter. Viable? If not, why not?

Thanks!

Otis avatar
Otis
LLBLGen Pro Team
Posts: 39588
Joined: 17-Aug-2003
# Posted on: 04-Oct-2005 18:09:24   

Ghostlight wrote:

use one adapter for a series of fetches, so you can utilize an already open connection.

Is it viable to take this to the extreme in a single-threaded app and use a single global adapter, whose connection is opened once when the app first runs and only closed when the app exits? This adapter would be passed into every DB-related method as a parameter. Viable? If not, why not? Thanks!

In a desktop app, this is perfectly fine. In a web app, you'll keep a connection open for every site user, which can bring down the system.

However, most problems with connections and Access are connection-pooling related.
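
For a desktop app, that could look like the sketch below. The Db holder class and the Shutdown() name are hypothetical; DataAccessAdapter and the connection-string helper are from the code posted earlier in this thread:

```csharp
// Sketch of an app-lifetime adapter, for a *desktop* app only.
// Don't do this in a web app: it would pin an open connection per user.
public sealed class Db
{
    public static readonly DataAccessAdapter Adapter;

    static Db()
    {
        Adapter = new DataAccessAdapter(
            DBConnectionStrings.GetDevConfigDBConnectionString());
        Adapter.KeepConnectionOpen = true;  // keep it open across all fetches
        Adapter.OpenConnection();           // opened once, at first use
    }

    // Call once on app exit, e.g. from the main form's Closed handler.
    public static void Shutdown()
    {
        Adapter.CloseConnection();
        Adapter.Dispose();
    }
}
```

Every DB-related method then takes Db.Adapter (or has it passed in), and nothing else ever opens or closes the connection.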

Frans Bouma | Lead developer LLBLGen Pro