dynamic SQL

paulshealy
User
Posts: 22
Joined: 13-Jun-2005
# Posted on: 27-Sep-2005 21:40:05   

Hi,

I have a question regarding dynamic SQL through LLBLGen. We're generating entities for our database and using them like we're supposed to, but parts of our database - data tables for hotels, for example - can be changed by administrators, so we can't generate entities for them. Is there any way to use dynamic SQL through the generated code? We're using the self-servicing template. We currently use the DAAB, which works, but an integrated solution would be far more elegant.

bclubb
User
Posts: 934
Joined: 12-Feb-2004
# Posted on: 28-Sep-2005 04:31:35   

I don't believe you can do anything like that; here's a post talking about a schema that changes at runtime: http://www.llblgen.com/tinyforum/Messages.aspx?ThreadID=3963

Paul.Lewis
User
Posts: 147
Joined: 22-Aug-2005
# Posted on: 28-Sep-2005 04:40:59   

paulshealy wrote:

parts of our database - data tables for hotels, for example - can be changed by administrators, so we can't generate entities for them.

Are administrators modifying the table structure; i.e. adding attributes? Or are administrators modifying the table row data?

If the former, I'd suggest redesigning your database so that you don't need to add attributes to your tables. If the latter, you shouldn't have a problem reading the data with LLBLGen.

I don't think I've given you an adequate answer just yet ... perhaps if you provide more details we can better answer your question.

Paul

paulshealy
User
Posts: 22
Joined: 13-Jun-2005
# Posted on: 28-Sep-2005 14:17:16   

The administrators are modifying attributes and creating new tables. Each time we add a new client - a new hotel - the code creates a new set of data tables for the hotel's data. Then the administrator configures the hotel to have certain attributes, because each hotel has different data they want to store. In this case I can't use entities to read data, because the tables are created at runtime. What I'd really like to be able to do is execute statements against the database the same way the DAAB allows, but through LLBLGen.

MacDennis avatar
MacDennis
User
Posts: 50
Joined: 03-May-2005
# Posted on: 28-Sep-2005 14:44:09   

What you want to achieve isn't possible. Don't confuse the DAAB with LLBLGen: the DAAB is a database access block and LLBLGen is an O/R mapper, two completely different things.

And a database model which changes constantly is bad design IMHO.

paulshealy
User
Posts: 22
Joined: 13-Jun-2005
# Posted on: 28-Sep-2005 15:39:22   

I understand quite clearly the difference between the DAAB and an O/R mapper. Our problem is that parts of our database schema must change, because different hotels have different requirements as to what data they wish to store. Administrators also require the ability to define computed columns for hotels - this is part of the business requirements. Part of our schema is static, for which we use LLBLGen Pro; part is dynamic, for which we must generate queries. Saying that a schema that changes is a bad design is a failure to understand the problem.

pilotboba
User
Posts: 434
Joined: 05-Aug-2005
# Posted on: 28-Sep-2005 16:14:35   

paulshealy wrote:

I understand quite clearly the difference between the DAAB and an O/R mapper. Our problem is that parts of our database schema must change, because different hotels have different requirements as to what data they wish to store. Administrators also require the ability to define computed columns for hotels - this is part of the business requirements. Part of our schema is static, for which we use LLBLGen Pro; part is dynamic, for which we must generate queries. Saying that a schema that changes is a bad design is a failure to understand the problem.

One thing you might want to consider is storing the non-deterministic part of the schema in a single field as XML. This will allow you to store the unknown schema in a single field that you can still use with LLBLGen.

We have a similar issue in one of our apps, where we allow the customers to extend the schema to store custom data.

It's not a simple issue, but sometimes it's the best way to solve the problem.

BOb
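A rough sketch of the XML-in-one-column idea BOb describes (in Python as a stand-in for the .NET code; the `CustomData` column and the field names here are hypothetical, not from the original post):

```python
import xml.etree.ElementTree as ET

def to_custom_xml(fields):
    """Serialize a dict of admin-defined fields into one XML string,
    suitable for storage in a single text column (e.g. CustomData)."""
    root = ET.Element("custom")
    for name, value in fields.items():
        child = ET.SubElement(root, "field", name=name)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

def from_custom_xml(xml_text):
    """Parse the CustomData column back into a dict of field values."""
    root = ET.fromstring(xml_text)
    return {f.get("name"): f.text for f in root.findall("field")}

# Round-trip: admin-defined hotel attributes in and out of one column.
xml_text = to_custom_xml({"RoomCount": 120, "HasSpa": "yes"})
restored = from_custom_xml(xml_text)
```

The entity's generated properties stay untouched; only the one XML column varies per hotel.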

wayne avatar
wayne
User
Posts: 611
Joined: 07-Apr-2004
# Posted on: 28-Sep-2005 16:31:21   

Frans is the better person for this, but let me try.

There might be a way... but it will be difficult.

Most O/R mappers have a DQE (Dynamic Query Engine) that generates SQL from inputs like select fields, predicates, sort orders, etc. This is information it gathers either before generating your DAL, or from the structures that you, the programmer, pass into the classes at runtime (predicates, for example).

You can access the DQE of LLBLGen and should be able to create the dynamic SQL on the fly. But that means you will need to know the structure of the tables you wish to work with at runtime, so that you can create the necessary classes.

You can figure the structure out by creating views of the system tables or accessing them directly. So basically you are going to have to create the structures that the DQE uses.

The DQE (Dynamic Query Engine) is in the driver assembly, for example SD.LLBLGen.Pro.DQE.SqlServer.NET11.

The DQE takes all your fields, predicates and connections and creates a query using the syntax specific to that driver.
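Wayne's description of what a DQE does can be sketched as a toy query builder (Python stand-in; the function names and bracket quoting are illustrative, not LLBLGen's actual API):

```python
def build_select(table, fields, predicates=(), order_by=()):
    """Toy dynamic-query-engine: assemble a parameterized SELECT from
    field/predicate/sort-order structures, the way an O/R mapper's DQE
    does internally. Values go into a parameter list, never into the SQL."""
    sql = "SELECT {} FROM [{}]".format(
        ", ".join("[{}]".format(f) for f in fields), table)
    params = []
    if predicates:
        clauses = []
        for field, op, value in predicates:
            clauses.append("[{}] {} ?".format(field, op))
            params.append(value)
        sql += " WHERE " + " AND ".join(clauses)
    if order_by:
        sql += " ORDER BY " + ", ".join(
            "[{}] {}".format(f, direction) for f, direction in order_by)
    return sql, params

# Table and column names here would come from the system tables at runtime.
sql, params = build_select("HotelData", ["RoomNo", "Rate"],
                           predicates=[("Rate", ">", 100)],
                           order_by=[("RoomNo", "ASC")])
```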

paulshealy
User
Posts: 22
Joined: 13-Jun-2005
# Posted on: 28-Sep-2005 16:41:03   

pilotboba wrote:

One thing you might want to consider is storing the non-deterministic part of the schema in a single field as XML. This will allow you to store the unknown schema in a single field that you can still use with LLBLGen.

We have a similar issue in one of our apps, where we allow the customers to extend the schema to store custom data.

It's not a simple issue, but sometimes it's the best way to solve the problem.

BOb

You propose an interesting approach.

Another problem we faced with our tables is that we need the ability to target columns in the non-deterministic schema as part of a query; for example, one query is

select ... sum(ColX) ... group by some function of ColX

where ColX is some column an administrator added. So, correct me if I'm wrong, but I don't think your approach works for us. I'll remember it for the future, though.

We also considered creating a generic table (e.g., Col1, Col2, Col3, ...) and mapping the columns to columns the administrator creates, but that creates too many problems.
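For the aggregate-over-an-admin-column case above, the key issue is that the column name itself is runtime input, so it has to be quoted as an identifier rather than spliced in raw. A minimal sketch (Python; function and table names are hypothetical):

```python
def quote_ident(name):
    """Bracket-quote a runtime-supplied identifier (SQL Server style),
    doubling any closing brackets so admin input can't break out."""
    return "[" + name.replace("]", "]]") + "]"

def build_aggregate(table, group_col, agg_col):
    """Build a SUM/GROUP BY over a column that only exists at runtime."""
    return "SELECT {g}, SUM({c}) AS Total FROM {t} GROUP BY {g}".format(
        g=quote_ident(group_col), c=quote_ident(agg_col), t=quote_ident(table))

# ColX is the admin-added column; Hotel42Data a per-hotel table.
sql = build_aggregate("Hotel42Data", "Region", "ColX")
```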

jtgooding
User
Posts: 126
Joined: 26-Apr-2004
# Posted on: 28-Sep-2005 16:55:07   

Why not simply use a vertical table instead of a horizontal one?

BaseTable
BaseID - unique ID
FieldA - common table data field
FieldB - etc.

CustomTableType
FieldTypeID - unique row ID
FieldDescription - description of the type of data stored, for GUIs

CustTableData
CustID - unique row ID
BaseID - foreign key to BaseTable
FieldTypeID - foreign key to CustomTableType
CustValue - custom data value to store

Then your admins can create any number of additional fields, giving you a static DB schema with an unlimited number of extra fields associated with the base structure.

With some views you can flatten your vertical data, or with a third-party pivot XP (native in SQL 2005) you can pivot the data to normalize it. Or, optionally, you can pivot the data in code, which is what I tend to do.

John
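The in-code pivot John mentions can be sketched as follows (Python stand-in; the column names come from the vertical schema above):

```python
from collections import defaultdict

def pivot_vertical(rows):
    """Pivot vertical (BaseID, FieldDescription, CustValue) rows into one
    flat dict per BaseID - the in-memory pivot of the CustTableData rows."""
    flat = defaultdict(dict)
    for base_id, field_name, value in rows:
        flat[base_id][field_name] = value
    return dict(flat)

# Rows as they would come back from the vertical CustTableData table.
rows = [
    (1, "EyeColor", "blue"),
    (1, "Floor", "3"),
    (2, "EyeColor", "brown"),
]
flattened = pivot_vertical(rows)
```

Each flattened dict can then be bound to a grid or mapped onto business-object properties.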

erichar11
User
Posts: 268
Joined: 08-Dec-2003
# Posted on: 28-Sep-2005 17:16:41   

pmfji,

With some views you can flatten your vertical data, or with a third-party pivot XP (native in SQL 2005) you can pivot the data to normalize it. Or, optionally, you can pivot the data in code, which is what I tend to do.

John

John, I'm looking for something like you suggested for a project (i.e., pivoting a table). Could you provide a quick example or walkthrough of how you pivot your data in code? Does your code take a normal (non-vertical) table and pivot it? Do you store the results in a vertical table? Or are the pivot results stored in memory, where you just bind them to the presentation layer for display? Just looking for some insights here.

Thanks,

Eric

pilotboba
User
Posts: 434
Joined: 05-Aug-2005
# Posted on: 28-Sep-2005 19:23:33   

paulshealy wrote:

pilotboba wrote:

One thing you might want to consider is storing the non-deterministic part of the schema in a single field as XML. This will allow you to store the unknown schema in a single field that you can still use with LLBLGen.

We have a similar issue in one of our apps, where we allow the customers to extend the schema to store custom data.

It's not a simple issue, but sometimes it's the best way to solve the problem.

BOb

You propose an interesting approach.

Another problem we faced with our tables is that we need the ability to target columns in the non-deterministic schema as part of a query; for example, one query is

select ... sum(ColX) ... group by some function of ColX

where ColX is some column an administrator added. So, correct me if I'm wrong, but I don't think your approach works for us. I'll remember it for the future, though.

Well, with SQL 2005 the XML data type is a first-class citizen and supports querying the data within the XML field, so you could support this. Of course, I'm not sure how or if LLBLGen will handle this; Frans would need to add extensions to the predicates to allow querying within the XML data.

The simple solution, one we have experimented with, is bringing the full XML back and using XPath or XQuery to find the data within the XML. The XML would only contain data tied to the PK of the record.

For example, we have an employee table. The user wants to add a "field" to store eye color, so we add this to the XML schema stored in the CustomData field in the employee table. The thought was then to create some type of metadata which would store the XPath to various data that could be pulled at random.

Working with dynamic data schemas is not a simple thing. wink

BOb
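A minimal sketch of the XPath-into-CustomData idea BOb describes (Python's ElementTree as a stand-in for XPath/XQuery against the XML column; the employee payload and element names are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical CustomData payload for one employee row; in practice this
# string would come back from the CustomData column keyed by the row's PK.
custom_data = "<employee><EyeColor>green</EyeColor><HatSize>7</HatSize></employee>"

def read_custom_field(xml_text, xpath):
    """Pull one admin-defined 'field' out of the XML column via an XPath
    expression, e.g. one stored in a metadata table per field."""
    node = ET.fromstring(xml_text).find(xpath)
    return node.text if node is not None else None

eye_color = read_custom_field(custom_data, "EyeColor")
missing = read_custom_field(custom_data, "ShoeSize")
```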

jtgooding
User
Posts: 126
Joined: 26-Apr-2004
# Posted on: 28-Sep-2005 20:17:47   

Eric,

I place a business layer between my UI and the data layer (LLBLGen); my business layer is not a 1-to-1 translation of the LLBLGen entities.

So in this case, when I 'load' an entity, I fetch the base table and map it to my business-layer properties, then I fetch the associated custom table data and expose it through properties, giving me a flattened view of the two tables.

Then my Save method on my business object saves my base table and any modified entries for the custom data.

This allows my database to stay consistent while still providing custom data associated with a table. Since I have to modify the UI layer to show the new field anyway (unless I just pass all the custom properties as a collection to a grid for editing, the quick way out of maintaining a form), the ten or so lines of code it takes to implement a new property in the business layer are minimal.

Even when taking the quick way out and passing the collection of additional properties through, I have zero lines of code to rewrite, since new properties are table-driven. In that case I have a third table which associates field types with base objects, so that an individual field type can be used by multiple 'hotels' but doesn't show up for a hotel it isn't associated with.

Hope that helps. It's kind of a busy day around here; if I'm not clear I can maybe send you a better example.

John
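John's flattened business object can be sketched roughly like this (Python stand-in for the .NET business layer; the class and member names are hypothetical):

```python
class HotelRecord:
    """Business object that flattens a base-table row (from the generated
    entity) and its pivoted custom rows behind one surface, tracking which
    custom entries changed so Save() writes only the modified ones."""

    def __init__(self, base_fields, custom_rows):
        self._base = dict(base_fields)    # mapped from the LLBLGen entity
        self._custom = dict(custom_rows)  # pivoted CustTableData values
        self._dirty = set()

    def get(self, name):
        """Read a field without caring whether it is base or custom."""
        if name in self._base:
            return self._base[name]
        return self._custom.get(name)

    def set_custom(self, name, value):
        self._custom[name] = value
        self._dirty.add(name)

    def pending_saves(self):
        """The custom entries a Save() call would persist."""
        return {name: self._custom[name] for name in self._dirty}

rec = HotelRecord({"Name": "Seaside"}, {"Stars": "4", "HasSpa": "yes"})
rec.set_custom("Stars", "5")
```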

paulshealy
User
Posts: 22
Joined: 13-Jun-2005
# Posted on: 28-Sep-2005 22:27:32   

jtgooding wrote:

Why not simply use a vertical table instead of a horizontal one?

BaseTable
BaseID - unique ID
FieldA - common table data field
FieldB - etc.

CustomTableType
FieldTypeID - unique row ID
FieldDescription - description of the type of data stored, for GUIs

CustTableData
CustID - unique row ID
BaseID - foreign key to BaseTable
FieldTypeID - foreign key to CustomTableType
CustValue - custom data value to store

Then your admins can create any number of additional fields, giving you a static DB schema with an unlimited number of extra fields associated with the base structure.

With some views you can flatten your vertical data, or with a third-party pivot XP (native in SQL 2005) you can pivot the data to normalize it. Or, optionally, you can pivot the data in code, which is what I tend to do.

John

John,

Thanks for the advice - I haven't used vertical tables before. I can see pivoting the data in code as a nice approach. I do have a question about performance though. Each of our clients has between 200 and 400 fields to be defined; we want to store data for each client for several years for each field; and we expect to have several hundred clients in the future. This gives us on the order of 150 - 250 million rows. Have you taken this approach with this much data, and if so, how did the system perform?

jtgooding
User
Posts: 126
Joined: 26-Apr-2004
# Posted on: 28-Sep-2005 23:04:06   

I've run it up to about 100 million rows and never had an issue. The data is small and the index is also small, so it's fairly efficient. I'm not sure what your hardware is, etc., so I would recommend testing.

We run 8 proc 8 gig with 10TB of fiber SAN Storage clustered, but I have used this same model on much lesser hardware as well.

Shouldn't be too difficult to create a small app to populate that much data into a few tables and test performance. I would offer to do it and let you know, but unfortunately my boss would scream at me given my current schedule, and there's the whole space issue: we are currently waiting on a SQL Server SAN upgrade to boost us up to 12TB before I would have the space to populate that many rows just for test data.

John