In our application we're considering using dynamically generated classes to hold a lot of our data. The reason for doing this is that we have customers with tables that have different structures. So you could have a customer table called "DOG" (just making this up) that contains the columns "DOGID", "DOGNAME", "DOGTYPE", etc. Customer #2 could have the same table "DOG" with the columns "DOGID", "DOG_FIRST_NAME", "DOG_LAST_NAME", "DOG_BREED", and so on. We can't create classes for these at compile time as the customer can change the table schema at any time.
At the moment I have code that can generate a "DOG" class at run-time using reflection. What I'm trying to figure out is how to populate this class from a DataTable (or some other .NET mechanism) without extreme performance penalties. We have one table that contains ~20 columns and ~50k rows. Doing a foreach over all of the rows and columns to create the collection takes about 1 minute, which is a little too long.
Am I trying to come up with a solution that's too complex, or am I on the right track? Has anyone else experienced a problem like this? Creating dynamic classes was the solution that a developer at Microsoft proposed. If we can just populate this collection and use it efficiently, I think it could work.
What you're trying to do will get fairly complicated in .NET 3.5. You're probably better off creating a type with a Dictionary<> of key/value pairs for the data in your model.
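A minimal sketch of such a dictionary-backed type (the `DynamicRecord` name and members are illustrative, not from the question):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical wrapper: one instance per row, column values keyed by column name.
public class DynamicRecord
{
    // Case-insensitive keys, since database column names usually are.
    private readonly Dictionary<string, object> _values =
        new Dictionary<string, object>(StringComparer.OrdinalIgnoreCase);

    public object this[string column]
    {
        get { return _values[column]; }
        set { _values[column] = value; }
    }

    public bool TryGet(string column, out object value)
    {
        return _values.TryGetValue(column, out value);
    }
}
```

Usage would be `record["DOGNAME"] = "Rex";` — no per-customer compiled type needed, at the cost of losing static typing on the columns.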
If you can use .NET 4.0, you could look at using dynamic and ExpandoObject. The dynamic keyword lets you create references to truly dynamic objects, and ExpandoObject allows you to easily add properties and methods on the fly — all without complicated reflection or codegen logic. The benefit of dynamic types is that the DLR performs some sophisticated caching of the runtime binding information, which allows them to be more performant than regular reflection.
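For example, ExpandoObject implements `IDictionary<string, object>`, so properties can be attached by column name at run time and then read back through the DLR (the "DOG" column names below are just the question's example):

```csharp
using System;
using System.Collections.Generic;
using System.Dynamic;

class Program
{
    static void Main()
    {
        // Column names/values as they might come from one customer's "DOG" table.
        var columns = new Dictionary<string, object>
        {
            { "DOGID", 1 }, { "DOGNAME", "Rex" }, { "DOGTYPE", "Terrier" }
        };

        // Attach each column as a property via the dictionary interface...
        var dog = new ExpandoObject() as IDictionary<string, object>;
        foreach (var kv in columns)
            dog[kv.Key] = kv.Value;

        // ...then read them back as ordinary-looking members via dynamic.
        dynamic d = dog;
        Console.WriteLine("{0}: {1}", d.DOGID, d.DOGNAME);  // prints "1: Rex"
    }
}
```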
If all you need to do is map the data to existing types, then you should consider using something like Entity Framework or NHibernate to provide ORM behavior for your types.
To deal with the data loading performance, I would suggest you consider bypassing the DataTable altogether and loading your records directly into the types you are generating. I suspect that most of the performance issues are in reading and casting the values from the DataTable where they originally reside.
You should use a profiler to figure out what exactly is taking the time, and then optimize that. If you are using reflection to set the data, it will be slow. Much can be saved by caching the reflected type data.
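One way to cache the reflected metadata is to compile each property setter into a delegate once and reuse it for every row, instead of calling `PropertyInfo.SetValue` 50k times. A sketch using expression trees (`Expression.Assign` requires .NET 4.0; the class and property names in the test are hypothetical):

```csharp
using System;
using System.Collections.Generic;
using System.Linq.Expressions;
using System.Reflection;

public class SetterCache
{
    // Compiled setters, built once per type and reused for every row.
    private readonly Dictionary<string, Action<object, object>> _setters =
        new Dictionary<string, Action<object, object>>(StringComparer.OrdinalIgnoreCase);

    public SetterCache(Type type)
    {
        foreach (PropertyInfo p in type.GetProperties())
        {
            if (!p.CanWrite) continue;

            // Builds: (object target, object value) => ((T)target).Prop = (TProp)value
            var target = Expression.Parameter(typeof(object));
            var value = Expression.Parameter(typeof(object));
            var body = Expression.Assign(
                Expression.Property(Expression.Convert(target, type), p),
                Expression.Convert(value, p.PropertyType));
            _setters[p.Name] =
                Expression.Lambda<Action<object, object>>(body, target, value).Compile();
        }
    }

    public void Set(object instance, string property, object value)
    {
        _setters[property](instance, value);
    }
}
```

The compile step is paid once per type; after that each assignment is close to the cost of a plain delegate call, which is what makes the per-row loop cheap.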
I am curious: how will you use these classes if the members are not known at compile time? Perhaps you would be better off with just a plain DataTable for this scenario?
Check out PicoPoco and Massive. These are micro-ORMs (lightweight, open-source ORM frameworks) that you use by simply copying a single code file into your project.
Massive uses the ExpandoObject and does conversions from IDataRecord to the ExpandoObject, which I think is exactly what you want.
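The idea behind that conversion can be sketched in a few lines — this is not Massive's actual code, just the same shape: copy each field of the current record into an ExpandoObject by column name (tested here against a DataTableReader, which implements IDataRecord):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Dynamic;

public static class RecordConverter
{
    // Turns the current row of any IDataRecord (a DataReader positioned on a
    // row, for example) into a dynamic object with one property per column.
    public static dynamic ToExpando(IDataRecord record)
    {
        var expando = new ExpandoObject() as IDictionary<string, object>;
        for (int i = 0; i < record.FieldCount; i++)
            expando[record.GetName(i)] = record.IsDBNull(i) ? null : record.GetValue(i);
        return expando;
    }
}
```

Because the properties are created from the column names at run time, the same helper works no matter how each customer has shaped their "DOG" table.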
PicoPoco takes an existing plain class and dynamically and efficiently generates a method (which it then caches, by the way) to load it from a database.