I am importing large Excel files into a SQL database through a web application; each row is converted into a business object, with each column mapped to a property of that object.
Since the file is a CSV, I am performing a lot of validation in code, such as checking property values against database values, if statements, case switches, etc., and this causes a large CPU load while the function is processing.
What would be a better way of processing this data?
Consider uploading the data to staging tables and performing the validation using set-based logic rather than row-by-row logic. This avoids parsing the rows into objects at all, shifts the CPU overhead to the database server (useful if your application server is underpowered), and reduces the total CPU work required.
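A minimal sketch of the staging-table approach, using SQLite for portability (the table and column names `staging_orders`, `products`, and `orders` are hypothetical; your real schema and bulk-load mechanism will differ):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE products (sku TEXT PRIMARY KEY);          -- reference data
    CREATE TABLE orders (sku TEXT, qty INTEGER);           -- final target table
    -- Staging table: raw text columns plus an error column set by validation.
    CREATE TABLE staging_orders (sku TEXT, qty TEXT, error TEXT);
""")
conn.executemany("INSERT INTO products VALUES (?)", [("A1",), ("B2",)])

# Step 1: bulk-load the CSV into staging with no per-row validation.
csv_data = io.StringIO("sku,qty\nA1,5\nXX,2\nB2,notanumber\n")
rows = list(csv.reader(csv_data))[1:]  # skip header row
conn.executemany("INSERT INTO staging_orders (sku, qty) VALUES (?, ?)", rows)

# Step 2: validate whole columns with set-based statements instead of
# per-row if/switch logic in application code.
conn.execute("""
    UPDATE staging_orders SET error = 'unknown sku'
    WHERE sku NOT IN (SELECT sku FROM products)
""")
conn.execute("""
    UPDATE staging_orders SET error = 'qty not numeric'
    WHERE error IS NULL AND qty GLOB '*[^0-9]*'
""")

# Step 3: move only the clean rows into the target table; rows with a
# non-NULL error stay behind in staging for reporting or correction.
conn.execute("""
    INSERT INTO orders (sku, qty)
    SELECT sku, CAST(qty AS INTEGER) FROM staging_orders
    WHERE error IS NULL
""")
```

With a production database you would replace the `executemany` load with a bulk mechanism (e.g. `BULK INSERT` or `bcp` on SQL Server, `COPY` on PostgreSQL) so the file never passes through object-mapping code at all.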