We are designing a database (SQL Server 2005) to store measurement data from our instruments. Every second, each instrument returns 5 different values (all floats), such as max, min, avg, etc. When designing the database, is it better to store all these values in a single table (each row containing a timestamp, type and value), or to store them in separate tables (i.e. avgtable, maxtable, mintable) with each row containing a timestamp and value? We will be storing data from up to 100 instruments, and they will be running for months at a time, so the data will grow quite large.
Does one design offer improved performance over the other?
Thanks
It may not be a good idea to store every measurement. Most (though not all) time-series data is highly repetitive, and it is often sufficient to record only changes in the measurement, with a start time and end time, or to adopt other methods of compression and encoding.
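To illustrate the change-only idea, here is a minimal sketch of what such a table and its update logic might look like; all table, column and variable names here are illustrative assumptions, not part of your schema:

    -- Hypothetical change-only storage: one row per run of identical readings.
    CREATE TABLE MeasurementRuns (
        RunId        BIGINT IDENTITY(1,1) PRIMARY KEY,
        InstrumentId INT      NOT NULL,
        ValueType    TINYINT  NOT NULL,  -- e.g. 1 = min, 2 = max, 3 = avg
        Value        REAL     NOT NULL,
        StartTime    DATETIME NOT NULL,  -- first second the value was observed
        EndTime      DATETIME NOT NULL   -- last second the value held
    );

    -- On each new reading: if it matches the current run's value,
    -- just push EndTime forward instead of inserting a new row.
    UPDATE MeasurementRuns
    SET    EndTime = @ReadingTime
    WHERE  RunId = @CurrentRunId AND Value = @NewValue;
    -- If @@ROWCOUNT = 0, the value changed: insert a new run row instead.

For highly repetitive signals this can shrink the row count by orders of magnitude compared with one row per second.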
Take a look at Process Historian and Complex Event Processing (CEP) systems to understand what different systems are in use and what techniques are available to you. There are lots of tools and technologies used to support your type of scenario. OSISoft, StreamBase and Oracle CEP are some of the software packages available. Since you are a Microsoft customer you might also be interested in Microsoft's CEP offering for SQL Server: StreamInsight.
You're better off with all the values in 1 table; otherwise you'd have to join the tables together to look at a single measurement, and you'd repeat the datetime (and possibly other fields), which would also lead to a much larger database. Joining tables is the expensive part. I'd also recommend a primary key field such as a bigint IDENTITY(1,1) so you can reference records by ID for faster searching too.
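A minimal sketch of the single-table design described above, with an index to support typical time-range queries; the names are illustrative assumptions only:

    -- One row per instrument, per second, per value type.
    CREATE TABLE Measurements (
        MeasurementId BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        InstrumentId  INT      NOT NULL,
        ReadingTime   DATETIME NOT NULL,
        ValueType     TINYINT  NOT NULL,  -- 1 = min, 2 = max, 3 = avg, ...
        Value         REAL     NOT NULL
    );

    -- Supports queries like "avg values for instrument 7 over the last hour":
    CREATE INDEX IX_Measurements_Lookup
        ON Measurements (InstrumentId, ValueType, ReadingTime);

With 100 instruments at 5 values per second this is roughly 43 million rows per day, so whichever design you pick, plan the indexes (and possibly archiving) around your query patterns.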