
Storing a large number of sensor data records

I need to create a database that saves sensor data that will be queried to generate reports later on (display a graph and AVG/MAX/MIN values for a given timeframe).

The data points look like this:

CREATE TABLE [dbo].[Table_1](
    [time] [datetime] NOT NULL,
    [sensor] [int] NOT NULL,
    [value] [decimal](18, 0) NULL
)

Data can be added in intervals ranging from seconds to minutes (depending on the sensor).

Should I worry about my database growing too big when several years of data accumulate? (The DB will run on MS SQL Server 2008 Workgroup edition.)


There are specialized historian databases, such as OSISoft's PI Historian, that handle this type of data much better than a relational database. With PI you can configure a compression deviation for each data point, so that a value is not archived unless it changes by at least that deviation. When you query the historical data for a given point, you can ask PI to interpolate what the value would have been at the specified time, even if that time falls between archived values.

It's capable of a whole lot more, but you will have to explore that on your own because I don't intend to become an OSISoft salesman. However, this is definitely the way you want to go for storing large quantities of sensor data.
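To make the compression-deviation idea concrete in plain T-SQL, here is a minimal sketch of deadband-style filtering against the table from the question. This is not PI's actual algorithm; the procedure name and the @deadband parameter are made up for illustration, and it assumes all inserts go through this one procedure.

-- Hypothetical sketch: only store a reading if it differs from the last
-- archived value for that sensor by more than a per-call deadband.
CREATE PROCEDURE dbo.InsertReadingWithDeadband
    @sensor int,
    @time datetime,
    @value decimal(18, 0),
    @deadband decimal(18, 0)
AS
BEGIN
    DECLARE @last decimal(18, 0);

    -- Latest archived value for this sensor
    SELECT TOP (1) @last = [value]
    FROM dbo.Table_1
    WHERE [sensor] = @sensor
    ORDER BY [time] DESC;

    -- Store only if there is no previous value or the change exceeds the deadband
    IF @last IS NULL OR ABS(@value - @last) > @deadband
        INSERT INTO dbo.Table_1 ([time], [sensor], [value])
        VALUES (@time, @sensor, @value);
END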


It all depends on what resources and effort you want to expend on it. At 1 row per second that table would still only amount to around 1GB per sensor per year (roughly 30 bytes per row over ~31.5 million rows), which is very small. If you have thousands of sensors then you might want to consider creating summary tables to help with the reporting and analysis of the data.
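A minimal sketch of such a summary table, assuming an hourly roll-up (the table and column names are illustrative; the average column gets extra scale so it is not truncated):

CREATE TABLE dbo.SensorHourlySummary (
    [hour]    datetime       NOT NULL,  -- start of the hour
    [sensor]  int            NOT NULL,
    [avg_val] decimal(18, 4) NULL,
    [min_val] decimal(18, 0) NULL,
    [max_val] decimal(18, 0) NULL,
    [samples] int            NOT NULL,
    PRIMARY KEY ([sensor], [hour])
);

-- Roll up one hour of raw readings into the summary table
INSERT INTO dbo.SensorHourlySummary ([hour], [sensor], [avg_val], [min_val], [max_val], [samples])
SELECT DATEADD(hour, DATEDIFF(hour, 0, [time]), 0),   -- truncate to the hour
       [sensor],
       AVG([value]), MIN([value]), MAX([value]), COUNT(*)
FROM dbo.Table_1
WHERE [time] >= '20230101 00:00' AND [time] < '20230101 01:00'
GROUP BY DATEADD(hour, DATEDIFF(hour, 0, [time]), 0), [sensor];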

Sensor data like this is often very repetitive. There are more convenient ways to store repeated values - for example by storing one row with a range of times rather than multiple rows with different times.
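As a sketch of that layout, a run of identical readings could be collapsed into a single row with a validity interval (the names are illustrative, and the application would have to extend valid_to as new identical readings arrive):

CREATE TABLE dbo.SensorValueRange (
    [sensor]     int            NOT NULL,
    [value]      decimal(18, 0) NULL,
    [valid_from] datetime       NOT NULL,
    [valid_to]   datetime       NOT NULL,  -- end of the run of identical readings
    PRIMARY KEY ([sensor], [valid_from])
);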

There are many software packages that can help with storing and managing this kind of time series data. There is also a significant body of research and literature on the subject, which might help you. If you aren't already familiar with it then Google for terms like "Process Historian", "Complex Event Processing" and "SCADA".


It depends on how you're going to use the data, what indexes you add on top of the table, how many sensors there are, etc.

That table, as shown, could store 150 million rows (~ 1 sensor x 1 recording per second x 5 years) in ~6GB of space (assuming a heap). The file size limit is 16 terabytes, and I'm not aware of any restrictions on this for Workgroup edition.
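For the AVG/MAX/MIN-over-a-timeframe reports the question describes, a clustered index on (sensor, time) is a natural starting point. A hedged sketch follows (it departs from the heap assumption above, and the index name and literal values are only illustrative):

CREATE CLUSTERED INDEX IX_Table_1_Sensor_Time
    ON dbo.Table_1 ([sensor], [time]);

-- Aggregate one sensor over a given timeframe
SELECT AVG([value]) AS avg_value,
       MIN([value]) AS min_value,
       MAX([value]) AS max_value
FROM dbo.Table_1
WHERE [sensor] = 42
  AND [time] >= '20230101' AND [time] < '20230201';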


If you are worried about the database growing too big, I would suggest creating an Archive_Table with the same structure and moving data into it at some interval, such as once a month or every 6 months (entirely based on the volume of data).

This keeps the number of records in your main table in check, and of course the archive table is still available for report generation when you need it.
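A minimal sketch of that kind of archive job, assuming Archive_Table has the same columns as Table_1 and treating the six-month cutoff as just an example:

-- Copy old rows into the archive table, then remove them from the live table
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(month, -6, GETDATE());

BEGIN TRANSACTION;

INSERT INTO dbo.Archive_Table ([time], [sensor], [value])
SELECT [time], [sensor], [value]
FROM dbo.Table_1
WHERE [time] < @cutoff;

DELETE FROM dbo.Table_1
WHERE [time] < @cutoff;

COMMIT TRANSACTION;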
