How to get at the constituent rows of aggregate/group-by query?

devze.com · 2023-02-15 20:21 · Source: web

I'm trying to implement an interface to flip the boolean status of line items.

I'm stuck on the right way to update the underlying rows when working with a rowset that is an aggregate.

Table

declare @item table(
    id int not null primary key,
    amount money not null,
    is_paid bit not null,
    client varchar(10) not null)
insert into @item values
    (1, 9.50, 0, 'Client A'), 
    (2, 11.50, 0, 'Client A'),
    (3, 20.00, 1, 'Client B')

Query

select sum(amount) as total_amount, is_paid, client
from @item
group by is_paid, client

Result

total_amount | is_paid | client
21.00        | 0       | Client A
20.00        | 1       | Client B

Scenario

Now say the above results were shown in a grid, with a "Pay" button on each row where is_paid = 0.

Each grid row therefore maps to the IDs of one or more underlying rows that must be updated when "Pay" is clicked.

My first thought was to update like this:

update @item
set is_paid=1
where client='Client A'

But that falls apart (correct me if I'm wrong) the minute an additional row for "Client A" is inserted between the time the interface is displayed and the time the user presses "Pay".
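The race is easy to reproduce. Here is a minimal sketch in Python with SQLite (my illustration, not the question's T-SQL; the schema mirrors the @item table above) showing that a row inserted after the grid was rendered gets swept up by a client-wide UPDATE:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""create table item (
    id integer primary key,
    amount real not null,
    is_paid integer not null,
    client text not null)""")
conn.executemany("insert into item values (?, ?, ?, ?)", [
    (1, 9.50, 0, "Client A"),
    (2, 11.50, 0, "Client A"),
    (3, 20.00, 1, "Client B"),
])

# The grid is rendered from this aggregate...
grid = conn.execute("""select sum(amount), is_paid, client
                       from item group by is_paid, client""").fetchall()

# ...then a new unpaid Client A item arrives before the user clicks "Pay".
conn.execute("insert into item values (4, 5.00, 0, 'Client A')")

# The naive update also pays item 4, which was never shown in the grid.
conn.execute("update item set is_paid = 1 where client = 'Client A'")
paid = [r[0] for r in conn.execute(
    "select id from item where is_paid = 1 order by id")]
print(paid)  # [1, 2, 3, 4] -- item 4 was paid without ever being displayed
```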

QUESTION: Since this seems to be a rather simple scenario, what is the typical way to handle it? The only thing I can think of so far is to move the aggregation/grouping to application logic.
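One common pattern that avoids moving the grouping into application logic is to capture the constituent IDs alongside each aggregate row at display time, then update by that explicit ID list. A sqlite3 sketch (my illustration; SQLite's built-in group_concat plays the role a custom aggregate would in SQL Server):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""create table item (
    id integer primary key,
    amount real not null,
    is_paid integer not null,
    client text not null)""")
conn.executemany("insert into item values (?, ?, ?, ?)", [
    (1, 9.50, 0, "Client A"),
    (2, 11.50, 0, "Client A"),
    (3, 20.00, 1, "Client B"),
])

# Each grid row carries the IDs it was aggregated from.
grid = conn.execute("""select group_concat(id), sum(amount), is_paid, client
                       from item group by is_paid, client
                       order by client, is_paid""").fetchall()
ids_unpaid_a = [int(i) for i in grid[0][0].split(",")]  # Client A, is_paid = 0

# A row inserted after rendering is untouched by the ID-list update.
conn.execute("insert into item values (4, 5.00, 0, 'Client A')")
qmarks = ",".join("?" * len(ids_unpaid_a))
conn.execute(f"update item set is_paid = 1 where id in ({qmarks})", ids_unpaid_a)

paid = [r[0] for r in conn.execute(
    "select id from item where is_paid = 1 order by id")]
print(paid)  # [1, 2, 3] -- the late-arriving item 4 stays unpaid
```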


You could create a temporary table to store the client_id and item_id, then select the aggregate by joining that table with your item table. That way, when the user pays, you update only the records in the item table that have a corresponding record in the temp table, e.g.:

-- Assuming id in @item has been renamed to item_id, and client_id/client_name have been added to @item

declare @active table(
    client_id int not null,
    item_id int not null
);
insert into @active select client_id, item_id from @item;

select sum(i.amount) as total_amount, i.is_paid, i.client_name
from @item i
inner join @active a
    on i.item_id = a.item_id and i.client_id = a.client_id
group by i.is_paid, i.client_id, i.client_name
order by i.client_name

update i
set is_paid = 1
from @item i
inner join @active a
    on i.client_id = a.client_id and i.item_id = a.item_id
where i.client_name = 'Client A'

Alternatively, you could add a create_time column to @item. That way, you can update only rows created before a specific time, e.g.:

select sum(amount) as total_amount, is_paid, client, max(create_time) as last_time
from @item
group by is_paid, client

-- @last_time holds the last_time value captured from the grid row at display time
update @item
set is_paid = 1
where client = 'Client A' and create_time <= @last_time


You are worried that between the time you read the data and the time you update it, the data might change?

You will need to wrap the read and the update in a transaction. Note that REPEATABLE READ only locks the rows you have already read; to block new "Client A" rows from appearing in between (phantom reads), you need the SERIALIZABLE isolation level.

http://msdn.microsoft.com/en-us/library/aa259216(v=sql.80).aspx


Your schema is wrong. You need a separate Client table that stores a client ID and the client name, plus a client_id column in your item table. You can then update precisely:

update i
set is_paid = 1
from @item i
where i.client_id = (select c.client_id
                     from Client c
                     where c.client_name = 'Client A')
  and i.id in (1, 2) -- the list of IDs checked as paid in the grid


I rolled my own answer for this one, and I don't see a downside to it; I'll welcome anyone pointing one out.

What I like is how I know exactly which rows to update.

declare @MyItem table(
    id int not null primary key,
    amount money not null,
    is_paid bit not null,
    client varchar(10) not null)
insert into @MyItem values
    (1, 9.50, 0, 'Client A'), 
    (2, 11.50, 0, 'Client A'),
    (3, 20.00, 1, 'Client B')
select dbo.SumKeys(id) as [IDs], sum(amount) as total_amount, is_paid, client
from @MyItem
group by is_paid, client

IDs | total_amount | is_paid | client
1,2 | 21.00        | 0       | Client A
3   | 20.00        | 1       | Client B

What I don't like is that it took me more than half a day to get this code working, because I was fighting the (to me) oddities of programming against the SQL Server hosted CLR.

Anyway I made an aggregate that puts a comma separated list of IDs right into my query.

using System;
using System.Collections.Generic;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(
    Format.UserDefined,
    IsInvariantToDuplicates = true,
    IsInvariantToNulls = true,
    MaxByteSize = -1
    )]
public struct SumKeys : IBinarySerialize
{
    private readonly static char sep = ',';
    private SqlString result;
    public void Init()
    {
        result = string.Empty;
    }
    public void Accumulate(SqlInt32 value)
    {
        if (!value.IsNull && !Contains(value))
            this.Add(value);
    }
    private void Add(SqlInt32 value)
    {
        this.result += Wrap(value);
    }
    private void Add(string value)
    {
        Add(Convert.ToInt32(value));
    }
    private static string Wrap(SqlInt32 value)
    {
        return value.Value.ToString() + sep;
    }
    private bool Contains(SqlInt32 value)
    {
        // Prepend the separator so a substring match cannot cross a value
        // boundary (e.g. "1," must not match inside "21,").
        return (sep + this.result.Value).Contains(sep + value.Value.ToString() + sep);
    }
    public void Merge(SumKeys group)
    {
        foreach (var value in Items(group))
            if (!this.Contains(value))
                this.Add(value);
    }
    private static IEnumerable<SqlInt32> Items(SumKeys group)
    {
        foreach (var value in group.result.Value.Split(sep))
        {
            int i;
            if (Int32.TryParse(value, out i))
                yield return i;
        }
    }
    public SqlString Terminate()
    {
        return this.result.Value.TrimEnd(sep);
    }
    public void Read(System.IO.BinaryReader r)
    {
        this.result = r.ReadString();
    }
    public void Write(System.IO.BinaryWriter w)
    {
        w.Write(this.result.Value.TrimEnd(sep));
    }
}
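Where the database already ships a string-concatenation aggregate (group_concat in SQLite/MySQL, STRING_AGG in SQL Server 2017+), the same "IDs in the aggregate row" result needs no custom CLR code. A sqlite3 sketch of the equivalent query (my illustration, reusing the question's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""create table item (
    id integer primary key,
    amount real not null,
    is_paid integer not null,
    client text not null)""")
conn.executemany("insert into item values (?, ?, ?, ?)", [
    (1, 9.50, 0, "Client A"),
    (2, 11.50, 0, "Client A"),
    (3, 20.00, 1, "Client B"),
])

# group_concat(id) does what SumKeys does above: a comma-separated
# list of the constituent row IDs, right in the aggregate row.
rows = conn.execute("""select group_concat(id), sum(amount), is_paid, client
                       from item group by is_paid, client
                       order by client""").fetchall()
print(rows)  # [('1,2', 21.0, 0, 'Client A'), ('3', 20.0, 1, 'Client B')]
```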
