How to handle multiple storage backends transparently

I'm working with an application right now that uses a third-party API for handling some batch email-related tasks, and in order for that to work, we need to store some information in this service. Unfortunately, this information (first/last name, email address) is also something we want to use from our application. My normal inclination is to pick one canonical data source and stick with it, but round-tripping to a web service every time I want to look up these fields isn't really a viable option (we use some of them quite a bit), and the service's API requires the records to be stored there, so the duplication is sadly necessary.

But I have no interest in peppering every method throughout our business classes with code to synchronize data to the web service any time these fields might be updated, and I also don't think my entity should have to be aware of the service in order to update it from a property setter (or whatever else is updating the "truth").

We use NHibernate for all of our DAL needs, and to my mind this data replication is really a persistence issue, so I've whipped up a PoC implementation using an EventListener (both PostInsert and PostUpdate) that checks whether the entity is of type X and whether any of the fields [Y..Z] have changed, and if so updates the web service with the new state.
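
For illustration, here is a minimal sketch of what such a listener could look like. The entity (EmailContact), the service wrapper (IBatchEmailService.Upsert), and the property names are hypothetical placeholders, not anything from the actual application; newer NHibernate versions also require implementing the async counterparts of these listener interfaces, which are omitted here.

    using System;
    using System.Linq;
    using NHibernate.Event;

    // Hypothetical entity standing in for whatever holds the shared contact fields.
    public class EmailContact
    {
        public virtual string FirstName { get; set; }
        public virtual string LastName { get; set; }
        public virtual string EmailAddress { get; set; }
    }

    // Hypothetical wrapper around the third-party batch-email API.
    public interface IBatchEmailService
    {
        void Upsert(EmailContact contact);
    }

    public class EmailSyncListener : IPostInsertEventListener, IPostUpdateEventListener
    {
        // Only changes to these mapped properties trigger a push to the service.
        private static readonly string[] SyncedProperties = { "FirstName", "LastName", "EmailAddress" };

        private readonly IBatchEmailService _service;

        public EmailSyncListener(IBatchEmailService service)
        {
            _service = service;
        }

        public void OnPostInsert(PostInsertEvent @event)
        {
            if (@event.Entity is EmailContact contact)
                _service.Upsert(contact); // new record: always push
        }

        public void OnPostUpdate(PostUpdateEvent @event)
        {
            if (!(@event.Entity is EmailContact contact))
                return;

            // Compare the new snapshot against the old one, but only for the
            // properties we care about; any other change is ignored.
            var names = @event.Persister.PropertyNames;
            bool changed = SyncedProperties
                .Select(p => Array.IndexOf(names, p))
                .Where(i => i >= 0)
                .Any(i => !Equals(@event.State[i], @event.OldState?[i]));

            if (changed)
                _service.Upsert(contact);
        }
    }

The listener would then be registered on the NHibernate Configuration before the session factory is built (through the EventListeners collections or SetListener), so that it sees every flush without any of the business classes knowing about it.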

I feel like this strikes a good balance between keeping our data as the canonical source, replicating it transparently, and minimizing the chances of a change falling through the cracks and leaving the two stores mismatched (not the end of the world if e.g. the service is unreachable, since we can just do a manual batch update later, but for everybody's sanity the goal is that in the general case we never have to think about it). Still, my colleagues and I have a degree of discomfort with this way forward.

Is this a horrid idea that will invite raptors into my database at inopportune times? Is it a totally reasonable thing to do with an EventListener? Is it a serviceable solution to a less-than-ideal situation that we can just make do with and move on forever tainted? If we soldier on down this road, are there any gotchas I should be wary of in the Events pipeline?


In the case of an unreliable data store (the web service, in your case), I would introduce the concept of transactions (operations): store them in the local database, then periodically pull them from the DB and execute them against the web service (the other data store).

Something like this:

    public enum Operation { Insert, Update, Delete }

    public class OperationContainer
    {
        public Operation Operation; // whatever operations you need: CRUD, or something more specific
        public object Data;         // your entity, business object, or whatever
    }

    public class MyMailService
    {
        public void SendMail(MailBusinessObject data)
        {
            DataAcceessLair<MailBusinessObject>.Persist(data);
            var operation = new OperationContainer { Operation = Operation.Insert, Data = data };
            DataAcceessLair<OperationContainer>.Persist(operation);
        }
    }

    public class Updater
    {
        Timer everySec;

        public void OnEverySec()
        {
            var operation = DataAcceessLair<OperationContainer>.GetFirstIn(); // FIFO
            var webServiceData = WebServiceData.Convert(operation); // do the logic to prepare data for the web service
            try
            {
                new WebService().DoSomething(webServiceData);
                // Only remove the operation once the web service call has succeeded.
                DataAcceessLair<OperationContainer>.Remove(operation);
            }
            catch (Exception)
            {
                // Call failed; the operation stays in the queue and is retried on the next tick.
            }
        }
    }

This is actually pretty close to the concept of a smart client (technically, not logically). Take a look at the book .NET Domain-Driven Design with C#: Problem-Design-Solution, chapter 10, or at the source code from the book, which is pretty close to your situation: http://dddpds.codeplex.com/
