Save data through a web service using NHibernate?

We currently have an application that retrieves data from the server through a web service and populates a DataSet. Then the users of the API manipulate it through the objects which in turn change the dataset. The changes are then serialized, compressed and sent back to the server to get updated.
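
To make that flow concrete, here is a minimal sketch of the round trip described above. The two service methods are hypothetical stand-ins for the real web service proxy; GetChanges() and AcceptChanges() are the standard DataSet change-tracking APIs.

    using System.Data;

    public class BatchClient
    {
        private DataSet batchData = new DataSet();

        public void GetBatch()
        {
            // Populate the internal DataSet from the server.
            batchData = FetchFromService();
        }

        public void SaveBatch()
        {
            // GetChanges() extracts only the added/modified/deleted rows,
            // so only the delta has to be serialized, compressed and sent.
            DataSet changes = batchData.GetChanges();
            if (changes == null)
                return; // nothing to save

            SendToService(changes);
            batchData.AcceptChanges(); // mark local data as clean again
        }

        // Hypothetical wrappers around the web service calls.
        private DataSet FetchFromService() { return new DataSet(); }
        private void SendToService(DataSet changes) { }
    }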

However, I have begun using NHibernate in some projects and I really like the disconnected nature of its POCO objects. The problem we have now is that our objects are so tied to the internal DataSet that they cannot be used in many situations, and we end up creating duplicate POCO objects to pass back and forth.

Batch.GetBatch() -> calls the web server and populates an internal dataset
Batch.SaveBatch() -> sends changes from the dataset to the web server 

Is there a way to achieve a model similar to the one we are using, in which all database access occurs through a web service, but using NHibernate?

Edit 1

I have a partial solution that works and persists through a web service, but it has two problems.

  1. I have to serialize and send the whole collection, not just the changed items.
  2. If I repopulate the collection from the returned objects, any references I held to the original objects are lost.

Here is my example solution.

Client Side

public IList<Job> GetAll()
{
    return coreWebService
      .GetJobs()
      .BinaryDeserialize<IList<Job>>();
}

public IList<Job> Save(IList<Job> jobs)
{
    return coreWebService
             .Save(jobs.BinarySerialize())
             .BinaryDeserialize<IList<Job>>();
}

Server Side

[WebMethod]
public byte[] GetJobs()
{
    using (ISession session = NHibernateHelper.OpenSession())
    {
        return (from j in session.Linq<Job>()
                select j).ToList().BinarySerialize();
    }
}

[WebMethod]
public byte[] Save(byte[] jobBytes)
{
    var jobs = jobBytes.BinaryDeserialize<IList<Job>>();

    using (ISession session = NHibernateHelper.OpenSession())
    using (ITransaction transaction = session.BeginTransaction())
    {
        foreach (var job in jobs)
        {
            session.SaveOrUpdate(job);
        }
        transaction.Commit();
    }

    return jobs.BinarySerialize();
}

As you can see, I am sending the whole collection to the server each time and then returning the whole collection. But I'm getting a replaced collection instead of a merged/updated collection, not to mention that it seems highly inefficient to send all the data back and forth when only part of it may have changed.

Edit 2

I have seen several references on the web to an almost transparent persistence mechanism. I'm not sure whether these will work, and most of them look highly experimental.

  • ADO.NET Data Services w/NHibernate (Ayende)
  • ADO.NET Data Services w/NHibernate (Wildermuth)
  • Custom Lazy-loadable Business Collections with NHibernate
  • NHibernate and WCF is Not a Perfect Match
  • Spring.NET, NHibernate, WCF Services and Lazy Initialization
  • How to use NHibernate Lazy Initializing Proxies with Web Services or WCF

I'm having a hard time finding a replacement for the DataSet model we are using today. The reason I want to move away from that model is that it takes a lot of work to tie every property of every class to a row/cell of a dataset, and it tightly couples all of my classes together.


I've only taken a cursory look at your question, so forgive me if my response is shortsighted, but here goes:

I don't think you can logically get away from doing a mapping from domain object to DTO.

By using the domain objects over the wire you are tightly coupling your client and service; part of the reason to have a service in the first place is to promote loose coupling. So that's an immediate issue.

On top of that you're going to end up with a brittle domain logic interface where you can't make changes on the service side without breaking your client.

I suspect your best bet would be to implement a loosely coupled service that exposes REST or some other loosely coupled interface. You could use a product such as AutoMapper to make the conversions simpler and easier, and also to flatten data structures as necessary.
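
For illustration, here is a minimal sketch of such a mapping with AutoMapper. The JobDto shape and the CustomerName flattening are assumptions for the example, not part of the original code; the MapperConfiguration/CreateMapper calls are AutoMapper's standard API.

    using AutoMapper;
    using System.Collections.Generic;

    // Minimal stand-ins for the domain types in the question.
    public class Customer { public string Name { get; set; } }
    public class Job
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public Customer Customer { get; set; }
    }

    // A flat DTO for the wire: no ORM baggage, no object graph.
    public class JobDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string CustomerName { get; set; } // flattened from Job.Customer.Name
    }

    public static class JobMapping
    {
        private static readonly IMapper Mapper = new MapperConfiguration(cfg =>
        {
            // Customer.Name -> CustomerName is picked up by AutoMapper's
            // flattening convention; no explicit rule needed.
            cfg.CreateMap<Job, JobDto>();
        }).CreateMapper();

        public static IList<JobDto> ToDtos(IEnumerable<Job> jobs) =>
            Mapper.Map<IList<JobDto>>(jobs);
    }

The reverse direction (DTO back to domain object) needs its own map, e.g. by calling cfg.CreateMap<Job, JobDto>().ReverseMap().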

At this point I don't know of any way to really cut down on the verbosity involved in writing the interface layers, but having worked on large projects that didn't make the effort, I can honestly tell you the savings weren't worth it.


I think your problem revolves around this issue:

http://thatextramile.be/blog/2010/05/why-you-shouldnt-expose-your-entities-through-your-services/

Are you or are you not going to send ORM-Entities over the wire?

Since you have a service-oriented architecture, I (like the author) do not recommend this practice.

I use NHibernate. I call those ORM-Entities. They are THE POCO model, but they have "virtual" properties that allow for lazy loading.

However, I also have some DTO-Objects. These are also POCOs, but they do not have lazy-loading-friendly properties.

So I do a lot of "converting". I hydrate ORM-Entities (with NHibernate) and then I end up converting them to Domain-DTO-Objects. Yes, it stinks in the beginning.

The server sends out the Domain-DTO-Objects. There is NO lazy loading. I have to populate them with the "Goldilocks" ("just right") model. That is, if I need Parent(s) with one level of children, I have to know that up front and send the Domain-DTO-Objects over that way, with just the right amount of hydration, as sketched below.
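
Here is a sketch of that "just right" hydration, using the Query/FetchMany extensions from NHibernate 3's LINQ provider. The Parent/Child entities and the DTO shape are illustrative, and NHibernateHelper is the helper from the question.

    using System.Collections.Generic;
    using System.Linq;
    using NHibernate;
    using NHibernate.Linq;

    public class Parent
    {
        public virtual int Id { get; set; }
        public virtual IList<Child> Children { get; set; }
    }

    public class Child
    {
        public virtual int Id { get; set; }
    }

    // A plain DTO: no virtual members, no proxies, safe to serialize.
    public class ParentDto
    {
        public int Id { get; set; }
        public List<int> ChildIds { get; set; }
    }

    public static class ParentService
    {
        public static IList<ParentDto> GetParentsWithChildren()
        {
            using (ISession session = NHibernateHelper.OpenSession())
            {
                // FetchMany eager-loads exactly one level of children, so the
                // entities are fully hydrated before the session closes.
                return session.Query<Parent>()
                    .FetchMany(p => p.Children)
                    .ToList()
                    .Distinct() // a joined fetch repeats each parent once per child row
                    .Select(p => new ParentDto
                    {
                        Id = p.Id,
                        ChildIds = p.Children.Select(c => c.Id).ToList()
                    })
                    .ToList();
            }
        }
    }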

When I send Domain-DTO-Objects back (from the client to the server), I have to reverse the process: I convert the Domain-DTO-Objects into ORM-Entities and let NHibernate work with the ORM-Entities.

Because the architecture is "disconnected", I do a lot of (NHibernate) .Merge() calls.

        // ormItem is any detached NHibernate POCO
        using (ISession session = ISessionCreator.OpenSession())
        {
            using (ITransaction transaction = session.BeginTransaction())
            {
                // Merge attaches the detached object to this session and
                // returns the persistent instance that NHibernate tracks.
                ParkingAreaNHEntity mergedItem = session.Merge(ormItem);
                transaction.Commit();
            }
        }

.Merge is a wonderful thing. Entity Framework does not have it. Boo.

Is this a lot of setup? Yes. Do I think it is perfect? No.

However, because I send very basic DTOs (POCOs) that are not "flavored" to the ORM, I have the ability to switch ORMs without killing my contracts to the outside world.

My data layer can be ADO.NET, EF, NHibernate, or anything. If I switch, I have to write new "converters" and new ORM code, but everything else is isolated.

Many people argue with me. They say I'm doing too much and that the ORM-Entities are fine.

Again, I prefer not to allow any lazy loading to sneak in, and I prefer to keep my data layer isolated. My clients should not know or care about my data layer/ORM of choice.

There are just enough subtle differences (and some not-so-subtle ones) between EF and NHibernate to screwball the game plan.

Do my Domain-DTO-Objects look 95% like my ORM-Entities? Yep. But it's the 5% that can screwball you.

Moving away from DataSets, especially if they are populated from stored procedures with a lot of business logic in the T-SQL, isn't trivial. But now that I use an object model, and I never write a stored procedure that isn't a simple CRUD function, I'd never go back.

And I hate maintenance projects with voodoo T-SQL in the stored procedures. It ain't 1999 anymore. Well, in most places.

Good luck.

PS: Without .Merge() in EF, here is what you have to do in a disconnected world (boo, Microsoft):

http://www.entityframeworktutorial.net/EntityFramework4.3/update-many-to-many-entity-using-dbcontext.aspx
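
In other words, something along these lines. JobContext is a hypothetical DbContext for the Job entity from the question, and Entry(...).State = EntityState.Modified is the standard EF idiom for updating a detached object.

    using System.Data.Entity;

    // Hypothetical context; EF has no Merge(), so a detached object must be
    // attached and explicitly marked as modified before saving.
    public class JobContext : DbContext
    {
        public DbSet<Job> Jobs { get; set; }
    }

    public static class JobRepository
    {
        public static void SaveDetached(Job detachedJob)
        {
            using (var context = new JobContext())
            {
                // Setting the state attaches the object and flags every
                // property as changed, so SaveChanges issues a full UPDATE.
                context.Entry(detachedJob).State = EntityState.Modified;
                context.SaveChanges();
            }
        }
    }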
