I'm new to WCF Data Services so I've been playing. After some initial tests I am disappointed by the performance of my test data service.
I realize that because a WCF DS is HTTP-based there is overhead inherent in the protocol but my tests are still way slower than I would expect:
Environment:
- All on one box: Quad core 64-bit laptop with 4GB RAM running W7. Decent machine.
- Small SQL database (SQLExpress 2008 R2) with 16 tables... the table under test has 243 rows.
- Hosted my test service in IIS with all defaults.
Code:
- I've created an Entity Framework model (DataContext) for this database (stock codegen by VS2010).
- I've created a data-service based on this model.
- I've created a client which has a direct service reference (ObjectContext) to this service (stock codegen by VS2010).
- In the client I am also able to call the EF model directly and also use Native SQL (ADO.NET SqlConnection)
Test Plan:
- Each iteration connects to the database (there is an option to reuse connections), queries for all rows in the target table ("EVENTS") and then counts them (thus forcing any deferred fetches to be performed).
- Run for 25 iterations each for Native SQL (SqlConnection/SqlCommand), Entity Framework (DataContext) and WCF Data Services (ObjectContext).
Results:
- 25 iterations of Native SQL: 436ms
- 25 iterations of Entity Framework: 656ms
- 25 iterations of WCF Data Services: 12110ms
Ouch. That's about 20x slower than EF.
Since WCF Data Services is HTTP, there's no opportunity for HTTP connection reuse, so the client is forced to reconnect to the web server for each iteration. But surely there's more going on here than that.
EF itself is fairly fast, and the same EF code/model is reused for both the service and the direct-to-EF client tests. There's going to be some overhead for XML serialization and deserialization in the data service, but that much!?! I've had good performance with XML serialization in the past.
I'm going to run some tests with JSON and Protocol-Buffer encodings to see if I can get better performance, but I'm curious if the community has any advice for speeding this up.
I'm not strong with IIS, so perhaps there are some IIS tweaks (caches, connection pools, etc.) that can be set to improve this?
Consider deploying as a Windows service instead? IIS may have ISAPI filters, rewrite rules, etc. that it runs through. Even if none of these are active, the IIS pipeline is long enough that something may slow you down marginally.
A Windows service should give you a good baseline of how long it takes the request to run, be packed, etc., without the IIS slowdowns.
The link below has a video with some interesting WCF benchmarks and comparisons between WCF Data Services and Entity Framework.
http://www.relationalis.com/articles/2011/4/10/wcf-data-services-overhead-performance.html
I increased the performance of our WCF Data Service API by 41% simply by enabling compression. It was really easy to do. Follow this link that explains what to do on your IIS server: Enabling dynamic compression (gzip, deflate) for WCF Data Feeds, OData and other custom services in IIS7
Don't forget to iisReset after your change!
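For reference, the linked article's approach boils down to registering the OData content type as a compressible dynamic MIME type. A sketch of the relevant IIS configuration (exact section placement varies between applicationHost.config and web.config, so treat this as illustrative):

<system.webServer>
  <urlCompression doDynamicCompression="true" />
  <httpCompression>
    <dynamicTypes>
      <!-- OData/Atom responses are not compressed by IIS defaults -->
      <add mimeType="application/atom+xml" enabled="true" />
      <add mimeType="application/atom+xml;charset=utf-8" enabled="true" />
    </dynamicTypes>
  </httpCompression>
</system.webServer>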
On the client-side:
// This is your context; you should have this code throughout your app.
var context = new YourEntities("YourServiceURL");
context.SendingRequest2 += SendingRequest2;

// Add the following method somewhere in a static utility library.
public static void SendingRequest2(object sender, SendingRequest2EventArgs e)
{
    var request = ((HttpWebRequestMessage)e.RequestMessage).HttpWebRequest;
    request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
}
Try setting security to "none" in the binding section in the configuration. This should make big improvement.
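In web.config that would look something like the fragment below (a sketch; the binding name is illustrative, and the binding type to configure depends on how your service endpoint is hosted):

<system.serviceModel>
  <bindings>
    <webHttpBinding>
      <!-- "NoSecurity" is a placeholder name for this example -->
      <binding name="NoSecurity">
        <security mode="None" />
      </binding>
    </webHttpBinding>
  </bindings>
</system.serviceModel>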
In order to eliminate most of the connection overhead, you can try to batch all operations to the WCF DS to see if that makes a significant difference.
NorthwindEntities context = new NorthwindEntities(svcUri);

// Define the queries to batch; the URIs are relative to the service root.
var someCustomerQuery = new DataServiceRequest<Customer>(
    new Uri("Customers", UriKind.Relative));
var someProductsQuery = new DataServiceRequest<Product>(
    new Uri("Products", UriKind.Relative));

var batchRequests =
    new DataServiceRequest[] { someCustomerQuery, someProductsQuery };

// A single HTTP round trip executes all batched queries.
var batchResponse = context.ExecuteBatch(batchRequests);
For more info see here.
WCF Data Services exist to expose your data to disparate clients via the Open Data (OData) protocol, so you don't have to write/refactor multiple web-service methods for each change request. I never advise using it when the entire system is based on the Microsoft technology stack; it's meant for remote clients.
How do you pass those 25 iterations for WCF?
var WCFobj = new ...Service();
foreach (var calling in CallList)
    WCFobj.Call(...);
If you call like that it means you call WCF 25 times, which consumes too many resources.
For me, I used to build everything up into a DataTable: the table name maps to the stored procedure I'm calling, and each DataRow holds the parameters. When calling, just pass the DataTable in encrypted form:

var table = new DataTable("PROC_CALLING");
// ... fill one DataRow per call ...

var sb = new StringBuilder();
using (var xml = System.Xml.XmlWriter.Create(sb))
{
    table.WriteXml(xml);
}
var bytes = System.Text.Encoding.UTF8.GetBytes(sb.ToString());
// [optional] GZip the bytes
WCFobj.Call(bytes);

The thing is you pass all 25 calls at once, which can improve performance significantly. If the return objects share the same structure, just pass them back as a DataTable in byte form and convert it back to a DataTable.
I used to implement this method with GZip for import/export data modules. Passing a large number of bytes makes WCF unhappy. It depends on which you'd rather consume: computing resources or network resources.
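The optional GZip step can be sketched as follows (the helper class and method names are mine; it only relies on the framework's GZipStream):

```csharp
using System.IO;
using System.IO.Compression;

public static class PayloadCompression
{
    // Compress the serialized DataTable bytes before sending them to the service.
    public static byte[] Compress(byte[] input)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(input, 0, input.Length);
            }
            return output.ToArray();
        }
    }

    // Reverse the compression on the receiving side.
    public static byte[] Decompress(byte[] input)
    {
        using (var source = new MemoryStream(input))
        using (var gzip = new GZipStream(source, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            gzip.CopyTo(output);
            return output.ToArray();
        }
    }
}
```

Compressing the XML-serialized DataTable this way trades CPU time for a much smaller payload on the wire, which is usually the right trade for chatty WCF calls.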
things to try:
1) results encoding: use binary encoding of your WCF channel if possible, see http://msdn.microsoft.com/en-us/magazine/ee294456.aspx -- alternately use compression: http://programmerpayback.com/2009/02/18/speed-up-your-app-by-compressing-wcf-service-responses/
2) change your service instance behavior, see http://msdn.microsoft.com/en-us/magazine/cc163590.aspx#S6 -- try InstanceContextMode = InstanceContextMode.Single, ConcurrencyMode = ConcurrencyMode.Multiple - if you can verify that your service is built in a thread-safe way.
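Point 2 amounts to something like the following attribute on the service class (a sketch; the service and context class names here are placeholders, and Single/Multiple is only safe if your service really is thread-safe):

```csharp
using System.ServiceModel;
using System.Data.Services;

// One shared service instance serving concurrent requests.
// EventsDataService and MyEntities are placeholder names.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class EventsDataService : DataService<MyEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}
```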
Regarding your benchmark, I think you should simulate a more realistic load (including concurrent users) and ignore outliers; the first request to IIS will be really slow (it has to load all the DLLs).