WCF Configuration for Large Data

September 2, 2014

In one of our projects we had to transfer large amounts of data from the server to a client WPF application. Initially we ran into some performance issues, so we decided to compare ASP.NET Web API with WCF (with and without protobuf-net).
The configurations:

ASP.NET Web API
– Protobuf serializer
– IIS compression

WCF (Protobuf)
– Protobuf serializer
– net.tcp protocol
– WCF 4.5 compression

WCF
– Standard serializer
– net.tcp protocol
– WCF 4.5 compression
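For reference, a minimal sketch of a net.tcp binding configured for large payloads. The service/contract names and the quota values here are illustrative assumptions, not the exact settings we used:

```xml
<system.serviceModel>
  <bindings>
    <netTcpBinding>
      <!-- hypothetical binding name; quota values are illustrative -->
      <binding name="largeDataBinding"
               maxReceivedMessageSize="2147483647"
               maxBufferSize="2147483647">
        <readerQuotas maxArrayLength="2147483647"
                      maxStringContentLength="2147483647"
                      maxDepth="64" />
      </binding>
    </netTcpBinding>
  </bindings>
  <services>
    <!-- MyNamespace.DataService / IDataService are placeholders -->
    <service name="MyNamespace.DataService">
      <endpoint address="net.tcp://localhost:8080/DataService"
                binding="netTcpBinding"
                bindingConfiguration="largeDataBinding"
                contract="MyNamespace.IDataService" />
    </service>
  </services>
</system.serviceModel>
```

Without raising `maxReceivedMessageSize` and the reader quotas, WCF rejects large messages with a quota-exceeded fault, so this is usually the first thing to adjust when transferring big result sets.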

After the first test with 100k objects we achieved good performance with Web API (protobuf) and WCF (protobuf). WCF with the standard serializer was very slow, mainly because the message transferred to the client was much larger than with the protobuf variants.

In the second round we compared only ASP.NET Web API (protobuf) with WCF (protobuf).
We tested with 300k, 500k and 800k objects. For each test we called the Web API/WCF service 5 times and took the average response time for the comparison.

While doing this we encountered some interesting issues:

Web API
Web API was very unreliable. We saw a lot of peaks (very high response times) and had to call the service 5–10 times to get a meaningful average.

WCF
WCF was very reliable. Regardless of how many objects we tested with, we got roughly the same average response time after hitting the service 5 times. Of course the response time increased when testing with larger numbers of objects, but the important point is that it increased almost linearly, while the Web API behavior was unpredictable.

Therefore we decided to use WCF (protobuf). The two main reasons:
– Reliability (as described above)
– Much better performance when dealing with 500k or 800k objects
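To use protobuf-net with WCF, the serializer is plugged in as an endpoint behavior. A hedged sketch of the configuration approach, assuming protobuf-net's `ProtoBuf.ServiceModel` integration (the exact assembly-qualified type name depends on the protobuf-net version installed):

```xml
<system.serviceModel>
  <extensions>
    <behaviorExtensions>
      <!-- registers protobuf-net's behavior extension;
           adjust the assembly name/version to your package -->
      <add name="protobuf"
           type="ProtoBuf.ServiceModel.ProtoBehaviorExtension, protobuf-net" />
    </behaviorExtensions>
  </extensions>
  <behaviors>
    <endpointBehaviors>
      <behavior name="protoEndpointBehavior">
        <protobuf />
      </behavior>
    </endpointBehaviors>
  </behaviors>
</system.serviceModel>
```

The service and client endpoints then reference this behavior via `behaviorConfiguration="protoEndpointBehavior"`, and the data contracts need protobuf-net attributes (e.g. `[ProtoContract]`/`[ProtoMember]`) or another configured mapping so the types can be serialized.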
