Tuesday, January 28, 2014

.NET low latency logging. Part 1 - Testing environment, Sample application, Best performance!

Choosing the low-latency logging solution with the best performance for the .NET Framework

It is well known that every complex software service needs a logging component to record significant or debug events during its lifetime, especially in a production environment. If the service is close to real-time and handles dozens of request/response messages at a time, then wasting precious time on log I/O operations is not rational. In this article I'll investigate which logging components can be used in a .NET Framework based solution, how they perform, and how logging can be improved to achieve the best results.

Testing environment

All tests are built and run on my local PC with the following configuration: Intel Core i5-2500 (3.3Ghz), 16GB RAM, HDD WD 160GB, Windows 7 x64

Sample application

There is a sample application under the spoiler which was used for all tests. The idea is to create several producers (default is 1) that perform logging in different threads. The count of items to be logged is also predefined (default is 10,000,000) and is divided equally among all producers. We'll track logging latency using the Stopwatch class, making sure that its Stopwatch.IsHighResolution field is 'true'. The only place that changes between the different logging components is line 55, which performs the log operation in the current producer.
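The full sample is behind the spoiler and not reproduced here, but the described harness can be sketched roughly as follows. This is a minimal illustration, not the original code: the class and variable names are my own, and the marked line stands in for the per-component log call the article varies.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class Benchmark
{
    const int TotalItems = 10_000_000;

    static void Main()
    {
        // Nanosecond-scale measurement only makes sense with the high-resolution timer.
        if (!Stopwatch.IsHighResolution)
            throw new InvalidOperationException("High-resolution timer is not available");

        int producers = 1;                              // 1, 2, 4, 8, 16, 32, 64 in the tests
        int itemsPerProducer = TotalItems / producers;  // items divided equally among producers
        long[][] latencies = new long[producers][];     // per-item latency in Stopwatch ticks

        var threads = new Thread[producers];
        for (int p = 0; p < producers; p++)
        {
            int idx = p;
            latencies[idx] = new long[itemsPerProducer];
            threads[idx] = new Thread(() =>
            {
                for (int i = 0; i < itemsPerProducer; i++)
                {
                    long start = Stopwatch.GetTimestamp();
                    // <-- the log call under test goes here (line 55 in the original sample)
                    latencies[idx][i] = Stopwatch.GetTimestamp() - start;
                }
            });
        }

        var total = Stopwatch.StartNew();
        foreach (var t in threads) t.Start();
        foreach (var t in threads) t.Join();
        total.Stop();

        Console.WriteLine($"Produced {TotalItems} items in {total.ElapsedMilliseconds} ms");
    }
}
```

Recording raw Stopwatch ticks inside the loop and converting them to nanoseconds afterwards keeps the measurement overhead per item as small as possible.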

No logging? Best performance!

Let's start by checking the baseline performance of generating 10,000,000 items by 1, 2, 4, 8, 16, 32 and 64 producers (threads) simultaneously, without doing any logging. We'll take this performance as the origin for comparison. For the test results I took the best of 10 independent runs.


  • 90% of items were produced with a latency of less than 31 nanoseconds
  • 10% of items were produced with a latency of 300 nanoseconds
  • Producing time and throughput are best when the number of producer threads equals the number of CPU cores
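The percentile figures above can be derived from the recorded tick deltas after the run finishes. A small sketch, assuming a `latencies` array of Stopwatch tick deltas as collected by the harness (the `Report` helper is illustrative, not part of the original sample):

```csharp
using System;
using System.Diagnostics;
using System.Linq;

static class LatencyReport
{
    // Converts recorded Stopwatch tick deltas to nanoseconds and prints percentiles.
    public static void Report(long[] tickDeltas)
    {
        // Stopwatch.Frequency is ticks per second, so one tick is 1e9 / Frequency ns.
        double nsPerTick = 1_000_000_000.0 / Stopwatch.Frequency;
        double[] sortedNs = tickDeltas
            .Select(t => t * nsPerTick)
            .OrderBy(ns => ns)
            .ToArray();

        // Nearest-rank percentile over the sorted samples.
        double P(double q) => sortedNs[(int)(q * (sortedNs.Length - 1))];

        Console.WriteLine($"p50: {P(0.50):F0} ns, p90: {P(0.90):F0} ns, p99: {P(0.99):F0} ns");
    }
}
```

Sorting once and indexing by rank is sufficient here; for 10,000,000 samples the post-processing cost is negligible compared to the run itself.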

Now we have ideal results to compare against in the upcoming investigations!
