Nicholas Blumhardt, the creator of various widely used libraries including Serilog, said something that really got me thinking. Let me paraphrase, because I don't remember the exact words he used, but it goes along the lines of: making log entries human-readable often comes at the cost of making them harder to process, query and filter, and vice versa.

It is natural for developers to create log entries such as "Starting to import items.", "Item imported in 10ms", "Import finished in 1032ms" or "Failed to import item number 24, 'Name' missing". These make it very easy for a developer to understand exactly what's going on when reading the logs, and it's trivial to add more log lines to capture information that would otherwise be lost. But once you accumulate a lot of log entries, you end up having to query the database with things like message like 'Failed to import%'.

This gets more and more difficult as more log entry types are added, and you start getting similar strings that may or may not be related to each other.

Traditional logging frameworks

In the world of .NET there are various frameworks that are commonly used for traditional (i.e. unstructured) logging. Personally I have always ended up using NLog, but I have met a lot of developers who prefer Log4Net instead. The difference between these frameworks in terms of feature set is minimal, but from some tests a colleague of mine performed, it seems that NLog has an edge when it comes to performance (after some optimisations).

Obviously there are other frameworks one could look into, but the general consensus is that you should never write your own logging framework - which, unfortunately, is something I have seen happen far too many times.

Limitations of unstructured logging

Imagine if we have the following log entries:

Access Control: User daryl granted access to module User Management

Access Control: User mark removed access to module User Management

Access Control: User claire granted access to module User Management

If we were to filter using traditional means, we could do something like message like 'Access Control%' to get all log entries relating to access control, or message like '%granted access%' to get all log entries containing those words. In other words, you end up having to find the right message filter to narrow down to the log entries you are interested in. This can be quite slow in a database with gigabytes of log entries.
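For concreteness, here is a sketch of what those filters look like as SQL, assuming a hypothetical Logs table with a message column:

SELECT * FROM Logs WHERE message LIKE 'Access Control%'   -- all access control entries

SELECT * FROM Logs WHERE message LIKE '%granted access%'  -- the leading wildcard defeats most indexes, hence the slowness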

This is where structured logging really shines.

Serilog

Serilog - The logging framework built for powerful structured event data

The first thing I noticed when I started using Serilog is how modern it is. Configuration is performed fluently and seamlessly. There is a large number of sinks (plugins/targets) and you are free to use one or more of them in parallel. Most of the parts within Serilog are easily replaceable with a custom implementation. The community is quite active and a lot of these sinks are actually developed by the community. Serilog itself receives frequent contributions, and they state that it has "Best-in-class .NET Core support.", which is certainly something I look for in the dependencies I use in my projects.
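As a taste of that fluent configuration, here is a minimal sketch (it assumes the Serilog.Sinks.Console and Serilog.Sinks.File packages are installed alongside Serilog itself):

using Serilog;

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()                       // log everything from Debug up
    .WriteTo.Console()                          // from Serilog.Sinks.Console
    .WriteTo.File("logs/app-.txt",              // from Serilog.Sinks.File
        rollingInterval: RollingInterval.Day)   // one file per day
    .CreateLogger();

Log.Information("Application starting");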

To create log entries, you would do something like this:

_logger.Information("Access Control: User {User} granted access to module {Module}", user, module);

In addition to the traditional log message, this gives you a message template of "Access Control: User {User} granted access to module {Module}", which you can use to filter for all messages that use that template. The real benefit, though, is that you also get a JSON object with all the properties you passed to the logger, so in this case you would have (assuming that both user and module are strings):

{
	"User": "daryl",
	"Module": "User Management"
}
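Exactly what gets stored depends on the sink and formatter you choose, but with Serilog's built-in JSON formatter the full event looks roughly like this (timestamp value illustrative):

{
	"Timestamp": "2017-03-01T10:15:30.0000000+01:00",
	"Level": "Information",
	"MessageTemplate": "Access Control: User {User} granted access to module {Module}",
	"Properties": {
		"User": "daryl",
		"Module": "User Management"
	}
}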

This enables some really powerful yet straightforward log analysis. There are numerous database engines nowadays that allow you to easily query JSON. For example, if you were using PostgreSQL, you could do something like:

SELECT *
FROM Logs
WHERE json_data->>'User' = 'daryl'
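If such lookups become common, and assuming the json_data column is of type jsonb (rather than plain json), you could speed them up with a GIN index and the containment operator:

-- Index the whole document for containment queries
CREATE INDEX logs_json_idx ON Logs USING GIN (json_data);

-- @> can use the index; equivalent filter to the query above
SELECT * FROM Logs WHERE json_data @> '{"User": "daryl"}';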

There's also the concept of enrichment, whereby enrichers are used to add further information to the data being logged. A simple example of this is to add the Serilog.Enrichers.Thread package and, when setting up logging, add .Enrich.WithThreadId(). This will add a ThreadId property to the JSON data being logged, allowing you to filter down to a single thread if something goes wrong (irrespective of the messages being logged).
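In configuration terms that looks something like this (a sketch, again assuming the console sink package alongside Serilog.Enrichers.Thread):

using Serilog;

Log.Logger = new LoggerConfiguration()
    .Enrich.WithThreadId()   // every event now carries a ThreadId property
    .WriteTo.Console()
    .CreateLogger();

// ThreadId is attached automatically; nothing changes at the call site
var user = "daryl";
var module = "User Management";
Log.Information("Access Control: User {User} granted access to module {Module}",
    user, module);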

Performance

My primary concern when looking into Serilog (and keep in mind that I am still exploring it and in no way using it in production) is the fact that it performs serialisation of objects in order to store them in logs. This is all performed asynchronously, but that doesn't mean it magically has no impact on your application; it simply means that it's not blocking you from performing further operations - it's still using CPU cycles to serialise.

Efficient when enabled, extremely low overhead when a logging level is switched off

I took the above quote from their GitHub page, because it should be true for all logging frameworks out there. Switching logging off (including the case where a log entry is below the minimum logged level) should have very little overhead on the application (i.e. the framework should return as early as possible).
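When even building the arguments is costly, Serilog exposes a level check that lets you skip the call entirely - a small sketch, where ExpensiveToCompute is a hypothetical stand-in for any costly work:

using System.Linq;
using Serilog;
using Serilog.Events;

// Only pay for the computation if Verbose events are actually being consumed
if (Log.IsEnabled(LogEventLevel.Verbose))
{
    Log.Verbose("Import diagnostics: {Details}", ExpensiveToCompute());
}

static string ExpensiveToCompute() =>
    string.Join(", ", Enumerable.Range(0, 1000)); // stands in for costly work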

Personally, I'm willing to spend a bit more resources to get all the benefits of structured logging, but I'll be working on some benchmarks before I commit myself to it - so that's something I will be posting here.

Final words

Serilog is certainly promising, and I will be diving deeper into the framework and trying out a number of sinks, such as the RabbitMQ sink, as I like the idea of having a stream of logs. It has certainly given me a new perspective on what is being logged and why. Oftentimes we just log for the sake of logging, without really thinking about the benefit we will get from that log entry.

Look for more updates in the coming days/weeks.