I have some JSON being emitted from a Docker container via the Fluentd logging driver, like this:
'{"timeMillis":1485917543709,"thread":"main","level":"INFO","loggerName":"com.imageintelligence.ava.api.Boot","message":"{\"dom\":\"DOM\"}","loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5}'
Notice how the message field is string-encoded JSON? When this data is captured by Fluentd, it ends up looking like this, as expected:
2017-02-01 06:29:15 +0000 docker.6faad650faa6: {"log":"{\"timeMillis\":1485917543709,\"thread\":\"main\",\"level\":\"INFO\",\"loggerName\":\"com.imageintelligence.ava.api.Boot\",\"message\":\"{\\\"dom\\\":\\\"DOM\\\"}\",\"loggerFqcn\":\"org.apache.logging.slf4j.Log4jLogger\",\"threadId\":1,\"threadPriority\":5}\r","com.amazonaws.ecs.cluster":"dombou","container_id":"6faad650faa6012af4f32df79901b42488543a5e6e53517fe3579b01ab2b6862","container_name":"/upbeat_booth","source":"stdout"}
I use a filter like this to parse the JSON in the log field:
<filter docker.**>
  @type parser
  format json
  key_name log
  reserve_data true
  hash_value_field log
</filter>
and I end up with semi-sanitized JSON:
2017-02-01 06:32:10 +0000 docker.68c794f7f694: {"source":"stdout","log":{"timeMillis":1485917543709,"thread":"main","level":"INFO","loggerName":"com.imageintelligence.ava.api.Boot","message":"{\"dom\":\"DOM\"}","loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5},"com.amazonaws.ecs.cluster":"dombou","container_id":"68c794f7f6948d4261b9497947834651abbf766e9aa51a76f39d6895b7a9ac18","container_name":"/sad_hamilton"}
The issue is that the message field is still string-escaped JSON. Any advice on how I can parse that inner JSON field as well? How do I stack filters?
You could try stacking two sequential parser filters on the same tag: the first parses the log field and merges the result into the record, and the second then parses the message key that the first one exposed:
<filter docker.**>
  @type parser
  key_name log
  format json
  reserve_data true
</filter>

<filter docker.**>
  @type parser
  key_name message
  format json
  reserve_data true
</filter>