
Error Using AWS Built-in Fluent Bit to Send Logs to Elastic Cloud from EKS Fargate


I am using EKS Fargate and trying to use AWS built-in Fluent Bit to send logs to Elastic Cloud, but I am encountering the following error:

{ "log": "[2024/05/16 14:35:17] [error] [output:es:es.0] HTTP status=400 URI=/_bulk, response:" }
{
    "log": "{\"error\":{\"root_cause\":[{\"type\":\"x_content_parse_exception\",\"reason\":\"[1:24] Unexpected character ('l' (code 108)): was expecting comma to separate Object entries\\n at [Source: (byte[])\\\"{\\\"create\\\":{\\\"_index\\\":\\\"\\\"leoh-logging\\\"\\\"}}\\\"; line: 1, column: 24]\"}],\"type\":\"x_content_parse_exception\",\"reason\":\"[1:24] Unexpected character ('l' (code 108)): was expecting comma to separate Object entries\\n at [Source: (byte[])\\\"{\\\"create\\\":{\\\"_index\\\":\\\"\\\"leoh-logging\\\"\\\"}}\\\"; line: 1, column: 24]\",\"caused_by\":{\"type\":\"json_parse_exception\",\"reason\":\"Unexpected character ('l' (code 108)): was expecting comma to separate Object entries\\n at [Source: (byte[])\\\"{\\\"create\\\":{\\\"_index\\\":\\\"\\\"leoh-logging\\\"\\\"}}\\\"; line: 1, column: 24]\"}},\"status\":400}"
}

Here is my output configuration:

[OUTPUT]
    Name               es
    Match              *
    Index              "leoh-logging"
    Host               ***
    Port               9243
    HTTP_User          elastic-beats
    HTTP_Passwd        ***
    tls                On
    tls.verify         Off
    Suppress_Type_Name On

I have checked my configuration and ensured that the credentials are correct, but I still receive this error. Has anyone encountered a similar issue or has experience resolving it? Any help would be greatly appreciated. Additionally, I would like to ask about the compatibility of the current version of Fluent Bit with Elastic Cloud: are there any known compatibility issues, or specific versions recommended for optimal performance?

Thank you very much!


3 Answers
Accepted Answer

Hello,

To resolve the JSON parsing error when using Fluent Bit to send logs from EKS Fargate to Elastic Cloud, follow these steps:

1. Verify Log Data Format: Ensure your logs are correctly formatted JSON.

2. Escape Special Characters: Ensure special characters in your logs are properly escaped.

3. Correct Fluent Bit Configuration: Update your Fluent Bit configuration to match the correct JSON structure for Elasticsearch.
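Steps 1 and 2 can be sketched as a quick pre-flight check (a hypothetical helper using only Python's standard library, not part of Fluent Bit):

```python
import json

def validate_log_line(line: str) -> bool:
    """Return True only if the line is a single well-formed JSON object."""
    try:
        return isinstance(json.loads(line), dict)
    except json.JSONDecodeError:
        return False

print(validate_log_line('{"message": "ok", "level": "info"}'))  # → True
print(validate_log_line('{"message": "unterminated'))           # → False
```

Running suspect log lines through a check like this will tell you whether the problem is in the log data itself or in the bulk metadata Fluent Bit generates around it.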

Here is a corrected example:

[OUTPUT]
    Name            es
    Match           *
    Index           leoh-logging
    Host            your-elasticsearch-host
    Port            9243
    HTTP_User       elastic-beats
    HTTP_Passwd     your-password
    tls             On
    tls.verify      Off
    Suppress_Type_Name On
    Replace_Dots    On
    Logstash_Format On
    Logstash_Prefix leoh-logging
    Logstash_DateFormat %Y.%m.%d
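One caveat on this configuration: if I recall the es output plugin's behavior correctly, when `Logstash_Format` is On the static `Index` value is ignored and the index name is derived from `Logstash_Prefix` and `Logstash_DateFormat`, roughly like this sketch:

```python
from datetime import datetime, timezone

def logstash_index(prefix: str, date_format: str, ts: datetime) -> str:
    # Rough approximation of how the es output names indices when
    # Logstash_Format is On: "<Logstash_Prefix>-<formatted date>".
    return f"{prefix}-{ts.strftime(date_format)}"

print(logstash_index("leoh-logging", "%Y.%m.%d",
                     datetime(2024, 5, 24, tzinfo=timezone.utc)))
# → leoh-logging-2024.05.24
```

So with the settings above, documents land in daily indices such as `leoh-logging-2024.05.24` rather than a single `leoh-logging` index.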

Example Correct JSON Structure. Ensure the bulk payload sent is correct (with `Suppress_Type_Name On`, the deprecated `_type` field is omitted):

{ "index": { "_index": "leoh-logging" } }
{ "message": "your log message", "timestamp": "2024-05-24T10:00:00Z", "level": "info" }
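If you ever need to build or inspect a bulk payload by hand, remember the `_bulk` body is newline-delimited JSON: one action line followed by one source line per document. A minimal sketch (standard library only; index name and document are just examples):

```python
import json

def build_bulk_payload(index: str, docs: list) -> str:
    """Build an NDJSON _bulk body: one action line + one source line per doc."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"create": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # _bulk requires a trailing newline

payload = build_bulk_payload("leoh-logging", [
    {"message": "your log message",
     "timestamp": "2024-05-24T10:00:00Z", "level": "info"},
])
print(payload)
```

Because every line must be valid JSON on its own, a single stray quote in the index name is enough to make Elasticsearch reject the whole request with a 400.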

Compatibility Ensure you are using compatible versions of Fluent Bit and Elasticsearch. Check the Fluent Bit documentation for version compatibility. Update Fluent Bit to the latest version if necessary.

EXPERT
answered 2 years ago
reviewed a year ago
  • Thank you very much for your prompt and informative response! Your explanation regarding the JSON parsing error in Fluent Bit and the steps to resolve it were clear and valuable. I especially appreciate the example configuration and the breakdown of the correct JSON structure.

    Upon reviewing my configuration, I identified the mistake: the extra double quote marks (") around the index name. This explains the parsing error. I've corrected the configuration, and now my logs are flowing smoothly to Elastic Cloud.
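    For anyone hitting the same thing: the doubled quotes make the bulk action line itself invalid JSON, which a quick check with Python's json module reproduces (the headers below are reconstructed from the error message, not captured traffic):

    ```python
    import json

    # The action line Fluent Bit generated, with the quoted Index value
    # embedded inside the JSON string delimiters, versus the corrected one:
    bad_header = '{"create":{"_index":""leoh-logging""}}'
    good_header = '{"create":{"_index":"leoh-logging"}}'

    try:
        json.loads(bad_header)
    except json.JSONDecodeError as e:
        print(f"parse error at column {e.colno}: {e.msg}")

    print(json.loads(good_header))  # parses cleanly
    ```

    The parser chokes on the first character after the empty string `""` — the same "Unexpected character ('l' (code 108))" that Elasticsearch reported.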

3

The error you are encountering indicates that Fluent Bit is sending log data to Elasticsearch with a JSON parsing issue. Specifically, the error message points out an unexpected character 'l' at a certain position in the JSON string, which suggests a formatting issue in the data.

Verify Log Data Format. Ensure that the log data being sent to Elasticsearch is properly formatted JSON. Fluent Bit might be sending a malformed JSON string.

Escape Special Characters. If your logs contain special characters or are not properly escaped, it could cause parsing issues. Ensure that your log data is correctly escaped.

Check Fluent Bit Filters. If you use any filters to modify the log data before sending it to Elasticsearch, verify that these filters are correctly processing and formatting the data.

Example Correct JSON Structure. Note that the _bulk API expects newline-delimited JSON: one action line followed by one source line per document, not a single nested object:

{ "create": { "_index": "leoh-logging", "_id": "1" } }
{ "message": "your log message", "timestamp": "2024-05-24T10:00:00Z", "level": "info" }
EXPERT
answered 2 years ago
reviewed 2 years ago

Thank you, @kranthi putti.

I'm also getting the following exception on one of the indices in fluent-bit logs:

{"error":{"root_cause":[{"type":"json_parse_exception", "reason":"Unexpected character ('p' (code 112)): was expecting comma to separate Object entries\n at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 24]"}],
"type":"json_parse_exception", "reason":"Unexpected character ('p' (code 112)): was expecting comma to separate Object entries\n at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 24]"},
"status":400}

[2025/02/18 19:46:51] [warn] [engine] failed to flush chunk '1-1739906399.925775362.flb', retry in 1838 seconds: task_id=117,
input=tail.0 > output=es.0 (out_id=0)

Would this also require updating the following Output configuration or Filter settings?

filters: |
[FILTER]
  Name kubernetes
  Match kube.*
  Merge_Log On
  Keep_Log Off
  K8S-Logging.Parser On
  K8S-Logging.Exclude On
  Labels On
  Annotations On

[FILTER]
  Name kubernetes
  Match alphatec.*
  Merge_Log On
  Keep_Log Off
  K8S-Logging.Parser On
  K8S-Logging.Exclude On
  Labels On
  Annotations On

## https://docs.fluentbit.io/manual/pipeline/outputs
outputs: |
[OUTPUT]
  Name es
  Match kube.*
  Host <host>
  Logstash_Format On
  Logstash_Prefix_Key kubernetes.namespace_name
  Logstash_DateFormat %Y.%m
  Retry_Limit False
  Port 443
  tls On
  tls.verify Off
  Suppress_Type_Name On
  AWS_Auth On
  AWS_Region eu-west-2
  AWS_Role_ARN <AWS_Role_ARN>
  AWS_Service_Name es
  Replace_Dots Off
  Trace_Error On

[OUTPUT]
  Name es
  Match host.*
  Host <host>
  Logstash_Format On
  Logstash_Prefix node
  Logstash_DateFormat %Y.%m
  Retry_Limit False
  Port 443
  tls On
  tls.verify Off
  Suppress_Type_Name On
  AWS_Auth On
  AWS_Region eu-west-2
  AWS_Role_ARN <AWS_Role_ARN>
  AWS_Service_Name es
  Replace_Dots Off
  Trace_Error On
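One thing worth checking here (an assumption on my part, since the failing payload is redacted by `StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION`): with `Logstash_Prefix_Key kubernetes.namespace_name`, the index name comes from record data, so a namespace value carrying stray quotes would break the action line in exactly this way. A sketch reproducing the same error class, with `prod-ns` as a made-up namespace:

```python
import json

# Hypothetical reproduction: if the value behind Logstash_Prefix_Key
# ever carries embedded quotes, the bulk action line breaks like this.
header = '{"create":{"_index":""prod-ns""}}'
try:
    json.loads(header)
except json.JSONDecodeError as e:
    print(f"unexpected {header[e.pos]!r} at column {e.colno}")
```

Note the failing character here is 'p' (code 112) at roughly the same column as in your error, which is consistent with a mangled index name rather than a problem in the Filter settings.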

answered a year ago
