79703102

Date: 2025-07-16 08:11:38
Score: 0.5
Natty:
Report link

Switch to the regular @elasticsearch plugin instead of @elasticsearch_data_stream; it handles retries and errors much more reliably.

Here’s how to make it behave correctly with data streams:

<match your.tag.here>
  @type elasticsearch
  host your-es-host
  port 9200
  scheme https
  user elastic
  password changeme

  index_name logs-yourapp           # Your data stream name
  type_name _doc                    # ES 7.x only; on ES 8+ use suppress_type_name true instead
  include_timestamp true            # Data streams require a @timestamp field in every document
  write_operation create            # Data streams only accept the create operation
  id_key log_id                     # Optional: use if your logs have a unique ID
  <buffer>
    @type file                      # Persist chunks to disk so a restart doesn't lose logs
    path /var/log/fluentd/buffer
    flush_interval 5s
    retry_forever true              # Keep retrying failed chunks instead of dropping them
    chunk_limit_size 1MB
    total_limit_size 100MB
    overflow_action block           # Apply backpressure to inputs rather than discard events
  </buffer>
</match>
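One thing the Fluentd side can't do for you: Elasticsearch only routes create operations into logs-yourapp as a data stream if a matching index template with data_stream enabled already exists. Here is a minimal sketch of creating one through the REST API (Python with requests; the host, credentials, and template name are placeholders matching the config above, not something the plugin sets up for you):

import requests

ES_URL = "https://your-es-host:9200"   # placeholder host, same as the config above
AUTH = ("elastic", "changeme")         # placeholder credentials

# The presence of the "data_stream" key tells Elasticsearch to treat any
# index matching the pattern as a data stream instead of a regular index.
template = {
    "index_patterns": ["logs-yourapp*"],
    "data_stream": {},
    "priority": 200,                   # wins over any overlapping lower-priority templates
}

resp = requests.put(
    f"{ES_URL}/_index_template/logs-yourapp-template",
    json=template,
    auth=AUTH,
    verify=False,                      # assumes a self-signed cert; remove in production
)
resp.raise_for_status()
print(resp.json())

Once the template is in place, the data stream itself is created automatically on the first create request from Fluentd.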
Reasons:
  • Long answer (-0.5)
  • Has code block (-0.5)
  • Self-answer (0.5)
  • Low reputation (1)
Posted by: David Belhamou