Switch to the regular elasticsearch output (@type elasticsearch) instead of @type elasticsearch_data_stream; it handles retries and errors much more reliably. Here's how to make it behave correctly with data streams:
<match your.tag.here>
  @type elasticsearch
  host your-es-host
  port 9200
  scheme https
  user elastic
  password changeme
  index_name logs-yourapp    # Your data stream name
  type_name _doc
  write_operation create     # Required: data streams only accept the create operation
  id_key log_id              # Optional: use if your logs have a unique ID
  <buffer>
    @type file
    path /var/log/fluentd/buffer
    flush_interval 5s        # Flush buffered chunks every 5 seconds
    retry_forever true       # Keep retrying failed chunks indefinitely
    chunk_limit_size 1MB     # Maximum size of a single chunk
    total_limit_size 100MB   # Cap on the on-disk buffer as a whole
    overflow_action block    # Apply back-pressure instead of dropping events when the buffer is full
  </buffer>
</match>
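
One prerequisite worth noting: with the regular plugin, Fluentd does not set up the data stream for you. A matching index template with a data_stream definition has to exist in Elasticsearch first, otherwise the create operations above will typically land in a plain index named logs-yourapp rather than a data stream. A minimal sketch, assuming Elasticsearch 7.9+ and reusing the example names from the config above (logs-yourapp, your-es-host, elastic/changeme); the template name logs-yourapp-template is just chosen for illustration:

curl -X PUT "https://your-es-host:9200/_index_template/logs-yourapp-template" \
  -u elastic:changeme \
  -H 'Content-Type: application/json' \
  -d '{
    "index_patterns": ["logs-yourapp*"],
    "data_stream": {},
    "priority": 200
  }'

Once the template is in place, the first create request against logs-yourapp creates the data stream automatically, and the Fluentd config above works unchanged.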