
Handling for 413 entity too large in BulkIndexer #210

@mauri870

Description


While investigating https://github.com/elastic/ingest-dev/issues/3677, I noticed that this package does not handle a 413 Request Entity Too Large response from Elasticsearch. Even with batching, in specific use cases such as Beats (which may include large stack traces in documents), a batch can still exceed http.max_content_length, Elasticsearch's maximum request size, which defaults to 100MB.
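One way to avoid tripping that limit is to partition the pending documents into sub-batches that each stay under a configured byte budget, and to surface documents that are too large to send at all. The sketch below is a hypothetical illustration, not the package's actual API: `splitBatch` and its `maxBytes` parameter are assumptions introduced here to show the idea.

```go
package main

import "fmt"

// splitBatch is a hypothetical helper: it partitions encoded documents
// into sub-batches whose combined size stays within maxBytes, so that
// no single bulk request exceeds Elasticsearch's http.max_content_length
// and triggers a 413. Documents that exceed maxBytes on their own are
// returned separately so the caller can reject or report them.
func splitBatch(docs [][]byte, maxBytes int) (batches [][][]byte, oversized [][]byte) {
	var cur [][]byte
	size := 0
	for _, d := range docs {
		if len(d) > maxBytes {
			// Can never be sent within the limit, even alone.
			oversized = append(oversized, d)
			continue
		}
		if size+len(d) > maxBytes && len(cur) > 0 {
			// Adding this doc would push the batch over the
			// budget: close the current batch and start a new one.
			batches = append(batches, cur)
			cur, size = nil, 0
		}
		cur = append(cur, d)
		size += len(d)
	}
	if len(cur) > 0 {
		batches = append(batches, cur)
	}
	return batches, oversized
}

func main() {
	docs := [][]byte{
		make([]byte, 40),
		make([]byte, 40),
		make([]byte, 40),
		make([]byte, 200), // larger than the whole budget by itself
	}
	batches, oversized := splitBatch(docs, 100)
	fmt.Println(len(batches), len(oversized)) // 2 1
}
```

A fuller handling strategy could also react after the fact: on receiving a 413, split the failed batch in half and retry each half, which converges on individual oversized documents that can then be reported as permanently failed.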
