Merged
9 changes: 5 additions & 4 deletions packages/filestream/agent/input/filestream.yml.hbs
```diff
@@ -1,8 +1,13 @@
 {{#if use_logs_stream}}
 index: logs
 {{else}}
+
 data_stream:
   dataset: {{data_stream.dataset}}
+{{#if pipeline}}
+pipeline: {{pipeline}}
+{{/if}}
+
 {{/if}}
 
 paths:
@@ -14,10 +19,6 @@ paths:
 gzip_experimental: true
 {{/if}}
 
-{{#if pipeline}}
-pipeline: {{pipeline}}
-{{/if}}
-
 {{#if recursive_glob}}
 prospector.scanner.recursive_glob: {{recursive_glob}}
 {{/if}}
```
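For context, a sketch of what the template might render to under each setting. The paths and pipeline ID are illustrative placeholders, not output from the actual package:

```yaml
# use_logs_stream: true — data goes to the "logs" data stream;
# the configured dataset and ingest pipeline are ignored
index: logs
paths:
  - /var/log/app/*.log   # hypothetical path

# use_logs_stream: false — the dataset applies, and the optional
# ingest pipeline is emitted only when one is configured
data_stream:
  dataset: filestream.generic
pipeline: my-custom-pipeline   # hypothetical pipeline ID
paths:
  - /var/log/app/*.log   # hypothetical path
```

The fix moves the `{{#if pipeline}}` block inside the `{{else}}` branch, so a configured pipeline can no longer leak into the rendered config when `use_logs_stream` is enabled.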
5 changes: 5 additions & 0 deletions packages/filestream/changelog.yml
```diff
@@ -1,4 +1,9 @@
 # newer versions go on top
+- version: "2.1.2"
+  changes:
+    - description: Ignore "Ingest Pipeline" when "Use Logs Stream" is set
+      type: bugfix
+      link: https://github.com/elastic/integrations/pull/15962
 - version: "2.1.1"
   changes:
     - description: Improve documentation for the "Use the logs data stream" option.
```
8 changes: 4 additions & 4 deletions packages/filestream/manifest.yml
```diff
@@ -3,7 +3,7 @@ name: filestream
 title: Custom Logs (Filestream)
 description: Collect log data using filestream with Elastic Agent.
 type: input
-version: 2.1.1
+version: 2.1.2
 conditions:
   kibana:
     version: "^9.2.0"
@@ -42,23 +42,23 @@ policy_templates:
       type: bool
       title: Use the "logs" data stream
       description: |
-        When enabled, data ingested by this integration is written to the "logs" data stream, and the configured dataset name is ignored. You also need to [Turn on wired streams](https://www.elastic.co/docs/solutions/observability/streams/wired-streams#streams-wired-streams-enable) in Streams [Settings](/app/streams) and to enable **Allow agents to write to Streams** for the output policy in the Fleet [Settings](/app/fleet/settings) tab. [Learn more](https://www.elastic.co/docs/solutions/observability/streams/wired-streams).
+        When enabled, data ingested by this integration is written to the "logs" data stream. **The 'Ingest Pipeline' and the configured 'Dataset name' are ignored**. You also need to [Turn on wired streams](https://www.elastic.co/docs/solutions/observability/streams/wired-streams#streams-wired-streams-enable) in Streams [Settings](/app/streams) and to enable **Allow agents to write to Streams** for the output policy in the Fleet [Settings](/app/fleet/settings) tab. [Learn more](https://www.elastic.co/docs/solutions/observability/streams/wired-streams).
       required: false
       show_user: true
       default: false
     - name: data_stream.dataset
       type: text
       title: Dataset name
       description: |
-        Dataset to write data to. Changing the dataset will send the data to a different index. You can't use `-` in the name of a dataset and only valid characters for [Elasticsearch index names](https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-index_.html).
+        Dataset to write data to. Changing the dataset will send the data to a different index. You can't use `-` in the name of a dataset and only valid characters for [Elasticsearch index names](https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-index_.html). **This field is ignored when 'Use the "logs" data stream' is enabled**.
       default: filestream.generic
       required: true
       show_user: true
     - name: pipeline
       type: text
       title: Ingest Pipeline
       description: |
-        The Ingest Node pipeline ID to be used by the integration.
+        The Ingest Node pipeline ID to be used by the integration. **This field is ignored when 'Use the "logs" data stream' is enabled**.
       required: false
       show_user: true
     - name: parsers
```