Commit 50cc6e0

add note and small edits (DataDog#21637)
1 parent 96dcf37 commit 50cc6e0

File tree

1 file changed (+21, -20 lines)


content/en/agent/logs/advanced_log_collection.md

@@ -34,23 +34,24 @@ After you set up [log collection][1], you can customize your collection configur
 * [Specify log file encodings](#log-file-encodings)
 * [Define global processing rules](#global-processing-rules)
 
-**Note**: If you set up multiple processing rules, they are applied sequentially and each rule is applied on the result of the previous one.
-
-**Note**: Processing rule patterns must conform to [Golang regexp syntax][2].
-
 To apply a processing rule to all logs collected by a Datadog Agent, see the [Global processing rules](#global-processing-rules) section.
 
+**Notes**:
+- If you set up multiple processing rules, they are applied sequentially and each rule is applied on the result of the previous one.
+- Processing rule patterns must conform to [Golang regexp syntax][2].
+- The `log_processing_rules` parameter is used in integration configurations to customize your log collection configuration. In the Agent's [main configuration][5], the `processing_rules` parameter is used to define global processing rules.
+
 ## Filter logs
 
-To send only a specific subset of logs to Datadog use the `log_processing_rules` parameter in your configuration file with the **exclude_at_match** or **include_at_match** `type`.
+To send only a specific subset of logs to Datadog, use the `log_processing_rules` parameter in your configuration file with the `exclude_at_match` or `include_at_match` type.
 
 ### Exclude at match
 
 | Parameter | Description |
 |--------------------|----------------------------------------------------------------------------------------------------|
 | `exclude_at_match` | If the specified pattern is contained in the message, the log is excluded and not sent to Datadog. |
 
-For example, to **filter OUT** logs that contain a Datadog email address, use the following `log_processing_rules`:
+For example, to **filter out** logs that contain a Datadog email address, use the following `log_processing_rules`:
 
 {{< tabs >}}
 {{% tab "Configuration file" %}}
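The diff elides the body of the configuration-file tab. A minimal sketch of an `exclude_at_match` rule in an integration's `conf.yaml`, using the Datadog-email pattern shown later in this diff (file path and service name are hypothetical):

```yaml
logs:
  - type: file
    path: /var/log/myapp/app.log   # hypothetical log file path
    service: myapp                 # hypothetical service name
    source: java
    log_processing_rules:
      - type: exclude_at_match
        name: exclude_datadoghq_users
        # Drops any log whose message contains a @datadoghq.com email address
        pattern: \w+@datadoghq.com
```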
@@ -141,7 +142,7 @@ spec:
 | `include_at_match` | Only logs with a message that includes the specified pattern are sent to Datadog. If multiple `include_at_match` rules are defined, all rule patterns must match in order for the log to be included. |
 
-For example, to **filter IN** logs that contain a Datadog email address, use the following `log_processing_rules`:
+For example, use the following `log_processing_rules` configuration to **filter in** logs that contain a Datadog email address:
 
 {{< tabs >}}
 {{% tab "Configuration file" %}}
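The configuration-file example itself is elided here; a minimal sketch of an `include_at_match` rule, mirroring the pattern in the surrounding context (path and service hypothetical):

```yaml
logs:
  - type: file
    path: /var/log/myapp/app.log   # hypothetical log file path
    service: myapp                 # hypothetical service name
    source: java
    log_processing_rules:
      - type: include_at_match
        name: include_datadoghq_users
        # Only logs whose message contains a @datadoghq.com email address are sent
        pattern: \w+@datadoghq.com
```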
@@ -159,7 +160,7 @@ logs:
         pattern: \w+@datadoghq.com
 ```
 
-If you want to match one or more patterns you must define them in a single expression:
+If you want to match one or more patterns, you must define them in a single expression:
 
 ```yaml
 logs:
@@ -173,7 +174,7 @@ logs:
         pattern: abc|123
 ```
 
-If the patterns are too long to fit legibly on a single line you can break them into multiple lines:
+If the patterns are too long to fit legibly on a single line, you can break them into multiple lines:
 
 ```yaml
 logs:
@@ -192,7 +193,7 @@ logs:
 {{% /tab %}}
 {{% tab "Docker" %}}
 
-In a Docker environment, use the label `com.datadoghq.ad.logs` on the **container sending the logs you want to filter** in order to specify the `log_processing_rules`, for example:
+In a Docker environment, use the label `com.datadoghq.ad.logs` on the container that is sending the logs you want to filter to specify the `log_processing_rules`. For example:
 
 ```yaml
 labels:
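The label example is truncated in the diff. A minimal sketch of the Docker label, reusing the exclusion rule from earlier (service name hypothetical; note the regex backslash is escaped inside the JSON string):

```yaml
labels:
  com.datadoghq.ad.logs: '[{"source": "java", "service": "myapp", "log_processing_rules": [{"type": "exclude_at_match", "name": "exclude_datadoghq_users", "pattern": "\\w+@datadoghq.com"}]}]'
```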
@@ -215,7 +216,7 @@ In a Docker environment, use the label `com.datadoghq.ad.logs` on the **containe
 {{% /tab %}}
 {{% tab "Kubernetes" %}}
 
-In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`, for example:
+In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`. For example:
 
 ```yaml
 apiVersion: apps/v1
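The manifest is truncated in the diff. A minimal sketch of the pod annotation on a Deployment, assuming a hypothetical container named `myapp` (the annotation key must match the container name):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp                # hypothetical deployment name
spec:
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
      annotations:
        # <CONTAINER_NAME>.logs must match the container below
        ad.datadoghq.com/myapp.logs: '[{"source": "java", "service": "myapp", "log_processing_rules": [{"type": "exclude_at_match", "name": "exclude_datadoghq_users", "pattern": "\\w+@datadoghq.com"}]}]'
    spec:
      containers:
        - name: myapp
          image: myapp:latest   # hypothetical image
```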
@@ -257,11 +258,11 @@ spec:
 
 ## Scrub sensitive data from your logs
 
-If your logs contain sensitive information that need redacting, configure the Datadog Agent to scrub sensitive sequences by using the `log_processing_rules` parameter in your configuration file with the **mask_sequences** `type`.
+If your logs contain sensitive information that needs redacting, configure the Datadog Agent to scrub sensitive sequences by using the `log_processing_rules` parameter in your configuration file with the `mask_sequences` type.
 
 This replaces all matched groups with the value of the `replace_placeholder` parameter.
 
-For example, redact credit card numbers:
+For example, to redact credit card numbers:
 
 {{< tabs >}}
 {{% tab "Configuration file" %}}
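The configuration-file example is elided in the diff. A minimal sketch of a `mask_sequences` rule; the file path, service, and the simplified card-number regex are all illustrative, not the documented pattern:

```yaml
logs:
  - type: file
    path: /var/log/myapp/app.log   # hypothetical log file path
    service: myapp                 # hypothetical service name
    source: java
    log_processing_rules:
      - type: mask_sequences
        name: mask_credit_cards
        replace_placeholder: "[masked_credit_card]"
        # Simplified digit-run pattern for illustration; a real deployment
        # should use a stricter card-number regex
        pattern: (?:\d[ -]?){13,16}
```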
@@ -283,7 +284,7 @@ logs:
 {{% /tab %}}
 {{% tab "Docker" %}}
 
-In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`, for example:
+In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`. For example:
 
 ```yaml
 labels:
@@ -307,7 +308,7 @@ In a Docker environment, use the label `com.datadoghq.ad.logs` on your container
 {{% /tab %}}
 {{% tab "Kubernetes" %}}
 
-In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`, for example:
+In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`. For example:
 
 ```yaml
 apiVersion: apps/v1
@@ -359,7 +360,7 @@ This sends the following log to Datadog: `User email: masked_user@example.com`
 
 ## Multi-line aggregation
 
-If your logs are not sent in JSON and you want to aggregate several lines into a single entry, configure the Datadog Agent to detect a new log using a specific regex pattern instead of having one log per line. This is accomplished by using the `log_processing_rules` parameter in your configuration file with the **multi_line** `type` which aggregates all lines into a single entry until the given pattern is detected again.
+If your logs are not sent in JSON and you want to aggregate several lines into a single entry, configure the Datadog Agent to detect a new log using a specific regex pattern instead of having one log per line. Use the `multi_line` type in the `log_processing_rules` parameter to aggregate all lines into a single entry until the given pattern is detected again.
 
 For example, every Java log line starts with a timestamp in `yyyy-dd-mm` format. These lines include a stack trace that can be sent as two logs:
 
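A minimal sketch of the `multi_line` rule for the Java example above; the regex is illustrative (it matches any leading `yyyy-dd-mm`-shaped date) and may need tightening for real timestamps:

```yaml
log_processing_rules:
  - type: multi_line
    name: new_log_start_with_date
    # A new log entry begins when a line starts with a date; lines that do not
    # match (such as stack-trace lines) are appended to the previous entry
    pattern: \d{4}\-\d{2}\-\d{2}
```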
@@ -391,7 +392,7 @@ logs:
 {{% /tab %}}
 {{% tab "Docker" %}}
 
-In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`, for example:
+In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`. For example:
 
 ```yaml
 labels:
@@ -410,7 +411,7 @@ In a Docker environment, use the label `com.datadoghq.ad.logs` on your container
 {{% /tab %}}
 {{% tab "Kubernetes" %}}
 
-In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`, for example:
+In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`. For example:
 
 ```yaml
 apiVersion: apps/v1
@@ -502,7 +503,7 @@ logs_config:
 {{% /tab %}}
 {{% tab "Docker" %}}
 
-In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`, for example:
+In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`. For example:
 
 ```yaml
 labels:
@@ -668,7 +669,7 @@ DD_LOGS_CONFIG_PROCESSING_RULES='[{"type": "mask_sequences", "name": "mask_user_
 {{% /tab %}}
 {{% tab "Helm" %}}
 
-Use the `env` parameter in the helm chart to set the `DD_LOGS_CONFIG_PROCESSING_RULES` environment variable to configure global processing rules, for example:
+Use the `env` parameter in the Helm chart to set the `DD_LOGS_CONFIG_PROCESSING_RULES` environment variable to configure global processing rules. For example:
 
 ```yaml
 env:
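The Helm values fragment is truncated in the diff. A minimal sketch of the `env` entry, reusing the masking rule shape shown in the earlier hunk (the rule name `mask_user_email` and the pattern are hypothetical, since the diff truncates the original name):

```yaml
env:
  - name: DD_LOGS_CONFIG_PROCESSING_RULES
    # Rule name and pattern are illustrative; the JSON value applies globally
    # to all logs collected by the Agent
    value: '[{"type": "mask_sequences", "name": "mask_user_email", "replace_placeholder": "masked_user@example.com", "pattern": "\\w+@example.com"}]'
```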
