* [Define global processing rules](#global-processing-rules)
To apply a processing rule to all logs collected by a Datadog Agent, see the [Global processing rules](#global-processing-rules) section.
**Notes**:
- If you set up multiple processing rules, they are applied sequentially and each rule is applied on the result of the previous one.
- Processing rule patterns must conform to [Golang regexp syntax][2].
- The `log_processing_rules` parameter is used in integration configurations to customize your log collection configuration. While in the Agent's [main configuration][5], the `processing_rules` parameter is used to define global processing rules.
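
A minimal sketch of that distinction (the file path and rule names are illustrative):

```yaml
## Integration configuration, for example conf.d/java.d/conf.yaml:
logs:
  - type: file
    path: /my/test/file.log
    log_processing_rules:
      - type: exclude_at_match
        name: exclude_datadoghq_users
        pattern: \w+@datadoghq.com

## Agent main configuration (datadog.yaml), applied to all collected logs:
logs_config:
  processing_rules:
    - type: exclude_at_match
      name: exclude_datadoghq_users
      pattern: \w+@datadoghq.com
```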
## Filter logs
To send only a specific subset of logs to Datadog, use the `log_processing_rules` parameter in your configuration file with the `exclude_at_match` or `include_at_match` type.
| Parameter | Description |
|---|---|
| `exclude_at_match` | If the specified pattern is contained in the message, the log is excluded and not sent to Datadog. |
For example, to **filter out** logs that contain a Datadog email address, use the following `log_processing_rules`:
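
A minimal sketch of such a configuration, assuming a file-tailing integration (the `path`, `service`, and `source` values are illustrative):

```yaml
logs:
  - type: file
    path: /my/test/file.log
    service: cardpayment
    source: java
    log_processing_rules:
      - type: exclude_at_match
        ## A rule needs a type, a name, and a pattern
        name: exclude_datadoghq_users
        pattern: \w+@datadoghq.com
```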
| Parameter | Description |
|---|---|
| `include_at_match` | Only logs with a message that includes the specified pattern are sent to Datadog. If multiple `include_at_match` rules are defined, all rule patterns must match in order for the log to be included. |
For example, use the following `log_processing_rules` configuration to **filter in** logs that contain a Datadog email address:
{{< tabs >}}
147
148
{{% tab "Configuration file" %}}

```yaml
logs:
  - log_processing_rules:
      - type: include_at_match
        ## Include all logs containing a Datadog email address
        name: include_datadoghq_users
        pattern: \w+@datadoghq.com
```
If you want to match one or more patterns, you must define them in a single expression:
```yaml
logs:
  - log_processing_rules:
      - type: include_at_match
        ## Include all logs matching "abc" or "123"
        name: include_at_match_patterns
        pattern: abc|123
```
If the patterns are too long to fit legibly on a single line, you can break them into multiple lines:
```yaml
logs:
  - log_processing_rules:
      - type: include_at_match
        name: include_at_match_patterns
        ## In a double-quoted scalar, a trailing backslash joins the lines
        pattern: "abc\
          |123\
          |\\w+@datadoghq.com"
```
{{% /tab %}}
{{% tab "Docker" %}}
In a Docker environment, use the label `com.datadoghq.ad.logs` on the container that is sending the logs you want to filter, to specify the `log_processing_rules`. For example:
```yaml
labels:
  ## The source and service values are illustrative
  com.datadoghq.ad.logs: '[{"source": "java", "service": "cardpayment", "log_processing_rules": [{"type": "exclude_at_match", "name": "exclude_datadoghq_users", "pattern": "\\w+@datadoghq.com"}]}]'
```
{{% /tab %}}
{{% tab "Kubernetes" %}}
In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`. For example:
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: cardpayment
spec:
  selector:
    matchLabels:
      app: cardpayment
  template:
    metadata:
      annotations:
        ## The annotation key must reference the container name below
        ad.datadoghq.com/cardpayment.logs: >-
          [{"source": "java", "service": "cardpayment", "log_processing_rules":
          [{"type": "exclude_at_match", "name": "exclude_datadoghq_users",
          "pattern": "\\w+@datadoghq.com"}]}]
      labels:
        app: cardpayment
    spec:
      containers:
        - name: cardpayment
          image: cardpayment:latest
```

{{% /tab %}}
{{< /tabs >}}
## Scrub sensitive data from your logs
If your logs contain sensitive information that needs redacting, configure the Datadog Agent to scrub sensitive sequences by using the `log_processing_rules` parameter in your configuration file with the `mask_sequences` type.
This replaces all matched groups with the value of the `replace_placeholder` parameter.
For example, to redact credit card numbers:
{{< tabs >}}
{{% tab "Configuration file" %}}
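
A minimal sketch of such a configuration, assuming a file-tailing integration (the `path`, `service`, `source` values and the card-number regex are illustrative):

```yaml
logs:
  - type: file
    path: /my/test/file.log
    service: cardpayment
    source: java
    log_processing_rules:
      - type: mask_sequences
        name: mask_credit_cards
        replace_placeholder: "[masked_credit_card]"
        ## Matches common 16-digit card formats; adapt the pattern to your data
        pattern: (?:\d{4}[-\s]?){3}\d{4}
```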
{{% /tab %}}
{{% tab "Docker" %}}
In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`. For example:
```yaml
labels:
  ## The source, service, and card-number regex are illustrative
  com.datadoghq.ad.logs: '[{"source": "java", "service": "cardpayment", "log_processing_rules": [{"type": "mask_sequences", "name": "mask_credit_cards", "replace_placeholder": "[masked_credit_card]", "pattern": "(?:\\d{4}[-\\s]?){3}\\d{4}"}]}]'
```
{{% /tab %}}
{{% tab "Kubernetes" %}}
In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`. For example:
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: cardpayment
spec:
  selector:
    matchLabels:
      app: cardpayment
  template:
    metadata:
      annotations:
        ## The annotation key must reference the container name below
        ad.datadoghq.com/cardpayment.logs: >-
          [{"source": "java", "service": "cardpayment", "log_processing_rules":
          [{"type": "mask_sequences", "name": "mask_credit_cards",
          "replace_placeholder": "[masked_credit_card]",
          "pattern": "(?:\\d{4}[-\\s]?){3}\\d{4}"}]}]
      labels:
        app: cardpayment
    spec:
      containers:
        - name: cardpayment
          image: cardpayment:latest
```

{{% /tab %}}
{{< /tabs >}}
## Multi-line aggregation
If your logs are not sent in JSON and you want to aggregate several lines into a single entry, configure the Datadog Agent to detect a new log using a specific regex pattern instead of having one log per line. Use the `multi_line` type in the `log_processing_rules` parameter to aggregate all lines into a single entry until the given pattern is detected again.
For example, every Java log line starts with a timestamp in `yyyy-dd-mm` format. These lines include a stack trace that can be sent as two logs:
{{< tabs >}}
{{% tab "Configuration file" %}}
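
A minimal sketch of such a rule for the Java case above, keyed on a leading date (the `path`, `service`, and `source` values are illustrative):

```yaml
logs:
  - type: file
    path: /var/log/app/java.log
    service: testApp
    source: java
    log_processing_rules:
      - type: multi_line
        name: new_log_start_with_date
        ## A new log entry starts with a yyyy-dd-mm date
        pattern: \d{4}\-(0?[1-9]|1[012])\-(0?[1-9]|[12][0-9]|3[01])
```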
{{% /tab %}}
{{% tab "Docker" %}}
In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`. For example:
```yaml
labels:
  ## The source and service values are illustrative
  com.datadoghq.ad.logs: '[{"source": "postgresql", "service": "database", "log_processing_rules": [{"type": "multi_line", "name": "log_start_with_date", "pattern": "\\d{4}-(0?[1-9]|1[012])-(0?[1-9]|[12][0-9]|3[01])"}]}]'
```
{{% /tab %}}
{{% tab "Kubernetes" %}}
In a Kubernetes environment, use the pod annotation `ad.datadoghq.com` on your pod to specify the `log_processing_rules`. For example:
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: postgres
spec:
  selector:
    matchLabels:
      app: database
  template:
    metadata:
      annotations:
        ## The annotation key must reference the container name below
        ad.datadoghq.com/postgres.logs: >-
          [{"source": "postgresql", "service": "database", "log_processing_rules":
          [{"type": "multi_line", "name": "log_start_with_date",
          "pattern": "\\d{4}-(0?[1-9]|1[012])-(0?[1-9]|[12][0-9]|3[01])"}]}]
      labels:
        app: database
    spec:
      containers:
        - name: postgres
          image: postgres:latest
```

{{% /tab %}}
{{< /tabs >}}
In a Docker environment, use the label `com.datadoghq.ad.logs` on your container to specify the `log_processing_rules`. For example:
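
A sketch of such a label in Compose-style YAML (the `source`, `service`, and the rule shown are illustrative):

```yaml
labels:
  com.datadoghq.ad.logs: '[{"source": "java", "service": "myapp", "log_processing_rules": [{"type": "exclude_at_match", "name": "exclude_healthcheck", "pattern": "healthcheck"}]}]'
```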
Use the `env` parameter in the Helm chart to set the `DD_LOGS_CONFIG_PROCESSING_RULES` environment variable to configure global processing rules. For example:
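
A sketch of the corresponding Helm values, assuming the chart's `datadog.env` list (the rule shown is illustrative):

```yaml
datadog:
  env:
    - name: DD_LOGS_CONFIG_PROCESSING_RULES
      value: '[{"type": "mask_sequences", "name": "mask_user_email", "replace_placeholder": "MASKED_EMAIL", "pattern": "\\w+@datadoghq.com"}]'
```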