
Commit 681b6f7

Author: Pierre Guceski
Merge pull request DataDog#6715 from DataDog/cswatt/container-logs
[DOCS-406] add openshift to container logs and note on container logs to agent page
2 parents (9f84ceb + a8384e8), commit 681b6f7

18 files changed: +136 -137 lines changed

content/en/account_management/billing/_index.md

Lines changed: 1 addition & 1 deletion
@@ -61,6 +61,7 @@ You can set specific email addresses to receive invoices on the [Plan][10] page
 {{< nextlink href="account_management/billing/vsphere/" >}}vSphere integration{{< /nextlink >}}
 {{< /whatsnext >}}

+
 [1]: https://app.datadoghq.com/account/usage/hourly
 [2]: /infrastructure
 [3]: /agent
@@ -71,4 +72,3 @@ You can set specific email addresses to receive invoices on the [Plan][10] page
 [8]: https://app.datadoghq.com/account/billing_history
 [9]: mailto:billing@datadoghq.com
 [10]: https://app.datadoghq.com/account/billing
-

content/en/account_management/billing/custom_metrics.md

Lines changed: 25 additions & 25 deletions
@@ -160,24 +160,24 @@ These allocations are counted across your entire infrastructure. For example, if

 {{< img src="account_management/billing/custom_metrics/host_custom_metrics.png" alt="host_custom_metrics" >}}

-The billable number of custom metrics is based on the average number of custom metrics (from all paid hosts) per hour over a given month. Contact [Sales][8] or your [Customer Success][26] Manager to discuss custom metrics for your account or to purchase an additional custom metrics package.
+The billable number of custom metrics is based on the average number of custom metrics (from all paid hosts) per hour over a given month. Contact [Sales][8] or your [Customer Success][9] Manager to discuss custom metrics for your account or to purchase an additional custom metrics package.

 ## Standard integrations

 The following standard integrations can potentially emit custom metrics.

 | Type of integrations | Integrations |
 |------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------|
-| Limited to 350 custom metrics by default. | [ActiveMQ XML][9] / [Go-Expvar][10] / [Java-JMX][11] |
-| No default limit on custom metrics collection. | [Directory][12] /[Linux Proc Extras][13] /[Nagios][14] /[PDH Check][15] /[Prometheus][16] /[SNMP][17] /[Windows Services][18] /[WMI][19] |
-| Can be configured to collect custom metrics. | [MySQL][20] /[Oracle][21] /[Postgres][22] /[SQL Server][23] |
-| Custom metrics sent from cloud integrations | [AWS][24] |
+| Limited to 350 custom metrics by default. | [ActiveMQ XML][10] / [Go-Expvar][11] / [Java-JMX][12] |
+| No default limit on custom metrics collection. | [Directory][13] /[Linux Proc Extras][14] /[Nagios][15] /[PDH Check][16] /[Prometheus][17] /[SNMP][18] /[Windows Services][19] /[WMI][20] |
+| Can be configured to collect custom metrics. | [MySQL][21] /[Oracle][22] /[Postgres][23] /[SQL Server][24] |
+| Custom metrics sent from cloud integrations | [AWS][25] |

 ## Troubleshooting

-For technical questions, contact [Datadog support][25].
+For technical questions, contact [Datadog support][26].

-For billing questions, contact your [Customer Success][26] Manager.
+For billing questions, contact your [Customer Success][9] Manager.

 [1]: /integrations
 [2]: /developers/metrics/custom_metrics
@@ -187,21 +187,21 @@ For billing questions, contact your [Customer Success][26] Manager.
 [6]: /account_management/billing/usage_details
 [7]: https://app.datadoghq.com/metric/summary
 [8]: mailto:sales@datadoghq.com
-[9]: /integrations/activemq/#activemq-xml-integration
-[10]: /integrations/go_expvar
-[11]: /integrations/java/
-[12]: /integrations/directory
-[13]: /integrations/linux_proc_extras
-[14]: /integrations/nagios
-[15]: /integrations/pdh_check
-[16]: /integrations/prometheus
-[17]: /integrations/snmp
-[18]: /integrations/windows_service
-[19]: /integrations/wmi_check
-[20]: /integrations/mysql
-[21]: /integrations/oracle
-[22]: /integrations/postgres
-[23]: /integrations/sqlserver
-[24]: /integrations/amazon_web_services
-[25]: /help
-[26]: mailto:success@datadoghq.com
+[9]: mailto:success@datadoghq.com
+[10]: /integrations/activemq/#activemq-xml-integration
+[11]: /integrations/go_expvar
+[12]: /integrations/java/
+[13]: /integrations/directory
+[14]: /integrations/linux_proc_extras
+[15]: /integrations/nagios
+[16]: /integrations/pdh_check
+[17]: /integrations/prometheus
+[18]: /integrations/snmp
+[19]: /integrations/windows_service
+[20]: /integrations/wmi_check
+[21]: /integrations/mysql
+[22]: /integrations/oracle
+[23]: /integrations/postgres
+[24]: /integrations/sqlserver
+[25]: /integrations/amazon_web_services
+[26]: /help

content/en/agent/basic_agent_usage/windows.md

Lines changed: 3 additions & 4 deletions
@@ -76,16 +76,15 @@ Each configuration item is added as a property to the command line. The followin
 | `PROXY_PORT` | Number | If using a proxy, sets your proxy port. [Learn more about using a proxy with the Datadog Agent][7]. |
 | `PROXY_USER` | String | If using a proxy, sets your proxy user. [Learn more about using a proxy with the Datadog Agent][7]. |
 | `PROXY_PASSWORD` | String | If using a proxy, sets your proxy password. For the process/container Agent, this variable is required for passing in an authentication password and cannot be renamed. [Learn more about using a proxy with the Datadog Agent][7]. |
-| `DDAGENTUSER_NAME` | String | Override the default `ddagentuser` username used during Agent installation _(v6.11.0+)_. [Learn more about the Datadog Windows Agent User][3]. |
-| `DDAGENTUSER_PASSWORD` | String | Override the cryptographically secure password generated for the `ddagentuser` user during Agent installation _(v6.11.0+)_. Must be provided for installs on domain servers. [Learn more about the Datadog Windows Agent User][3]. |
+| `DDAGENTUSER_NAME` | String | Override the default `ddagentuser` username used during Agent installation _(v6.11.0+)_. [Learn more about the Datadog Windows Agent User][2]. |
+| `DDAGENTUSER_PASSWORD` | String | Override the cryptographically secure password generated for the `ddagentuser` user during Agent installation _(v6.11.0+)_. Must be provided for installs on domain servers. [Learn more about the Datadog Windows Agent User][2]. |
 | `APPLICATIONDATADIRECTORY` | Path | Override the directory to use for the configuration file directory tree. May only be provided on initial install; not valid for upgrades. Default: `C:\ProgramData\Datadog`. _(v6.11.0+)_ |
 | `PROJECTLOCATION` | Path | Override the directory to use for the binary file directory tree. May only be provided on initial install; not valid for upgrades. Default: `%PROGRAMFILES%\Datadog\Datadog Agent`. _(v6.11.0+)_ |

 **Note**: If a valid `datadog.yaml` is found and has an API key configured, that file takes precedence over all specified command line options.

 [1]: https://s3.amazonaws.com/ddagent-windows-stable/datadog-agent-7-latest.amd64.msi
-[2]: /agent/proxy
-[3]: /agent/faq/windows-agent-ddagent-user
+[2]: /agent/faq/windows-agent-ddagent-user
 {{% /tab %}}
 {{% tab "Upgrading" %}}

content/en/agent/kubernetes/daemonset_setup.md

Lines changed: 1 addition & 1 deletion
@@ -242,7 +242,7 @@ The Agent has then two ways to collect logs: from the Docker socket, and from th
 * Docker is not the runtime
 * More than 10 containers are used within each pod

-The Docker API is optimized to get logs from one container at a time. When there are many containers in the same pod, collecting logs through the Docker socket might be consuming much more resources than going through the files:
+The Docker API is optimized to get logs from one container at a time. When there are many containers on the same host, collecting logs through the Docker socket can consume far more resources and impact your running applications. Therefore, Datadog recommends the K8s file method.

 {{< tabs >}}
 {{% tab "K8s File" %}}

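The paragraph added in this hunk recommends the file-based ("K8s File") method over the Docker socket. For reference, that method is typically enabled in the Agent DaemonSet with settings along these lines; this is a minimal sketch, and the volume names and host paths are assumptions to verify against your own manifest.

```yaml
# Sketch: log-collection excerpt for the datadog-agent container in the DaemonSet.
# Volume names and host paths are illustrative assumptions.
env:
  - name: DD_LOGS_ENABLED
    value: "true"
  - name: DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL
    value: "true"
volumeMounts:
  - name: pointerdir
    mountPath: /opt/datadog-agent/run   # keeps file offsets across Agent restarts
  - name: logpodpath
    mountPath: /var/log/pods            # tail pod log files instead of the Docker socket
    readOnly: true
```

The matching `volumes` entries would be `hostPath` mounts for the same two directories.
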
content/en/agent/logs/_index.md

Lines changed: 10 additions & 12 deletions
@@ -46,9 +46,9 @@ To send logs with environment variables, configure the following:

 For more details about the compression performance and batching size, refer to the [HTTPS section][2].

+
 [1]: /agent/guide/agent-configuration-files
 [2]: /agent/logs/#send-logs-over-https
-
 {{% /tab %}}
 {{% tab "HTTP uncompressed" %}}

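For context on the compression and batching behavior that the HTTPS section linked above describes, a `datadog.yaml` sketch might look like the following; the values shown are illustrative assumptions, not recommendations.

```yaml
# Sketch: datadog.yaml options for sending logs over HTTPS with compression.
# Values below are illustrative assumptions.
logs_enabled: true
logs_config:
  use_http: true           # send logs over HTTPS instead of TCP
  use_compression: true    # gzip payloads before sending
  compression_level: 6     # higher levels trade CPU for smaller payloads
  batch_wait: 5            # max seconds to wait while filling a batch
```
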
@@ -91,19 +91,17 @@ To collect logs for a given integration, uncomment the logs section in that inte
 Consult the <a href="/integrations/#cat-log-collection">list of supported integrations</a> that include out of the box log configurations.
 </div>

-See the setup instructions for [Kubernetes][2] or [Docker][3] if you are using a containerized infrastructure.
-
-If an integration does not support logs by default, use the custom log collection.
+If you're using Kubernetes, make sure to [enable log collection in your DaemonSet setup][2]. If you're using Docker, [enable log collection for the containerized Agent][3]. For more information about log collection from containerized environments, refer to the [Container Log Collection][4] documentation. If an integration does not support logs by default, use the custom log collection.

 ## Custom log collection

 Datadog Agent v6 can collect logs and forward them to Datadog from files, the network (TCP or UDP), journald, and Windows channels:

-1. Create a new `<CUSTOM_LOG_SOURCE>.d/` folder in the `conf.d/` directory at the root of your [Agent's configuration directory][4].
+1. Create a new `<CUSTOM_LOG_SOURCE>.d/` folder in the `conf.d/` directory at the root of your [Agent's configuration directory][5].
 2. Create a new `conf.yaml` file in this new folder.
 3. Add a custom log collection configuration group with the parameters below.
-4. [Restart your Agent][5] to take into account this new configuration.
-5. Run the [Agent's status subcommand][6] and look for `<CUSTOM_LOG_SOURCE>` under the Checks section.
+4. [Restart your Agent][6] to take into account this new configuration.
+5. Run the [Agent's status subcommand][7] and look for `<CUSTOM_LOG_SOURCE>` under the Checks section.

 Below are examples of custom log collection setup:

@@ -209,12 +207,12 @@ List of all available parameters for log collection:
 | `port` | Yes | If `type` is **tcp** or **udp**, set the port for listening to logs. |
 | `path` | Yes | If `type` is **file** or **journald**, set the file path for gathering logs. |
 | `channel_path` | Yes | If `type` is **windows_event**, list the Windows event channels for collecting logs. |
-| `service` | Yes | The name of the service owning the log. If you instrumented your service with [Datadog APM][7], this must be the same service name. |
-| `source` | Yes | The attribute that defines which integration is sending the logs. If the logs do not come from an existing integration, then this field may include a custom source name. However, it is recommended that you match this value to the namespace of any related [custom metrics][8] you are collecting, for example: `myapp` from `myapp.request.count`. |
+| `service` | Yes | The name of the service owning the log. If you instrumented your service with [Datadog APM][8], this must be the same service name. |
+| `source` | Yes | The attribute that defines which integration is sending the logs. If the logs do not come from an existing integration, then this field may include a custom source name. However, it is recommended that you match this value to the namespace of any related [custom metrics][9] you are collecting, for example: `myapp` from `myapp.request.count`. |
 | `include_units` | No | If `type` is **journald**, list of the specific journald units to include. |
 | `exclude_units` | No | If `type` is **journald**, list of the specific journald units to exclude. |
 | `sourcecategory` | No | A multiple value attribute used to refine the source attribute, for example: `source:mongodb, sourcecategory:db_slow_logs`. |
-| `tags` | No | A list of tags added to each log collected ([learn more about tagging][9]). |
+| `tags` | No | A list of tags added to each log collected ([learn more about tagging][10]). |

 ## Send logs over HTTPS

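Putting the parameters from the table above together, a minimal `<CUSTOM_LOG_SOURCE>.d/conf.yaml` for tailing a file could look like this sketch; the path, service, source, and tags are placeholder values.

```yaml
# Sketch: <CUSTOM_LOG_SOURCE>.d/conf.yaml for a file-based custom log source.
# Path, service, source, and tags are placeholder values.
logs:
  - type: file
    path: /var/log/myapp/access.log
    service: myapp
    source: custom
    sourcecategory: http_web_access
    tags:
      - env:staging
```
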
@@ -268,8 +266,8 @@ When logs are sent through HTTPS, use the same [set of proxy settings][11] as th
 [1]: https://app.datadoghq.com/account/settings#agent
 [2]: /agent/kubernetes/daemonset_setup/#log-collection
 [3]: /agent/docker/log
-[4]: /agent/guide/agent-configuration-files
-[5]: /agent/guide/agent-commands/#start-stop-and-restart-the-agent
+[4]: /logs/log_collection/#container-log-collection
+[5]: /agent/guide/agent-configuration-files
 [6]: /agent/guide/agent-commands/#agent-status-and-information
 [7]: /tracing
 [8]: /developers/metrics/custom_metrics

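The hunk header above notes that logs sent through HTTPS use the same proxy settings as the other data types the Agent forwards. A minimal `datadog.yaml` sketch of that setup, with a placeholder proxy address:

```yaml
# Sketch: HTTPS log forwarding through the proxy already used for other Agent data.
# The proxy URL is a placeholder.
proxy:
  https: http://proxy.example.com:3128
logs_config:
  use_http: true
```
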
content/en/api/logs_pipelines/logs_pipelines.md

Lines changed: 6 additions & 6 deletions
@@ -7,16 +7,16 @@ external_redirect: /api/#logs-pipelines

 ## Logs Pipelines

-<mark>The Logs Pipelines endpoints are not supported in Datadog's client libraries. To request this functionality, contact [Datadog Support][1]. To set your pipeline permissions see the [Granting permissions within limited scopes][4] documentation.</mark>
+<mark>The Logs Pipelines endpoints are not supported in Datadog's client libraries. To request this functionality, contact [Datadog Support][1]. To set your pipeline permissions see the [Granting permissions within limited scopes][2] documentation.</mark>

 Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying.

-* See the [Pipelines Configuration Page][2] for a list of the pipelines and processors currently configured in our UI.
-* For more information about Pipelines, see the [Pipeline documentation][3].
+* See the [Pipelines Configuration Page][3] for a list of the pipelines and processors currently configured in our UI.
+* For more information about Pipelines, see the [Pipeline documentation][4].

 **Note**: These endpoints are only available for admin users. Make sure to use an application key created by an admin.

 [1]: /help
-[2]: https://app.datadoghq.com/logs/pipelines
-[3]: /logs/processing
-[4]: /account_management/faq/managing-global-role-permissions/#granting-permissions-within-limited-scopes
+[2]: /account_management/faq/managing-global-role-permissions/#granting-permissions-within-limited-scopes
+[3]: https://app.datadoghq.com/logs/pipelines
+[4]: /logs/processing
