
Commit 283ec3f

Merge pull request DataDog#6586 from DataDog/estib/logs-azure-archives
document instructions for how to configure and rehydrate from an azur…
2 parents 020ff31 + 4459970 commit 283ec3f

4 files changed (+31, -1 lines)

content/en/logs/archives/_index.md

Lines changed: 21 additions & 1 deletion
@@ -86,6 +86,26 @@ To add server side encryption to your S3 log archives, go to the **Properties**
 [6]: https://app.datadoghq.com/logs/pipelines/archives
 {{% /tab %}}

+{{% tab "Azure Storage" %}}
+
+1. Go to your [Azure Portal][1] and [create a storage account][2] to send your archives to. Give your storage account a name, any account kind, and select the **hot** access tier.
+2. Set up the [Azure integration][3] within the subscription that holds your new storage account, if you haven't already. This involves [creating an App Registration that Datadog can use][4] to integrate with.
+3. Next, grant your Datadog App sufficient permission to write to and rehydrate from your storage account. Select your storage account from the [Storage Accounts page][1], go to **Access Control (IAM)**, and select **Add -> Add Role Assignment**. Input the Role called **Storage Blob Data Contributor**, select the Datadog App that you created for integrating with Azure, and save.
+
+{{< img src="logs/archives/logs_azure_archive_permissions.png" alt="Add the Storage Blob Data Contributor role to your Datadog App." style="width:75%;">}}
+
+4. Go to your [Archives page][5] in Datadog, and select the **Add a new archive** option at the bottom. Only Datadog users with admin status can complete this and the following step.
+5. Select the **Azure Storage** archive type, and the Azure Tenant and Client for the Datadog App that has the Storage Blob Data Contributor role on your storage account. Input your storage account name and a container name for your archive.
+6. **Optional**: input a prefix directory for all the content of your log archives.
+7. Save your archive.
+
+{{< img src="logs/archives/logs_azure_archive_configs.png" alt="Set your Azure storage account info in Datadog" style="width:75%;">}}
+
+[1]: https://portal.azure.com/#blade/HubsExtension/BrowseResource/resourceType/Microsoft.Storage%2FStorageAccounts
+[2]: https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create?tabs=azure-portal
+[3]: https://app.datadoghq.com/account/settings#integrations/azure
+[4]: /integrations/azure/?tab=azurecliv20#integrating-through-the-azure-portal
+[5]: https://app.datadoghq.com/logs/pipelines/archives
+{{% /tab %}}
+
 {{% tab "Google Cloud Storage" %}}

1. Go to your [GCP account][1] and [create a GCS bucket][2] to send your archives to. Under "Choose how to control access to objects", select "Set object-level and bucket-level permissions."
@@ -95,7 +115,7 @@ To add server side encryption to your S3 log archives, go to the **Properties**
 4. Go to your [Archives page][7] in Datadog, and select the **Add a new archive** option at the bottom. Only Datadog users with admin status can complete this and the following step.
 5. Select the GCS archive type, and the GCS Service Account that has permissions to write on your storage bucket. Input your bucket name. Optional: input a prefix directory for all the content of your log archives. Then save your archive.

-{{< img src="logs/archives/archive_select_gcs.png" alt="Add the Storage Object Creator role to your Datadogh GCP Service Account." style="width:75%;">}}
+{{< img src="logs/archives/archive_select_gcs.png" alt="Set your GCP bucket info in Datadog" style="width:75%;">}}

 [1]: https://console.cloud.google.com/storage
 [2]: https://cloud.google.com/storage/docs/quickstart-console
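
The Azure Storage tab added above hinges on two things: the Datadog App Registration holds the **Storage Blob Data Contributor** role on the storage account (step 3), and Datadog is given that app's tenant and client IDs plus the account and container names (step 5). Before saving the archive, one way to confirm the role assignment took effect is to write a test blob to the container with the same app credentials. A minimal sketch using the Azure SDK for Python, where every ID, name, and secret is a placeholder for your own values:

```python
# Sanity check: can the Datadog App Registration write to the archive container?
# Requires: pip install azure-identity azure-storage-blob
# All IDs and names below are placeholders -- substitute your own values.
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

credential = ClientSecretCredential(
    tenant_id="<azure-tenant-id>",          # tenant entered in the Datadog archive form
    client_id="<datadog-app-client-id>",    # client (app) ID of the Datadog App Registration
    client_secret="<datadog-app-secret>",
)

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=credential,
)

# Storage Blob Data Contributor covers both writes (archiving) and reads (rehydration),
# so a successful upload here means the role assignment from step 3 is in place.
container = service.get_container_client("<archive-container>")
container.upload_blob("datadog-archive-write-check.txt", b"ok", overwrite=True)
print("Write check succeeded")
```

Azure role assignments can take a few minutes to propagate, so an authorization error immediately after step 3 does not necessarily mean the assignment is wrong.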

content/en/logs/archives/rehydrating.md

Lines changed: 10 additions & 0 deletions
@@ -108,6 +108,16 @@ Datadog only supports rehydrating from archives that have been configured to use
 [3]: https://app.datadoghq.com/logs/pipelines/archives
 {{% /tab %}}

+{{% tab "Azure Storage" %}}
+
+Datadog uses an Azure AD group with the Storage Blob Data Contributor role scoped to your archives' storage account to rehydrate log events. You can grant this role to your Datadog service account from your storage account's Access Control (IAM) page by [assigning the Storage Blob Data Contributor role to your Datadog integration app][1].
+
+{{< img src="logs/archives/logs_azure_archive_permissions.png" alt="Rehydration from Azure Storage requires the Storage Blob Data Contributor role" style="width:75%;">}}
+
+[1]: /logs/archives/?tab=azurestorage#create-and-configure-a-storage-bucket
+
+{{% /tab %}}
+
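
This is the rehydration counterpart of step 3 in the archive setup: the same **Storage Blob Data Contributor** assignment covers both writing archives and reading them back. If you would rather script the grant than use the portal's Access Control (IAM) blade, a sketch along these lines with the `azure-mgmt-authorization` package should work; the subscription, resource group, storage account, and the Datadog app's service principal object ID are all placeholders:

```python
# Grant Storage Blob Data Contributor on the archive storage account to the
# Datadog integration app's service principal (scripted alternative to the portal).
# Requires: pip install azure-identity azure-mgmt-authorization
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
SCOPE = (
    f"/subscriptions/{SUBSCRIPTION_ID}"
    "/resourceGroups/<resource-group>"  # placeholder
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
# Built-in role definition GUID for Storage Blob Data Contributor
ROLE_DEFINITION_ID = (
    f"{SCOPE}/providers/Microsoft.Authorization/roleDefinitions/"
    "ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

client.role_assignments.create(
    scope=SCOPE,
    role_assignment_name=str(uuid.uuid4()),  # assignment names must be GUIDs
    parameters=RoleAssignmentCreateParameters(
        role_definition_id=ROLE_DEFINITION_ID,
        principal_id="<datadog-app-service-principal-object-id>",  # placeholder
    ),
)
```

Scoping the assignment to the storage account (rather than the subscription) keeps the Datadog app's access limited to the archive data it needs.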
 {{% tab "Google Cloud Storage" %}}

In order to rehydrate log events from your archives, Datadog uses a service account with the Storage Object Viewer role. You can grant this role to your Datadog service account from the [GCP IAM Admin page][1] by editing the service account's permissions, adding another role, and then selecting Storage > Storage Object Viewer.
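
The GCS case works the same way: rehydration only needs read access, which the Storage Object Viewer role provides. Assuming you still have the key file for the service account used by the Datadog integration, one way to confirm the grant is to list a few objects in the archive bucket with that key. A minimal sketch with the `google-cloud-storage` client, where the key file path, project, and bucket name are placeholders:

```python
# Confirm the Datadog GCP service account can read the archive bucket
# (Storage Object Viewer grants storage.objects.list / storage.objects.get).
# Requires: pip install google-cloud-storage
from google.cloud import storage
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "datadog-service-account.json"  # placeholder: key for the Datadog service account
)
client = storage.Client(project="<gcp-project-id>", credentials=credentials)

# Listing succeeds only if Storage Object Viewer (or a broader role) is in place.
for blob in client.list_blobs("<archive-bucket>", max_results=5):
    print(blob.name)
```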
2 image files changed: 144 KB and 325 KB

0 commit comments
