
Commit 7408e7a

alaudazzis and hmsr authored
[Phase 1] Rename "Requirements" into "What do I need to use this integration?" (#15320)
* Apply the new documentation guidelines
* Update admin_by_request_epm
* Update amazon_security_lake
* Update apache_spark
* Update armis
* Update authentik
* Update apigateway
* Update awshealth
* Update billing
* Update cloudfront
* Update cloudtrail
* Update cloudwatch
* Update config
* Update dynamodb
* More edits on dynamodb
* Update ebs
* Update ec2
* Update ecs
* Update elb
* Update emr
* Update firewall
* Update guardduty
* Update kafka
* Update more aws packages
* Update more aws packages
* Add CL entries and bump version in manifest
* Update changelog entry
* Create a new paragraph in the Agentless section

Co-authored-by: subham sarkar <subham.sarkar@elastic.co>
1 parent 1c2a5e5 commit 7408e7a


102 files changed: +653 -657 lines

packages/abnormal_security/_dev/build/docs/README.md

Lines changed: 11 additions & 12 deletions
@@ -4,7 +4,7 @@ Abnormal AI is a behavioral AI-based email security platform that learns the beh
 
 The Abnormal AI integration collects data for AI Security Mailbox (formerly known as Abuse Mailbox), Audit, Case, and Threat logs using REST API.
 
-## Data streams
+## What data does this integration collect?
 
 The Abnormal AI integration collects six types of logs:
 
@@ -20,13 +20,13 @@ The Abnormal AI integration collects six types of logs:
 
 - **[Vendor Case](https://app.swaggerhub.com/apis-docs/abnormal-security/abx/1.4.3#/Vendors)** - Get details of Abnormal Vendor Cases.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
-## Setup
+## How do I deploy this integration?
 
-### To collect data from the Abnormal AI Client API:
+### Collect data from the Abnormal AI Client API
 
 #### Step 1: Go to Portal
 * Visit the [Abnormal AI Portal](https://portal.abnormalsecurity.com/home/settings/integrations) and click on the `Abnormal REST API` setting.
@@ -37,18 +37,17 @@ Elastic Agent must be installed. For more details, check the Elastic Agent [inst
 #### Step 3: IP allowlisting
 * Abnormal AI requires you to restrict API access based on source IP. So in order for the integration to work, user needs to update the IP allowlisting to include the external source IP of the endpoint running the integration via Elastic Agent.
 
-### Enabling the integration in Elastic:
+### Enable the integration in Elastic
 
-1. In Kibana navigate to Management > Integrations.
-2. In "Search for integrations" top bar, search for `Abnormal AI`.
-3. Select the "Abnormal AI" integration from the search results.
-4. Select "Add Abnormal AI" to add the integration.
-5. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
-6. Select "Save and continue" to save the integration.
+1. In Kibana navigate to **Management** > **Integrations**.
+2. In the search bar, type **Abnormal AI**.
+3. Select the **Abnormal AI** integration and add it.
+4. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
+5. Save the integration.
 
 **Note**: By default, the URL is set to `https://api.abnormalplatform.com`. We have observed that Abnormal AI Base URL changes based on location so find your own base URL.
 
-### Enabling enrichment for Threat events
+### Enable enrichment for Threat events
 
 Introduced in version 1.8.0, the Abnormal AI integration includes a new option called `Enable Attachments and Links enrichment` for the Threat data stream. When enabled, this feature enriches incoming threat events with additional details about any attachments and links included in the original message.

packages/abnormal_security/changelog.yml

Lines changed: 5 additions & 0 deletions
@@ -1,4 +1,9 @@
 # newer versions go on top
+- version: "1.11.0"
+  changes:
+    - description: Improve documentation to align with new guidelines.
+      type: enhancement
+      link: https://github.com/elastic/integrations/pull/15320
 - version: "1.10.1"
   changes:
     - description: Fix the precision of large integers in the `ai_security_mailbox_not_analyzed` data stream.
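The two version bumps in this commit (1.10.1 to 1.11.0 and, later, 1.0.0 to 1.1.0) both follow the same semantic-versioning rule for an enhancement-type change: increment the minor component and reset the patch. A minimal sketch of that rule, purely illustrative and not part of the repo's tooling:

```python
def bump_minor(version: str) -> str:
    """Bump the minor component of a MAJOR.MINOR.PATCH string and reset the patch."""
    major, minor, _patch = (int(part) for part in version.split("."))
    return f"{major}.{minor + 1}.0"

# The two bumps applied in this commit:
print(bump_minor("1.10.1"))  # 1.11.0  (abnormal_security)
print(bump_minor("1.0.0"))   # 1.1.0   (admin_by_request_epm)
```

Each bump is paired with a matching changelog entry at the top of the package's changelog.yml, since newer versions go on top.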

packages/abnormal_security/docs/README.md

Lines changed: 11 additions & 12 deletions
@@ -4,7 +4,7 @@ Abnormal AI is a behavioral AI-based email security platform that learns the beh
 
 The Abnormal AI integration collects data for AI Security Mailbox (formerly known as Abuse Mailbox), Audit, Case, and Threat logs using REST API.
 
-## Data streams
+## What data does this integration collect?
 
 The Abnormal AI integration collects six types of logs:
 
@@ -20,13 +20,13 @@ The Abnormal AI integration collects six types of logs:
 
 - **[Vendor Case](https://app.swaggerhub.com/apis-docs/abnormal-security/abx/1.4.3#/Vendors)** - Get details of Abnormal Vendor Cases.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
-## Setup
+## How do I deploy this integration?
 
-### To collect data from the Abnormal AI Client API:
+### Collect data from the Abnormal AI Client API
 
 #### Step 1: Go to Portal
 * Visit the [Abnormal AI Portal](https://portal.abnormalsecurity.com/home/settings/integrations) and click on the `Abnormal REST API` setting.
@@ -37,18 +37,17 @@ Elastic Agent must be installed. For more details, check the Elastic Agent [inst
 #### Step 3: IP allowlisting
 * Abnormal AI requires you to restrict API access based on source IP. So in order for the integration to work, user needs to update the IP allowlisting to include the external source IP of the endpoint running the integration via Elastic Agent.
 
-### Enabling the integration in Elastic:
+### Enable the integration in Elastic
 
-1. In Kibana navigate to Management > Integrations.
-2. In "Search for integrations" top bar, search for `Abnormal AI`.
-3. Select the "Abnormal AI" integration from the search results.
-4. Select "Add Abnormal AI" to add the integration.
-5. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
-6. Select "Save and continue" to save the integration.
+1. In Kibana navigate to **Management** > **Integrations**.
+2. In the search bar, type **Abnormal AI**.
+3. Select the **Abnormal AI** integration and add it.
+4. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
+5. Save the integration.
 
 **Note**: By default, the URL is set to `https://api.abnormalplatform.com`. We have observed that Abnormal AI Base URL changes based on location so find your own base URL.
 
-### Enabling enrichment for Threat events
+### Enable enrichment for Threat events
 
 Introduced in version 1.8.0, the Abnormal AI integration includes a new option called `Enable Attachments and Links enrichment` for the Threat data stream. When enabled, this feature enriches incoming threat events with additional details about any attachments and links included in the original message.

packages/abnormal_security/manifest.yml

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 format_version: 3.4.0
 name: abnormal_security
 title: Abnormal AI
-version: "1.10.1"
+version: "1.11.0"
 description: Collect logs from Abnormal AI with Elastic Agent.
 type: integration
 categories:

packages/admin_by_request_epm/_dev/build/docs/README.md

Lines changed: 5 additions & 5 deletions
@@ -2,7 +2,7 @@
 
 The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.com/en/endpoint-privilege-management) enables real-time monitoring and analysis of audit logging of privilege elevations, software installations and administrative actions through user portal. This integration collects, processes, and visualizes audit logs and events to enhance security posture, compliance, and operational efficiency.
 
-## Data Streams
+## What data does this integration collect?
 
 - **`auditlog`**: Provides audit data that includes elevation requests, approvals, application installations, and scan results.
   - [Auditlog](https://www.adminbyrequest.com/en/docs/auditlog-api) are records generated when user takes action such as installing a software, running an application with admin privileges, requesting for admin session, approval or denial of requests and scan results.
@@ -12,7 +12,7 @@ The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.co
   - [Events](https://www.adminbyrequest.com/en/docs/events-api) are records that are generated on various actions done by users and administrators. These include group modifications, policy changes, security violations, and other administrative activities.
   - This data stream leverages the Admin By Request EPM API [`/events`](https://www.adminbyrequest.com/en/docs/events-api) endpoint to retrieve data.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
@@ -36,7 +36,7 @@ Auditlog documents can be found by setting the following filter:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
@@ -52,10 +52,10 @@ Event documents can be found by setting the following filter:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
 {{fields "events"}}
 
-Events Data stream has field `eventCode` which is a unique identifier for each event type. Please refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.
+Events Data stream has field `eventCode` which is a unique identifier for each event type. Refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.

packages/admin_by_request_epm/changelog.yml

Lines changed: 5 additions & 0 deletions
@@ -1,3 +1,8 @@
+- version: "1.1.0"
+  changes:
+    - description: Improve documentation to align with new guidelines.
+      type: enhancement
+      link: https://github.com/elastic/integrations/pull/15320
 - version: "1.0.0"
   changes:
     - description: Release package as GA.

packages/admin_by_request_epm/docs/README.md

Lines changed: 5 additions & 5 deletions
@@ -2,7 +2,7 @@
 
 The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.com/en/endpoint-privilege-management) enables real-time monitoring and analysis of audit logging of privilege elevations, software installations and administrative actions through user portal. This integration collects, processes, and visualizes audit logs and events to enhance security posture, compliance, and operational efficiency.
 
-## Data Streams
+## What data does this integration collect?
 
 - **`auditlog`**: Provides audit data that includes elevation requests, approvals, application installations, and scan results.
   - [Auditlog](https://www.adminbyrequest.com/en/docs/auditlog-api) are records generated when user takes action such as installing a software, running an application with admin privileges, requesting for admin session, approval or denial of requests and scan results.
@@ -12,7 +12,7 @@ The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.co
   - [Events](https://www.adminbyrequest.com/en/docs/events-api) are records that are generated on various actions done by users and administrators. These include group modifications, policy changes, security violations, and other administrative activities.
   - This data stream leverages the Admin By Request EPM API [`/events`](https://www.adminbyrequest.com/en/docs/events-api) endpoint to retrieve data.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
@@ -135,7 +135,7 @@ An example event for `auditlog` looks as following:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
@@ -310,7 +310,7 @@ An example event for `events` looks as following:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
@@ -360,4 +360,4 @@ The following non-ECS fields are used in events documents:
 | input.type | Input type | keyword |
 
 
-Events Data stream has field `eventCode` which is a unique identifier for each event type. Please refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.
+Events Data stream has field `eventCode` which is a unique identifier for each event type. Refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.

packages/admin_by_request_epm/manifest.yml

Lines changed: 2 additions & 6 deletions
@@ -1,7 +1,7 @@
 format_version: 3.3.0
 name: admin_by_request_epm
 title: Admin By Request EPM
-version: "1.0.0"
+version: "1.1.0"
 source:
   license: "Elastic-2.0"
 description: "Collect logs from Admin By Request EPM with Elastic Agent."
@@ -60,11 +60,7 @@ policy_templates:
       required: false
       show_user: false
      description: >-
-        The request tracer logs requests and responses to the agent's local file-system for debugging configurations.
-        Enabling this request tracing compromises security and should only be used for debugging. Disabling the request
-        tracer will delete any stored traces.
-        See [documentation](https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-cel.html#_resource_tracer_enable)
-        for details.
+        The request tracer logs requests and responses to the agent's local file-system for debugging configurations. Enabling this request tracing compromises security and should only be used for debugging. Disabling the request tracer will delete any stored traces. See [documentation](https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-cel.html#_resource_tracer_enable) for details.
    - name: http_client_timeout
      type: text
      title: HTTP Client Timeout

packages/amazon_security_lake/_dev/build/docs/README.md

Lines changed: 29 additions & 32 deletions
@@ -12,58 +12,55 @@ The Amazon Security Lake integration can be used in two different modes to colle
 
 This module follows the OCSF Schema Version **v1.1.0**.
 
-## Data streams
+## What data does this integration collect?
 
 The Amazon Security Lake integration collects logs from both [Third-party services](https://docs.aws.amazon.com/security-lake/latest/userguide/integrations-third-party.html) and [AWS services](https://docs.aws.amazon.com/security-lake/latest/userguide/open-cybersecurity-schema-framework.html) in an event data stream.
 
-### **NOTE**:
+**NOTE**:
 - The Amazon Security Lake integration supports events collected from [AWS services](https://docs.aws.amazon.com/security-lake/latest/userguide/internal-sources.html) and [third-party services](https://docs.aws.amazon.com/security-lake/latest/userguide/custom-sources.html).
 
 - Due to the nature and structure of the OCSF schema, this integration has limitations on how deep the mappings run. Some important objects like 'Actor', 'User' and 'Product' have more fleshed-out mappings compared to others which get flattened after the initial 2-3 levels of nesting to keep them maintainable and stay within field mapping [limits](https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-settings-limit.html). This will evolve as needed.
 
-## Requirements
+## What do I need to use this integration?
 
-- Elastic Agent must be installed.
-- Elastic Agent is required to stream data from Amazon Security Lake and ship the data to Elastic, where the events will then be processed via the integration's ingest pipelines.
+Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md). Elastic Agent is required to stream data from Amazon Security Lake and ship the data to Elastic, where the events will then be processed via the integration's ingest pipelines.
 
 ## Setup
 
-### To collect data from Amazon Security Lake follow the below steps:
-
-1. To enable and start Amazon Security Lake, follow the steps mentioned here: [`https://docs.aws.amazon.com/security-lake/latest/userguide/getting-started.html`](https://docs.aws.amazon.com/security-lake/latest/userguide/getting-started.html).
-2. After creating the data lake, follow the steps below to create data subscribers to consume data.
-   - Open the [Security Lake console](https://console.aws.amazon.com/securitylake/).
-   - By using the AWS Region selector in the upper-right corner of the page, select the Region where you want to create the subscriber.
-   - In the navigation pane, choose **Subscribers**.
-   - On the Subscribers page, choose **Create subscriber**.
-   - For **Subscriber details**, enter **Subscriber name** and an optional Description.
-   - For **Log and event sources**, choose which sources the subscriber is authorized to consume.
-   - For **Data access method**, choose **S3** to set up data access for the subscriber.
-   - For **Subscriber credentials**, provide the subscriber's **AWS account ID** and **external ID**.
-   - For **Notification details**, select **SQS queue**.
-   - Choose Create.
-3. Above mentioned steps will create and provide the required details such as IAM roles/AWS role ID, external ID and queue URL to configure AWS Security Lake Integration.
-
-### Enabling the integration in Elastic:
-
-1. In Kibana go to Management > Integrations.
-2. In "Search for integrations" search bar, type Amazon Security Lake.
-![Search](../img/search.png)
-3. Click on the "Amazon Security Lake" integration from the search results.
-4. Click on the Add Amazon Security Lake Integration button to add the integration.
-![Home Page](../img/home_page.png)
-5. By default collect logs via S3 Bucket toggle will be off and collect logs for AWS SQS.
+### Collect data from Amazon Security Lake
+
+To enable and start Amazon Security Lake, refer to the [AWS getting started guide](https://docs.aws.amazon.com/security-lake/latest/userguide/getting-started.html).
+
+To create and provide the required details such as IAM roles/AWS role ID, external ID and queue URL to configure the AWS Security Lake integration, follow these steps:
+
+1. Open the [Security Lake console](https://console.aws.amazon.com/securitylake/).
+2. By using the AWS Region selector in the upper-right corner of the page, select the region where you want to create the subscriber.
+3. In the navigation pane, choose **Subscribers**.
+4. On the Subscribers page, choose **Create subscriber**.
+5. In **Subscriber details**, enter **Subscriber name** and an optional description.
+6. In **Log and event sources**, choose which sources the subscriber is authorized to consume.
+7. In **Data access method**, choose **S3** to set up data access for the subscriber.
+8. For **Subscriber credentials**, provide the subscriber's **AWS account ID** and **external ID**.
+9. For **Notification details**, select **SQS queue**.
+10. Click **Create**.
+
+### Enable the integration in Elastic
+
+1. In Kibana navigate to **Management** > **Integrations**.
+2. In the search bar, type **Amazon Security Lake**.
+3. Select the **Amazon Security Lake** integration and add it.
+4. By default, the "Collect logs via S3 Bucket" toggle is off and logs are collected via AWS SQS. Provide:
   - queue url
 ![Queue URL](../img/queue_url.png)
   - collect logs via S3 Bucket toggled off
   - role ARN
   - external id
 ![Role ARN and External ID](../img/role_arn_and_external_id.png)
-
-6. If you want to collect logs via AWS S3, then you have to put the following details:
+5. If you want to collect logs via AWS S3, then you have to put the following details:
  - bucket ARN or access point ARN
  - role ARN
  - external id
+6. Save the integration.
 
 **NOTE**:
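For reference, the subscriber details gathered in the AWS steps above are what the integration settings in Kibana ask for. A hypothetical sketch of those values (the keys and every value below are placeholders for illustration, not the integration's exact setting names):

```yaml
# Placeholder values — substitute the outputs of the "Create subscriber" step.
queue_url: "https://sqs.us-east-1.amazonaws.com/123456789012/AmazonSecurityLake-example-queue"
role_arn: "arn:aws:iam::123456789012:role/AmazonSecurityLake-subscriber-role"
external_id: "my-external-id"
```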