packages/abnormal_security/_dev/build/docs/README.md (11 additions, 12 deletions)
@@ -4,7 +4,7 @@ Abnormal AI is a behavioral AI-based email security platform that learns the beh
 
 The Abnormal AI integration collects data for AI Security Mailbox (formerly known as Abuse Mailbox), Audit, Case, and Threat logs using REST API.
 
-## Data streams
+## What data does this integration collect?
 
 The Abnormal AI integration collects six types of logs:
 
@@ -20,13 +20,13 @@ The Abnormal AI integration collects six types of logs:
 
 - **[Vendor Case](https://app.swaggerhub.com/apis-docs/abnormal-security/abx/1.4.3#/Vendors)** - Get details of Abnormal Vendor Cases.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
-## Setup
+## How do I deploy this integration?
 
-### To collect data from the Abnormal AI Client API:
+### Collect data from the Abnormal AI Client API
 
 #### Step 1: Go to Portal
 * Visit the [Abnormal AI Portal](https://portal.abnormalsecurity.com/home/settings/integrations) and click on the `Abnormal REST API` setting.
@@ -37,18 +37,17 @@ Elastic Agent must be installed. For more details, check the Elastic Agent [inst
 #### Step 3: IP allowlisting
 * Abnormal AI requires you to restrict API access based on source IP. So in order for the integration to work, user needs to update the IP allowlisting to include the external source IP of the endpoint running the integration via Elastic Agent.
 
-### Enabling the integration in Elastic:
+### Enable the integration in Elastic
 
-1. In Kibana navigate to Management > Integrations.
-2. In "Search for integrations" top bar, search for `Abnormal AI`.
-3. Select the "Abnormal AI" integration from the search results.
-4. Select "Add Abnormal AI" to add the integration.
-5. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
-6. Select "Save and continue" to save the integration.
+1. In Kibana navigate to **Management** > **Integrations**.
+2. In the search bar, type **Abnormal AI**.
+3. Select the **Abnormal AI** integration and add it.
+4. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
+5. Save the integration.
 
 **Note**: By default, the URL is set to `https://api.abnormalplatform.com`. We have observed that Abnormal AI Base URL changes based on location so find your own base URL.
 
-### Enabling enrichment for Threat events
+### Enable enrichment for Threat events
 
 Introduced in version 1.8.0, the Abnormal AI integration includes a new option called `Enable Attachments and Links enrichment` for the Threat data stream. When enabled, this feature enriches incoming threat events with additional details about any attachments and links included in the original message.
packages/abnormal_security/docs/README.md (11 additions, 12 deletions)
@@ -4,7 +4,7 @@ Abnormal AI is a behavioral AI-based email security platform that learns the beh
 
 The Abnormal AI integration collects data for AI Security Mailbox (formerly known as Abuse Mailbox), Audit, Case, and Threat logs using REST API.
 
-## Data streams
+## What data does this integration collect?
 
 The Abnormal AI integration collects six types of logs:
 
@@ -20,13 +20,13 @@ The Abnormal AI integration collects six types of logs:
 
 - **[Vendor Case](https://app.swaggerhub.com/apis-docs/abnormal-security/abx/1.4.3#/Vendors)** - Get details of Abnormal Vendor Cases.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
-## Setup
+## How do I deploy this integration?
 
-### To collect data from the Abnormal AI Client API:
+### Collect data from the Abnormal AI Client API
 
 #### Step 1: Go to Portal
 * Visit the [Abnormal AI Portal](https://portal.abnormalsecurity.com/home/settings/integrations) and click on the `Abnormal REST API` setting.
@@ -37,18 +37,17 @@ Elastic Agent must be installed. For more details, check the Elastic Agent [inst
 #### Step 3: IP allowlisting
 * Abnormal AI requires you to restrict API access based on source IP. So in order for the integration to work, user needs to update the IP allowlisting to include the external source IP of the endpoint running the integration via Elastic Agent.
 
-### Enabling the integration in Elastic:
+### Enable the integration in Elastic
 
-1. In Kibana navigate to Management > Integrations.
-2. In "Search for integrations" top bar, search for `Abnormal AI`.
-3. Select the "Abnormal AI" integration from the search results.
-4. Select "Add Abnormal AI" to add the integration.
-5. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
-6. Select "Save and continue" to save the integration.
+1. In Kibana navigate to **Management** > **Integrations**.
+2. In the search bar, type **Abnormal AI**.
+3. Select the **Abnormal AI** integration and add it.
+4. Add all the required integration configuration parameters, including Access Token, Interval, Initial Interval and Page Size to enable data collection.
+5. Save the integration.
 
 **Note**: By default, the URL is set to `https://api.abnormalplatform.com`. We have observed that Abnormal AI Base URL changes based on location so find your own base URL.
 
-### Enabling enrichment for Threat events
+### Enable enrichment for Threat events
 
 Introduced in version 1.8.0, the Abnormal AI integration includes a new option called `Enable Attachments and Links enrichment` for the Threat data stream. When enabled, this feature enriches incoming threat events with additional details about any attachments and links included in the original message.
packages/admin_by_request_epm/_dev/build/docs/README.md (5 additions, 5 deletions)
@@ -2,7 +2,7 @@
 
 The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.com/en/endpoint-privilege-management) enables real-time monitoring and analysis of audit logging of privilege elevations, software installations and administrative actions through user portal. This integration collects, processes, and visualizes audit logs and events to enhance security posture, compliance, and operational efficiency.
 
-## Data Streams
+## What data does this integration collect?
 
 - **`auditlog`**: Provides audit data that includes elevation requests, approvals, application installations, and scan results.
 - [Auditlog](https://www.adminbyrequest.com/en/docs/auditlog-api) are records generated when user takes action such as installing a software, running an application with admin privileges, requesting for admin session, approval or denial of requests and scan results.
@@ -12,7 +12,7 @@ The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.co
 - [Events](https://www.adminbyrequest.com/en/docs/events-api) are records that are generated on various actions done by users and administrators. These include group modifications, policy changes, security violations, and other administrative activities.
 - This data stream leverages the Admin By Request EPM API [`/events`](https://www.adminbyrequest.com/en/docs/events-api) endpoint to retrieve data.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
@@ -36,7 +36,7 @@ Auditlog documents can be found by setting the following filter:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
@@ -52,10 +52,10 @@ Event documents can be found by setting the following filter:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
 {{fields "events"}}
 
-Events Data stream has field `eventCode` which is a unique identifier for each event type. Please refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.
+Events Data stream has field `eventCode` which is a unique identifier for each event type. Refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.
packages/admin_by_request_epm/docs/README.md (5 additions, 5 deletions)
@@ -2,7 +2,7 @@
 
 The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.com/en/endpoint-privilege-management) enables real-time monitoring and analysis of audit logging of privilege elevations, software installations and administrative actions through user portal. This integration collects, processes, and visualizes audit logs and events to enhance security posture, compliance, and operational efficiency.
 
-## Data Streams
+## What data does this integration collect?
 
 - **`auditlog`**: Provides audit data that includes elevation requests, approvals, application installations, and scan results.
 - [Auditlog](https://www.adminbyrequest.com/en/docs/auditlog-api) are records generated when user takes action such as installing a software, running an application with admin privileges, requesting for admin session, approval or denial of requests and scan results.
@@ -12,7 +12,7 @@ The Elastic integration for [Admin By Request EPM](https://www.adminbyrequest.co
 - [Events](https://www.adminbyrequest.com/en/docs/events-api) are records that are generated on various actions done by users and administrators. These include group modifications, policy changes, security violations, and other administrative activities.
 - This data stream leverages the Admin By Request EPM API [`/events`](https://www.adminbyrequest.com/en/docs/events-api) endpoint to retrieve data.
 
-## Requirements
+## What do I need to use this integration?
 
 Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md).
 
@@ -135,7 +135,7 @@ An example event for `auditlog` looks as following:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
@@ -310,7 +310,7 @@ An example event for `events` looks as following:
 
 **ECS Field Reference**
 
-Please refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
+Refer to the following [document](https://www.elastic.co/guide/en/ecs/current/ecs-field-reference.html) for detailed information on ECS fields.
 
 The following non-ECS fields are used in events documents:
 
@@ -360,4 +360,4 @@ The following non-ECS fields are used in events documents:
 | input.type | Input type | keyword |
 
 
-Events Data stream has field `eventCode` which is a unique identifier for each event type. Please refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.
+Events Data stream has field `eventCode` which is a unique identifier for each event type. Refer to the Event Codes table given on the [Events API documentation](https://www.adminbyrequest.com/en/docs/events-api) for more information on event codes.
packages/admin_by_request_epm/manifest.yml (2 additions, 6 deletions)
@@ -1,7 +1,7 @@
 format_version: 3.3.0
 name: admin_by_request_epm
 title: Admin By Request EPM
-version: "1.0.0"
+version: "1.1.0"
 source:
   license: "Elastic-2.0"
 description: "Collect logs from Admin By Request EPM with Elastic Agent."
@@ -60,11 +60,7 @@ policy_templates:
             required: false
             show_user: false
             description: >-
-              The request tracer logs requests and responses to the agent's local file-system for debugging configurations.
-              Enabling this request tracing compromises security and should only be used for debugging. Disabling the request
-              tracer will delete any stored traces.
-              See [documentation](https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-cel.html#_resource_tracer_enable)
-              for details.
+              The request tracer logs requests and responses to the agent's local file-system for debugging configurations. Enabling this request tracing compromises security and should only be used for debugging. Disabling the request tracer will delete any stored traces. See [documentation](https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-cel.html#_resource_tracer_enable) for details.
packages/amazon_security_lake/_dev/build/docs/README.md (29 additions, 32 deletions)
@@ -12,58 +12,55 @@ The Amazon Security Lake integration can be used in two different modes to colle
 
 This module follows the OCSF Schema Version **v1.1.0**.
 
-## Data streams
+## What data does this integration collect?
 
 The Amazon Security Lake integration collects logs from both [Third-party services](https://docs.aws.amazon.com/security-lake/latest/userguide/integrations-third-party.html) and [AWS services](https://docs.aws.amazon.com/security-lake/latest/userguide/open-cybersecurity-schema-framework.html) in an event data stream.
 
-### **NOTE**:
+**NOTE**:
 - The Amazon Security Lake integration supports events collected from [AWS services](https://docs.aws.amazon.com/security-lake/latest/userguide/internal-sources.html) and [third-party services](https://docs.aws.amazon.com/security-lake/latest/userguide/custom-sources.html).
 
 - Due to the nature and structure of the OCSF schema, this integration has limitations on how deep the mappings run. Some important objects like 'Actor', 'User' and 'Product' have more fleshed-out mappings compared to others which get flattened after the initial 2-3 levels of nesting to keep them maintainable and stay within field mapping [limits](https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-settings-limit.html). This will evolve as needed.
 
-## Requirements
+## What do I need to use this integration?
 
-- Elastic Agent must be installed.
-- Elastic Agent is required to stream data from Amazon Security Lake and ship the data to Elastic, where the events will then be processed via the integration's ingest pipelines.
+Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md). Elastic Agent is required to stream data from Amazon Security Lake and ship the data to Elastic, where the events will then be processed via the integration's ingest pipelines.
 
 ## Setup
 
-### To collect data from Amazon Security Lake follow the below steps:
-
-1. To enable and start Amazon Security Lake, follow the steps mentioned here: [`https://docs.aws.amazon.com/security-lake/latest/userguide/getting-started.html`](https://docs.aws.amazon.com/security-lake/latest/userguide/getting-started.html).
-2. After creating the data lake, follow the steps below to create data subscribers to consume data.
-   - Open the [Security Lake console](https://console.aws.amazon.com/securitylake/).
-   - By using the AWS Region selector in the upper-right corner of the page, select the Region where you want to create the subscriber.
-   - In the navigation pane, choose **Subscribers**.
-   - On the Subscribers page, choose **Create subscriber**.
-   - For **Subscriber details**, enter **Subscriber name** and an optional Description.
-   - For **Log and event sources**, choose which sources the subscriber is authorized to consume.
-   - For **Data access method**, choose **S3** to set up data access for the subscriber.
-   - For **Subscriber credentials**, provide the subscriber's **AWS account ID** and **external ID**.
-   - For **Notification details**, select **SQS queue**.
-   - Choose Create.
-3. Above mentioned steps will create and provide the required details such as IAM roles/AWS role ID, external ID and queue URL to configure AWS Security Lake Integration.
-
-### Enabling the integration in Elastic:
-
-1. In Kibana go to Management > Integrations.
-2. In "Search for integrations" search bar, type Amazon Security Lake.
-
-3. Click on the "Amazon Security Lake" integration from the search results.
-4. Click on the Add Amazon Security Lake Integration button to add the integration.
-
-5. By default collect logs via S3 Bucket toggle will be off and collect logs for AWS SQS.
+### Collect data from Amazon Security Lake
+
+To enable and start Amazon Security Lake, refer to the [AWS getting started](https://docs.aws.amazon.com/security-lake/latest/userguide/getting-started.html).
+
+To create and provide the required details such as IAM roles/AWS role ID, external ID and queue URL to configure AWS Security Lake Integration, follow these steps:
+
+1. Open the [Security Lake console](https://console.aws.amazon.com/securitylake/).
+2. By using the AWS Region selector in the upper-right corner of the page, select the region where you want to create the subscriber.
+3. In the navigation pane, choose **Subscribers**.
+4. On the Subscribers page, choose **Create subscriber**.
+5. In **Subscriber details**, enter **Subscriber name** and an optional description.
+6. In **Log and event sources**, choose which sources the subscriber is authorized to consume.
+7. In **Data access method**, choose **S3** to set up data access for the subscriber.
+8. For **Subscriber credentials**, provide the subscriber's **AWS account ID** and **external ID**.
+9. For **Notification details**, select **SQS queue**.
+10. Click **Create**.
+
+### Enable the integration in Elastic
+
+1. In Kibana navigate to **Management** > **Integrations**.
+2. In the search bar, type **Amazon Security Lake**.
+3. Select the **Amazon Security Lake** integration and add it.
+4. By default collect logs via S3 Bucket toggle will be off and collect logs for AWS SQS.
    - queue url
 
   - collect logs via S3 Bucket toggled off
   - role ARN
   - external id
 
-
-6. If you want to collect logs via AWS S3, then you have to put the following details:
+5. If you want to collect logs via AWS S3, then you have to put the following details: