External Log Settings


A LogForwarder allows you to specify which appliances it will service, and then provides numerous forwarding options for exporting the audit log records from the Appgate SDP Collective.

First you need to select the input:

Log Collection Source

Choose the Sites that will send logs to this LogForwarder. By default, all Sites will be logged. There is also an option to collect logs from appliances that have never been added to a Site.

Then you need to choose an output:

AWS Kinesis Forwarding

Use AWS Kinesis streaming data platform to handle the logs.

Type

Select AWS's real-time data streaming service or the Firehose data capture service.

Stream Name

Enter the stream name.

Batch Size

The number of records to send in each batch, up to 10,000.

Number of Partition Keys

Add one or more partition keys to determine which shards will handle the data.

Filter

Optional. Filter these log records with a boolean expression written in the JMESPath query language. Refer to LogForward filtering.
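For illustration only (the field names here are hypothetical, not a guaranteed log schema), a boolean JMESPath filter expression might look like:

```
event_type == 'authentication_succeeded' && site_name != 'Default Site'
```

See LogForward filtering for the exact semantics of how matching records are treated.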

AWS Access Method

Choose the AWS API access method. If you use the instance's IAM role (created with the instance profile), a valid access Policy must be created for it to work.

Access Key ID

Enter your AWS Access Key ID for the IAM Role.

Secret Access Key

Enter your AWS Secret Access Key.

Region

Enter the region code for the location of the Kinesis stream.

Azure Monitor Forwarding

Forward logs to Azure Monitor via a Data Collection Endpoint.

Appgate SDP supports sending logs to Azure Monitor via a Data Collection Endpoint (DCE) and Data Collection Rule (DCR). The logs are sent to the endpoint, and the Data Collection Rule defines how the logs are inserted into a target table. The structure of the target table doesn't necessarily need to match the structure of the JSON log records your LogForwarder sends, because the DCR can include a transformation that converts the JSON logs to a format that matches the table. Setting up the DCR transformation requires a sample log that your transformation can be tested on. A sample log is available here.

If you are unsure what log fields to use, here is a very basic DCR transformation to get you started with receiving logs:

source
| extend event_type = log.event_type
| extend TimeGenerated = todatetime(timestamp)
| project-away ['date']

Application ID

Enter the application ID that's assigned to your app. You can find this information in the Portal where you registered your Azure app. Example: 9528e71d-b05b-4608-aa6d-fb726b24121e

Client Secret

Enter the client secret generated for your app in the Azure app registration portal. Example: GkG8Q~qWer3Yd6D04HCur-ZjNDyRyJoyPlIAtaB1

Token Request URL

Enter the URL where the client secret should be sent in order to obtain a bearer token. Example: https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token, where $tenantId should be replaced with the tenant ID for your app registration.
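As a sketch of what the LogForwarder does with these values (the tenant ID, application ID, and secret below are placeholders, not real credentials), an OAuth2 client-credentials token request is a form-encoded POST to the Token Request URL:

```python
from urllib.parse import urlencode

tenant_id = "00000000-0000-0000-0000-000000000000"  # placeholder tenant ID
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# Form body for the client-credentials grant; client_id and client_secret
# correspond to the Application ID and Client Secret fields above.
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": "9528e71d-b05b-4608-aa6d-fb726b24121e",  # Application ID
    "client_secret": "<your client secret>",              # placeholder
    "scope": "https://monitor.azure.com//.default",       # see Scope below
})
print(token_url)
```

The LogForwarder performs this exchange for you; the sketch only shows which field goes where in the request.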

Endpoint URL

The DCE Endpoint URI for Azure Monitor that handles the log data. See https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview.

Format: $DceURI/dataCollectionRules/$DcrImmutableId/streams/Custom-$Table?api-version=2023-01-01.

  • Replace $DceURI with the Data Collection Endpoint URI (might also be called Logs Ingestion in the Azure portal).

  • Replace $DcrImmutableId with the DCR immutable ID.

  • Replace $Table with the table name.

Try newer API versions if you run into issues. See https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview and https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal for more information about the fields.
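As a sketch of the substitution (the DCE URI, DCR immutable ID, and table name below are made-up examples, not values from your tenant):

```python
# Made-up example values; replace with your own DCE URI, DCR immutable ID,
# and target table name from the Azure portal.
dce_uri = "https://my-dce-abcd.eastus-1.ingest.monitor.azure.com"
dcr_immutable_id = "dcr-00000000000000000000000000000000"
table = "AppgateAuditLogs_CL"

# Assemble the Endpoint URL following the format above.
endpoint_url = (
    f"{dce_uri}/dataCollectionRules/{dcr_immutable_id}"
    f"/streams/Custom-{table}?api-version=2023-01-01"
)
print(endpoint_url)
```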

Scope

Specifies which permissions the LogForwarder will request when it uses the Token Request URL to obtain a bearer token. The default will work for most use cases unless you are using 'govcloud'.

Example: https://monitor.azure.com//.default

Coralogix Forwarding

Forward logs to a Coralogix HTTPS URL.

URL

Paste the Coralogix HTTPS URL.

Private Key

Paste the Coralogix Private Key.

UUID

Enter the UUID.

Application Name

Enter the name of the application.

Subsystem

Enter the name of the subsystem.

Datadog Forwarding

Forward logs to a Datadog HTTPS source URL.

Site

Enter the Datadog site to be used.

API Key

Enter the API key for the Datadog site.

Source

This is the field Datadog uses to identify where logs come from and how they will be handled.

Tags

Enter comma separated values to be used for the logs.

Elasticsearch/OpenSearch Forwarding

Configure an Elasticsearch or OpenSearch instance as the log destination.

URL

URL of the Elasticsearch instance being configured.

Version

Select the API version that matches the Elasticsearch version in use.

Log Retention

Set the log retention period.

Retention Period

Set how many days of audit logs will be kept in the Elasticsearch instance database.

Access Method

Choose the AWS API access method. If you use the instance's IAM role (created with the instance profile), a valid access Policy must be created for it to work.

NOTE

The API key service is required when using Elastic Cloud Serverless.

Access Key ID

Enter your AWS Access Key ID for the IAM Role.

Secret Access Key

Enter your AWS Secret Access Key.

Region

Enter the region code for the location of the Elasticsearch cluster.

Authentication

Appgate SDP supports a number of the authentication services provided in some versions of Elasticsearch. See token-authentication and basic-authentication for details.

Type

Select the type of authentication service required.

Secret

Enter the secret to be used when authenticating to ES.

NOTE

For Basic Authentication, the Secret should be the base64-encoded USERNAME:PASSWORD. For the API Key Service, the Secret should be the base64-encoded API key ID:API key.
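A minimal sketch of producing the Secret value (the credentials here are placeholders):

```python
import base64

# Basic Authentication: the Secret is base64(USERNAME:PASSWORD).
username, password = "elastic", "changeme"  # placeholder credentials
secret = base64.b64encode(f"{username}:{password}".encode()).decode("ascii")
print(secret)

# For the API Key Service, encode "API key ID:API key" the same way.
```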

Falcon LogScale Forwarding

Forward logs to a Falcon LogScale instance. See https://library.humio.com/falcon-logscale/log-shippers-hec.html for more information.

Event Collector URL

The URL of the HTTP Event Collector (HEC) receiving the logs.

Token

Paste the Ingest Token - a unique string that identifies you and allows you to send data to the ingest repository.

Index

Optional name of the ingest repository. In public-facing APIs, if present, this must be the same repository as used in the ingest token.

Source Type

Optional field which is translated to #type inside LogScale.

Source

Optional field which is translated to the @source field in LogScale.

Splunk Forwarding

Forward logs in the RAW format to a Splunk HTTP Event Collector (HEC).

Token

Paste the Splunk HEC authentication token.

URL

Enter the URL of the Splunk Event Collector for raw events, using HTTPS if enabled.

Sumo Logic Forwarding

Forward logs to a Sumo Logic HTTPS source URL.

URL

Paste the HTTPS source URL copied from Sumo Logic.

TCP Forwarding

Use TCP to connect to a log destination.

Name

Enter a name for the TCP client being configured.

Hostname or IP Address

Hostname or IP address of the external log server.

Port

Port number of the external log server.

Format

Select the format to be used.

Encryption Method

Choose TLS if the logs should be sent securely.

NOTE

The TLS connection relies on having the appropriate certificate uploaded in System > Trusted Certificates.

Filter

Optional. Filter these log records with a boolean expression written in the JMESPath query language. Refer to LogForward filtering.