Generate security insights from Amazon Security Lake data using Amazon OpenSearch Ingestion


Amazon Security Lake centralizes access and management of your security data by aggregating security event logs from AWS environments, other cloud providers, on-premises infrastructure, and software as a service (SaaS) solutions. By converting logs and events to the Open Cybersecurity Schema Framework (OCSF), an open standard for storing security events in a common and shareable format, Security Lake optimizes and normalizes your security data for analysis using your preferred analytics tool.

Amazon OpenSearch Service remains a tool of choice for many enterprises for searching and analyzing large volumes of security data. In this post, we show you how to ingest and query Amazon Security Lake data with Amazon OpenSearch Ingestion, a serverless, fully managed data collector with configurable ingestion pipelines. By using OpenSearch Ingestion to ingest data into your OpenSearch Service cluster, you can derive insights more quickly for time-sensitive security investigations and respond swiftly to security incidents, helping you protect your business-critical data and systems.

Solution overview

The following architecture outlines the flow of data from Security Lake to OpenSearch Service.

The workflow contains the following steps:

  1. Security Lake persists OCSF schema normalized data in an Amazon Simple Storage Service (Amazon S3) bucket determined by the administrator.
  2. Security Lake notifies subscribers through the chosen subscription method, in this case Amazon Simple Queue Service (Amazon SQS).
  3. OpenSearch Ingestion registers as a subscriber to get the necessary context information.
  4. OpenSearch Ingestion reads Parquet-formatted security data from the Security Lake managed Amazon S3 bucket and transforms the security logs into JSON documents.
  5. OpenSearch Ingestion ingests this OCSF-compliant data into OpenSearch Service.
  6. Download and import the provided dashboards to analyze and gain quick insights into the security data.

OpenSearch Ingestion provides a serverless ingestion framework to easily ingest Security Lake data into OpenSearch Service with just a few clicks.


Prerequisites

Complete the following prerequisite steps:

  1. Create an OpenSearch Service domain. For instructions, refer to Creating and managing Amazon OpenSearch Service domains.
  2. Confirm that you have access to the AWS account in which you want to set up this solution.

Set up Amazon Security Lake

In this section, we present the steps to set up Amazon Security Lake, which includes enabling the service and creating a subscriber.

Enable Amazon Security Lake

Identify the account in which you want to activate Amazon Security Lake. Note that for accounts that are part of organizations, you must designate a delegated Security Lake administrator from your management account. For instructions, refer to Managing multiple accounts with AWS Organizations.

  1. Sign in to the AWS Management Console using the credentials of the delegated account.
  2. On the Amazon Security Lake console, choose your preferred Region, then choose Get started.

Amazon Security Lake collects log and event data from a variety of sources across your AWS accounts and Regions.

Now you're ready to enable Amazon Security Lake.

  1. You can either select All log and event sources or choose specific logs by selecting Specific log and event sources.
  2. Data is ingested from all Regions. The recommendation is to select All supported Regions so that activities are logged even for accounts you might not use frequently. However, you also have the option to select Specific Regions.
  3. For Select accounts, you can select the accounts in which you want Amazon Security Lake enabled. For this post, we select All accounts.

  4. You're prompted to either create a new AWS Identity and Access Management (IAM) role or use an existing IAM role. This gives Amazon Security Lake the required permissions to collect the logs and events. Choose the option appropriate for your scenario.
  5. Choose Next.
  6. Optionally, specify the Amazon S3 storage class for the data in Amazon Security Lake. For more information, refer to Lifecycle management in Security Lake.
  7. Choose Next.
  8. Review the details and create the data lake.
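If you prefer to script this setup instead of clicking through the console, the same steps can be sketched with the AWS CLI. This is a sketch only, not the console's exact behavior: the account ID and role ARN are placeholders, and you should verify the parameter shapes against the current `aws securitylake` CLI reference for your CLI version.

```shell
# Sketch: enable Security Lake in one Region.
# Replace the account ID and role ARN placeholders with values from your environment.
aws securitylake create-data-lake \
  --meta-store-manager-role-arn "arn:aws:iam::111122223333:role/AmazonSecurityLakeMetaStoreManager" \
  --configurations '[{"region": "us-east-1"}]'

# Sketch: turn on a log and event source (here, VPC Flow Logs) in that Region.
aws securitylake create-aws-log-source \
  --sources '[{"regions": ["us-east-1"], "sourceName": "VPC_FLOW"}]'
```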

Create an Amazon Security Lake subscriber

To access and consume data in your Security Lake managed Amazon S3 buckets, you must set up a subscriber.

Complete the following steps to create your subscriber:

  1. On the Amazon Security Lake console, choose Summary in the navigation pane.

Here, you can see the number of Regions selected.

  2. Choose Create subscriber.

A subscriber consumes logs and events from Amazon Security Lake. In this case, the subscriber is OpenSearch Ingestion, which consumes security data and ingests it into OpenSearch Service.

  3. For Subscriber name, enter OpenSearchIngestion.
  4. Enter a description.
  5. Region is automatically populated based on the currently selected Region.
  6. For Log and event sources, select whether the subscriber is authorized to consume all log and event sources or specific log and event sources.
  7. For Data access method, select S3.
  8. For Subscriber credentials, enter the subscriber's <AWS account ID> and OpenSearchIngestion-<AWS account ID>.
  9. For Notification details, select SQS queue.

This prompts Amazon Security Lake to create an SQS queue that the subscriber can poll for object notifications.

  10. Choose Create.
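The subscriber can also be created programmatically. The following is a hedged AWS CLI sketch, not a definitive command: the account ID is a placeholder, the source list is abbreviated to a single source for brevity, and the exact parameter shapes should be checked against the current `aws securitylake create-subscriber` reference. The external ID and principal mirror the console's Subscriber credentials fields.

```shell
# Sketch: create the Security Lake subscriber with the AWS CLI.
# Replace 111122223333 with your subscriber's AWS account ID.
aws securitylake create-subscriber \
  --subscriber-name "OpenSearchIngestion" \
  --access-types "S3" \
  --sources '[{"awsLogSource": {"sourceName": "VPC_FLOW", "sourceVersion": "2.0"}}]' \
  --subscriber-identity '{"externalId": "OpenSearchIngestion-111122223333", "principal": "111122223333"}'
```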

Install templates and dashboards for Amazon Security Lake data

Your subscriber for OpenSearch Ingestion is now ready. Before you configure OpenSearch Ingestion to process the security data, let's configure an OpenSearch sink (the destination to write data to) with index templates and dashboards.

Index templates are predefined mappings for security data that select the correct OpenSearch field types for the corresponding Open Cybersecurity Schema Framework (OCSF) schema definitions. In addition, index templates contain index-specific settings for particular index patterns. OCSF classifies security data into different categories such as system activity, findings, identity and access management, network activity, application activity, and discovery.

Amazon Security Lake publishes events from four different AWS sources: AWS CloudTrail with subsets for AWS Lambda and Amazon Simple Storage Service (Amazon S3), Amazon Virtual Private Cloud (Amazon VPC) Flow Logs, Amazon Route 53, and AWS Security Hub. The following table details the event sources and their corresponding OCSF categories and OpenSearch index templates.

Amazon Security Lake Source | OCSF Class ID | OpenSearch Index Pattern
CloudTrail (Lambda and Amazon S3 API subsets) | 3005 | ocsf-3005*
VPC Flow Logs | 4001 | ocsf-4001*
Route 53 | 4003 | ocsf-4003*
Security Hub | 2001 | ocsf-2001*

To easily identify OpenSearch indexes containing Security Lake data, we recommend following a structured index naming pattern that includes the log category and its OCSF-defined class in the name of the index. An example is provided below.
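For instance, the pipeline configuration used later in this post builds index names from the OCSF class ID, the product name, the class name, and the date:

```
ocsf-cuid-${/class_uid}-${/metadata/product/name}-${/class_name}-%{yyyy.MM.dd}
```

With this pattern, a VPC Flow Logs event (class 4001, Network Activity) would land in a daily index whose name begins with ocsf-cuid-4001-, so both the class ID and the category are visible at a glance. (The expanded name shown here is illustrative; the exact value depends on the event's metadata.)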


Complete the following steps to install the index templates and dashboards for your data:

  1. Download the component template and index template files and unzip them on your local machine.

Component templates are composable modules with settings, mappings, and aliases that can be shared and used by index templates.

  2. Upload the component templates before the index templates. For example, the following Linux command line shows how to use the OpenSearch _component_template API to upload them to your OpenSearch Service domain (replace the domain endpoint and the credentials with appropriate values for your environment):
    ls component_templates | awk -F'_body' '{print $1}' | xargs -I{} curl -u adminuser:password -X PUT -H 'Content-Type: application/json' -d @component_templates/{}_body.json https://{domain-endpoint}/_component_template/{}

  3. Once the component templates are successfully uploaded, proceed to upload the index templates:
    ls index_templates | awk -F'_body' '{print $1}' | xargs -I{} curl -u adminuser:password -X PUT -H 'Content-Type: application/json' -d @index_templates/{}_body.json https://{domain-endpoint}/_index_template/{}
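Both commands assume each template file is named {template-name}_body.json: the awk step splits the file name on _body to recover the template name, which becomes the final path segment of the API call. You can check that transformation locally with a sample file name:

```shell
# Recover the template name from a file named {template-name}_body.json
# by splitting on "_body": ocsf-3005_body.json -> ocsf-3005
name=$(echo "ocsf-3005_body.json" | awk -F'_body' '{print $1}')
echo "$name"   # prints: ocsf-3005
```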

  4. Verify that the index templates and component templates were uploaded successfully by navigating to OpenSearch Dashboards, choosing the hamburger menu, then choosing Index Management.

  5. In the navigation pane, choose Templates to see all the OCSF index templates.

  6. Choose Component templates to verify the OCSF component templates.

  7. After successfully uploading the templates, download the pre-built dashboards and other components required to visualize the Security Lake data in OpenSearch indexes.
  8. To upload these to OpenSearch Dashboards, choose the hamburger menu, and under Management, choose Stack Management.
  9. In the navigation pane, choose Saved Objects.

  10. Choose Import.

  11. Choose Import, navigate to the downloaded file, then choose Import.

  12. Confirm the dashboard objects are imported correctly, then choose Done.

All the necessary index and component templates, index patterns, visualizations, and dashboards are now successfully installed.

Configure OpenSearch Ingestion

Each OpenSearch Ingestion pipeline has a single data source with one or more sub-pipelines, processors, and sinks. In our solution, the Security Lake managed Amazon S3 bucket is the source and your OpenSearch Service cluster is the sink. Before setting up OpenSearch Ingestion, you need to create the following IAM roles and set up the required permissions:

  • Pipeline role – Defines permissions to read from Amazon Security Lake and write to the OpenSearch Service domain
  • Management role – Defines permissions that allow the user to create, update, delete, and validate the pipeline and perform other management operations

The following figure shows the permissions and roles you need and how they interact with the solution services.

Before you create an OpenSearch Ingestion pipeline, the principal or user creating the pipeline must have permissions to perform management actions on a pipeline (create, update, list, and validate). Additionally, the principal must have permission to pass the pipeline role to OpenSearch Ingestion. If you are performing these operations as a non-administrator, add the following permissions to the user creating the pipelines:

	{
		"Version": "2012-10-17",
		"Statement": [
			{
				"Effect": "Allow",
				"Resource": "*",
				"Action": [
					"osis:CreatePipeline",
					"osis:GetPipeline",
					"osis:ListPipelines",
					"osis:UpdatePipeline",
					"osis:ValidatePipeline"
				]
			},
			{
				"_comment": "Replace {your-account-id} with your AWS account ID",
				"Resource": [
					"arn:aws:iam::{your-account-id}:role/pipeline-role"
				],
				"Effect": "Allow",
				"Action": [
					"iam:PassRole"
				]
			}
		]
	}

Configure a read policy for the pipeline role

Security Lake subscribers only have access to the source data in the Region you selected when you created the subscriber. To give a subscriber access to data from multiple Regions, refer to Managing multiple Regions. To create a policy for read permissions, you need the name of the Amazon S3 bucket and the Amazon SQS queue created by Security Lake.

Complete the following steps to configure a read policy for the pipeline role:

  1. On the Security Lake console, choose Regions in the navigation pane.
  2. Choose the S3 location corresponding to the Region of the subscriber you created.

  3. Make a note of this Amazon S3 bucket name.

  4. Choose Subscribers in the navigation pane.
  5. Choose the subscriber OpenSearchIngestion that you created earlier.

  6. Make a note of the Amazon SQS queue ARN under Subscription endpoint.

  7. On the IAM console, choose Policies in the navigation pane.
  8. Choose Create policy.
  9. In the Specify permissions section, choose JSON to open the policy editor.
  10. Remove the default policy and enter the following code (replace the S3 bucket and SQS queue ARN with the corresponding values):
    	{
    		"Version": "2012-10-17",
    		"Statement": [
    			{
    				"Sid": "ReadFromS3",
    				"Effect": "Allow",
    				"Action": "s3:GetObject",
    				"Resource": "arn:aws:s3:::{bucket-name}/*"
    			},
    			{
    				"Sid": "ReceiveAndDeleteSqsMessages",
    				"Effect": "Allow",
    				"Action": [
    					"sqs:ReceiveMessage",
    					"sqs:DeleteMessage"
    				],
    				"_comment": "Replace {your-account-id} with your AWS account ID",
    				"Resource": "arn:aws:sqs:{region}:{your-account-id}:{sqs-queue-name}"
    			}
    		]
    	}

  11. Choose Next.
  12. For Policy name, enter read-from-securitylake.
  13. Choose Create policy.

You have successfully created the policy to read data from Security Lake and to receive and delete messages from the Amazon SQS queue.

The complete process is shown below.

Configure a write policy for the pipeline role

We recommend using fine-grained access control (FGAC) with OpenSearch Service. When you use FGAC, you don't have to use a domain access policy; you can skip the rest of this section and proceed to creating your pipeline role with the necessary permissions. If you use a domain access policy, you need to create a second policy (for this post, we call it write-to-opensearch) as an added step to the steps in the previous section. Use the following policy code:

	{
		"Version": "2012-10-17",
		"Statement": [
			{
				"Effect": "Allow",
				"Action": "es:DescribeDomain",
				"Resource": "arn:aws:es:*:{your-account-id}:domain/*"
			},
			{
				"Effect": "Allow",
				"Action": "es:ESHttp*",
				"Resource": "arn:aws:es:*:{your-account-id}:domain/{domain-name}/*"
			}
		]
	}

If the configured role has permissions to access Amazon S3 and Amazon SQS across accounts, OpenSearch Ingestion can ingest data across accounts.

Create the pipeline role with necessary permissions

Now that you have created the policies, you can create the pipeline role. Complete the following steps:

  1. On the IAM console, choose Roles in the navigation pane.
  2. Choose Create role.
  3. For Use cases for other AWS services, select OpenSearch Ingestion pipelines.
  4. Choose Next.
  5. Search for and select the policy read-from-securitylake.
  6. Search for and select the policy write-to-opensearch (if you're using a domain access policy).
  7. Choose Next.
  8. For Role name, enter pipeline-role.
  9. Choose Create.

Take note of the role name; you will use it when configuring the OpenSearch Ingestion pipeline.
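If you create pipeline-role with the AWS CLI or infrastructure as code instead of through the console's OpenSearch Ingestion pipelines use case, the role also needs a trust policy that lets the OpenSearch Ingestion service assume it. A minimal sketch (osis-pipelines.amazonaws.com is the OpenSearch Ingestion service principal):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "osis-pipelines.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```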

Now you can map the pipeline role to an OpenSearch backend role if you're using FGAC. You can map the ingestion role to one of the predefined roles or create your own with the necessary permissions. For example, all_access is a built-in role that grants administrative permission for all OpenSearch functions. When deploying to a production environment, make sure to use a role with just enough permissions to write to your Amazon OpenSearch Service domain.
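You can perform this mapping under Security in OpenSearch Dashboards, or through the Security plugin's REST API. The following Dev Tools sketch maps the role ARN (placeholder account ID) to all_access, shown only for illustration per the production caution above; note that PUT replaces any existing mapping for the role, so include any backend roles you want to keep:

```
PUT _plugins/_security/api/rolesmapping/all_access
{
  "backend_roles": [
    "arn:aws:iam::111122223333:role/pipeline-role"
  ]
}
```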

Create the OpenSearch Ingestion pipeline

In this section, you use the pipeline role you created to create an OpenSearch Ingestion pipeline. Complete the following steps:

  1. On the OpenSearch Service console, choose OpenSearch Ingestion in the navigation pane.
  2. Choose Create pipeline.
  3. For Pipeline name, enter a name, such as security-lake-osi.
  4. In the Pipeline configuration section, choose Configuration blueprints and select AWS-SecurityLakeS3ParquetOCSFPipeline.

  5. Under source, update the following information:
    1. Update the queue_url in the sqs section. (This is the SQS queue that Amazon Security Lake created when you created the subscriber. To get the URL, navigate to the Amazon SQS console and look for the queue ARN created with the format AmazonSecurityLake-abcde-Main-Queue.)
    2. Enter the Region to use for AWS credentials.

  6. Under sink, update the following information:
    1. Replace the hosts value in the opensearch section with the Amazon OpenSearch Service domain endpoint.
    2. For sts_role_arn, enter the ARN of pipeline-role.
    3. Set region as us-east-1.
    4. For index, enter the index name that was defined in the templates created in the previous section ("ocsf-cuid-${/class_uid}-${/metadata/product/name}-${/class_name}-%{yyyy.MM.dd}").
  7. Choose Validate pipeline to verify the pipeline configuration.

If the configuration is valid, a success message appears; you can now proceed to the next steps.
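For reference, a completed configuration based on the blueprint looks roughly like the following. This is a sketch only: the queue URL, account ID, and domain endpoint are placeholders, and the exact layout of the blueprint may differ between versions, so treat your console's blueprint as authoritative.

```yaml
version: "2"
s3-log-pipeline:
  source:
    s3:
      codec:
        parquet:
      compression: "none"
      sqs:
        # Placeholder: the queue Security Lake created for your subscriber
        queue_url: "https://sqs.us-east-1.amazonaws.com/111122223333/AmazonSecurityLake-abcde-Main-Queue"
      aws:
        region: "us-east-1"
        sts_role_arn: "arn:aws:iam::111122223333:role/pipeline-role"
  sink:
    - opensearch:
        # Placeholder: your OpenSearch Service domain endpoint
        hosts: ["https://search-mydomain.us-east-1.es.amazonaws.com"]
        index: "ocsf-cuid-${/class_uid}-${/metadata/product/name}-${/class_name}-%{yyyy.MM.dd}"
        aws:
          region: "us-east-1"
          sts_role_arn: "arn:aws:iam::111122223333:role/pipeline-role"
```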

  8. Under Network, select Public for this post. Our recommendation is to select VPC access for an inherent layer of security.
  9. Choose Next.
  10. Review the details and create the pipeline.

When the pipeline is active, you should see the security data ingested into your Amazon OpenSearch Service domain.

Visualize the security data

After OpenSearch Ingestion starts writing your data into your OpenSearch Service domain, you should be able to visualize the data using the pre-built dashboards you imported earlier. Navigate to Dashboards and choose any one of the installed dashboards.

For example, choosing DNS Activity gives you dashboards of all the DNS activity published in Amazon Security Lake.

This dashboard shows the top DNS queries by account and hostname. It also shows the number of queries per account. OpenSearch Dashboards are flexible; you can add, delete, or update any of these visualizations to suit your organization and business needs.
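Beyond the dashboards, you can query the indexes directly. The following Dev Tools sketch aggregates DNS events by queried hostname; the query.hostname field name assumes the OCSF DNS Activity mapping installed by the templates (and that it is mapped as a keyword), so verify it against your index mapping before relying on it:

```
GET ocsf-4003*/_search
{
  "size": 0,
  "aggs": {
    "top_hostnames": {
      "terms": { "field": "query.hostname", "size": 10 }
    }
  }
}
```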

Clean up

To avoid unwanted charges, delete the OpenSearch Service domain and OpenSearch Ingestion pipeline, and disable Amazon Security Lake.


Conclusion

In this post, you configured Amazon Security Lake to send security data from different sources to OpenSearch Service through serverless OpenSearch Ingestion, and installed pre-built templates and dashboards to quickly get insights from the security data. Refer to Amazon OpenSearch Ingestion to find additional sources from which you can ingest data. For more use cases, refer to Use cases for Amazon OpenSearch Ingestion.

About the authors

Muthu Pitchaimani is a Search Specialist with Amazon OpenSearch Service. He builds large-scale search applications and solutions. Muthu is interested in the topics of networking and security, and is based out of Austin, Texas.

Aish Gunasekar is a Specialist Solutions Architect with a focus on Amazon OpenSearch Service. Her passion at AWS is to help customers design highly scalable architectures and help them in their cloud adoption journey. Outside of work, she enjoys hiking and baking.

Jimish Shah is a Senior Product Manager at AWS with 15+ years of experience bringing products to market in log analytics, cybersecurity, and IP video streaming. He's passionate about launching products that offer delightful customer experiences and solve complex customer problems. In his free time, he enjoys exploring cafes, hiking, and taking long walks.