Ingesting AWS CloudWatch data into Microsoft Sentinel using the S3 connector
Hello Guys,
I hope you are all doing well. I already posted this as a question, but I wanted to start a discussion since some of you may have had better experience with this.
I want to ship CloudWatch logs to an S3 bucket using a Lambda function and then send those logs on to Microsoft Sentinel.
As per the Microsoft documentation:
Ingest CloudWatch logs to Microsoft Sentinel – create a Lambda function to send CloudWatch events to S3 bucket | Microsoft Learn
Connect Microsoft Sentinel to Amazon Web Services to ingest AWS service log data | Microsoft Learn
there is a way to do this, BUT the first link is from last year, and when I try to ingest logs the way it describes, I always get the error "Unable to import module 'lambda_function': No module named 'pandas'". Also, as I understood it, the Lambda Python script only exports the specific time range you set. I want the logs exported every few minutes, every day, and synchronized into Microsoft Sentinel.
(The Lambda function .py script was run on Python 3.9 as mentioned in the Microsoft documentation, and all of the resources used came from the GitHub solution referenced in the Microsoft documents.)
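For what it's worth, the pandas dependency may be avoidable entirely: if all the Lambda has to do is forward CloudWatch events to S3, a minimal handler needs only the standard library plus boto3 (which the Lambda runtime already provides). Here is a sketch; the bucket name and key scheme are my own placeholders, not from the Microsoft script:

```python
import base64
import gzip
import json
import time

# Hypothetical bucket name -- replace with the bucket your connector watches.
BUCKET = "my-sentinel-bucket"

def decode_cw_event(event):
    """Decode the base64+gzip payload that CloudWatch Logs delivers to a
    subscribed Lambda, returning the raw log messages one per line."""
    payload = base64.b64decode(event["awslogs"]["data"])
    data = json.loads(gzip.decompress(payload))
    return "\n".join(e["message"] for e in data["logEvents"])

def lambda_handler(event, context):
    # boto3 is imported lazily so decode_cw_event stays testable locally.
    import boto3
    body = decode_cw_event(event)
    key = f"cloudwatch/{int(time.time() * 1000)}.gz"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=gzip.compress(body.encode("utf-8")),
    )
    return {"written": key}
```

Subscribing the log group to a handler like this sidesteps the pandas/numpy packaging problem entirely, at the cost of whatever extra formatting the V2 script performs.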
When I ran the provided automation script, it created the S3 bucket, IAM role, and SQS queue in AWS, which is fine, but even then the AWS connector stays grey without any changes.
I even tried changing the IAM role in AWS by adding Lambda permissions and using it for Lambda functions I found on the internet, and created a CloudWatch EventBridge rule for it. Even though I can see some .gz data landing in the S3 bucket, no data is sent to Microsoft Sentinel.
So, is there anyone here who can describe the full process needed to ingest logs from CloudWatch into Sentinel successfully? For those who have been through it: what do I need to take care of, and how can I keep log ingestion cost-effective?
I want to mention that I am performing this in my testing environment.
Since the PowerShell automation script can automatically create the necessary AWS resources, I tried the following in my test environment:
1. Downloaded the AWS CLI, ran `aws configure`, and provided the necessary keys and the default region for my resources.
2. Ran the automation script from PowerShell as the documentation describes and filled out all required fields.
2.1 The automation script created:
2.1.1 An S3 bucket with an access policy that:
- allows the IAM role to read the S3 bucket and call s3:GetObject on it;
- allows CloudWatch to upload objects to the bucket with s3:PutObject and permits the CloudWatch bucket ACL check against the bucket.
2.1.2 An S3 event notification that sends events for objects with the .gz suffix to SQS (I later edited this manually and added all event types, to make sure events are sent).
2.1.3 An SQS queue with an access policy allowing the S3 bucket to SendMessage to the queue.
2.1.4 An IAM user configured with the Sentinel workspace ID and Sentinel role ID.
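For anyone comparing their setup against steps 2.1.2 and 2.1.3 above, here is roughly what the SQS access policy and the S3 notification configuration look like, expressed as Python dicts. The account ID, bucket, and queue names are placeholders of mine, not values from the automation script:

```python
import json

# Hypothetical values -- substitute your own account, bucket, and queue.
ACCOUNT_ID = "123456789012"
BUCKET = "my-sentinel-bucket"
QUEUE_ARN = f"arn:aws:sqs:us-east-1:{ACCOUNT_ID}:my-sentinel-queue"

def sqs_policy(bucket, queue_arn, account_id):
    """Access policy letting the S3 bucket send event notifications
    to the SQS queue (step 2.1.3)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "SQS:SendMessage",
            "Resource": queue_arn,
            "Condition": {
                "ArnLike": {"aws:SourceArn": f"arn:aws:s3:::{bucket}"},
                "StringEquals": {"aws:SourceAccount": account_id},
            },
        }],
    }

def notification_config(queue_arn):
    """S3 event notification forwarding newly created .gz objects to
    SQS (step 2.1.2)."""
    return {
        "QueueConfigurations": [{
            "QueueArn": queue_arn,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [
                {"Name": "suffix", "Value": ".gz"},
            ]}},
        }]
    }

# Applying these needs AWS credentials, roughly:
# import boto3
# boto3.client("sqs").set_queue_attributes(
#     QueueUrl=queue_url,
#     Attributes={"Policy": json.dumps(sqs_policy(BUCKET, QUEUE_ARN, ACCOUNT_ID))})
# boto3.client("s3").put_bucket_notification_configuration(
#     Bucket=BUCKET, NotificationConfiguration=notification_config(QUEUE_ARN))
```

If the connector stays grey, diffing your actual queue policy and bucket notification against these shapes is a quick sanity check.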
Since this was deployed via the automation script, a Lambda function still has to be configured in order to send logs from CloudWatch. The script itself does not create these resources, so I created them manually:
1.1 Added IAM role permission policies: AmazonS3FullAccess, AWSLambdaExecute, CloudWatchFullAccess, and CloudWatchLogsFullAccess (later I also added CloudWatchFullAccessV2 and S3ObjectLambdaExecutionRolePolicy to try them out).
1.2 Added lambda.amazonaws.com to the trust relationship policy so I can use this role for Lambda execution.
2. Created a CloudWatch log group and log stream, with a subscription filter per log group pointing at the Lambda function.
3. Created the Lambda function as per the Microsoft documentation; I tried the newest article:
https://techcommunity.microsoft.com/t5/microsoft-sentinel-blog/enhance-the-ingestion-of-aws-cloudwatch-logs-into-microsoft/ba-p/4100565
(Chose the Lambda Python 3.12 runtime and used the existing role created above.)
(I took CloudWatchLambdaFunction_V2.py, which has an issue with the pandas module. I managed to work around that using this guide:
https://medium.com/@shandilya90apoorva/aws-cloud-pipeline-step-by-step-guide-241aaf059918
but even then I get the error:
Response { "errorMessage": "Unable to import module 'lambda_function': Error importing numpy: you should not try to import numpy from its source directory; please exit the numpy source tree, and relaunch your python interpreter from there.", "errorType": "Runtime.ImportModuleError", "requestId": "", "stackTrace": [] })
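In my experience, that numpy message usually means the deployment zip contains wheels built for the wrong platform (e.g. `pip install` run on Windows or macOS), so numpy cannot load its compiled parts on Lambda and fails as if it were being imported from its source tree. A sketch of forcing pip to fetch Linux wheels that match the Lambda runtime; the flags assume an x86_64, Python 3.12 function:

```python
import sys

def pip_cmd(target_dir, *packages):
    """Build a pip command that downloads manylinux (Linux x86_64)
    wheels matching the Lambda runtime instead of the local platform.
    Zipping target_dir together with lambda_function.py then gives a
    deployment package whose numpy/pandas binaries load on Lambda."""
    return [
        sys.executable, "-m", "pip", "install",
        "--platform", "manylinux2014_x86_64",
        "--implementation", "cp",
        "--python-version", "3.12",
        "--only-binary=:all:",
        "--target", target_dir,
        *packages,
    ]

# e.g.:
# import subprocess
# subprocess.check_call(pip_cmd("package", "pandas"))
```

A Lambda layer with prebuilt pandas/numpy for your runtime achieves the same thing without bundling the libraries into the function zip.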
Anyway, this is what I tried, and I eventually end up at the same error with the Lambda function Microsoft provides.
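For step 2 of the manual setup (wiring a log group to the Lambda), the boto3 calls look roughly like this. The names are placeholders, and note that CloudWatch Logs additionally needs lambda:InvokeFunction permission on the function, or the subscription delivers nothing:

```python
def subscription_filter_params(log_group, lambda_arn):
    """Parameters for logs.put_subscription_filter: an empty filter
    pattern delivers every event in the log group to the Lambda."""
    return {
        "logGroupName": log_group,
        "filterName": "to-sentinel-lambda",
        "filterPattern": "",  # empty pattern matches all log events
        "destinationArn": lambda_arn,
    }

# Applying this needs AWS credentials, roughly:
# import boto3
# logs = boto3.client("logs")
# logs.create_log_group(logGroupName="/test/app")
# # Grant CloudWatch Logs invoke permission first, e.g. via
# # lambda.add_permission with principal "logs.amazonaws.com".
# logs.put_subscription_filter(
#     **subscription_filter_params("/test/app", lambda_arn))
```

With the subscription in place, the Lambda writes .gz objects to S3, S3 notifies SQS, and the Sentinel connector polls the queue; each hop can be checked independently when the connector stays grey.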