🌩️ AWS CloudWatch Logs – Overview with Examples
CloudWatch Logs helps you collect, monitor, search, and analyze logs from AWS services and custom sources. Below are the core components explained with examples:
✅ 1. Log Groups
- Definition: A container for multiple log streams from similar components or services.
- Example: Used to group all logs from ECS containers running the payment service.
✅ 2. Log Streams
- Definition: A sequence of log events from a single source (e.g., Lambda function invocation, EC2 instance).
- Example: Logs from a single ECS task or EC2 instance.
✅ 3. Log Events
- Definition: Individual log entries with a timestamp and message.
- Example: a single log event looks like the record shown below.
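For illustration only (all values are made up), here is the shape of one stored event; `timestamp` and `ingestionTime` are epoch milliseconds:

```json
{
  "timestamp": 1718000000000,
  "message": "2024-06-10 07:33:20 INFO PaymentService - order 4711 charged successfully",
  "ingestionTime": 1718000000450
}
```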
✅ 4. Source
- Definition: The origin of the logs.
- Examples of sources:
- AWS Lambda
- EC2 with CloudWatch Agent
- ECS/Containers
- API Gateway
- Example: AWS Lambda writes logs for every invocation to a log group named `/aws/lambda/<function-name>`.
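For instance, because Lambda creates its log groups under the `/aws/lambda/` prefix, you can list them with:

```bash
# List the log groups that Lambda has created for deployed functions
aws logs describe-log-groups --log-group-name-prefix /aws/lambda/
```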
✅ 5. Logs Insights
- Definition: A powerful query engine for analyzing logs with a purpose-built, pipe-based query language.
- Example Query: Filters and displays the last 5 error logs for faster debugging (see the query below).
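A sketch of such a query in the Logs Insights query language; it assumes your application writes the literal word ERROR in its messages:

```
fields @timestamp, @message
| filter @message like /ERROR/
| sort @timestamp desc
| limit 5
```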
✅ 6. S3 Export
- Definition: Export logs from CloudWatch to S3 for long-term storage or analysis.
- Example: Export logs from `/ecs/payment-service` to an S3 bucket; the exported objects can later be analyzed with Athena or Glue (see the sketch below).
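A rough sketch of the export call, assuming a bucket named `my-log-archive` in the same region whose bucket policy allows CloudWatch Logs to write to it (bucket name, task name, and time window are placeholders):

```bash
aws logs create-export-task \
  --task-name payment-service-export \
  --log-group-name /ecs/payment-service \
  --from $(date -d '1 day ago' +%s000) \
  --to $(date +%s000) \
  --destination my-log-archive \
  --destination-prefix ecs/payment-service
```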
✅ 7. Log Subscriptions
- Definition: Real-time forwarding of logs to other AWS services.
- Targets:
- Lambda
- Kinesis Data Stream
- Kinesis Firehose
- Example: Set up a subscription filter that forwards error logs to a Lambda function, which then sends them to Slack as alerts (see the CLI example in section 6️⃣ below).
✅ 8. Live Tail
- Definition: View logs in real time, similar to `tail -f`.
- Example: During a deployment of `user-service`, use Live Tail in the console to watch new log events arrive as the updated tasks start.
✅ 9. Log Aggregation (Multi-Account & Multi-Region)
- Definition: Centralize logs from multiple AWS accounts and regions.
- Use Case Setup:
  - You have dev, staging, and prod accounts in `us-east-1` and `us-west-2`.
  - Logs are sent via subscription filters to a central Kinesis Data Firehose in a dedicated logging account.
  - Firehose delivers the logs to a central S3 bucket.
  - Use Athena or OpenSearch to analyze logs from all sources (see the sketch after this list).
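A rough sketch of the wiring, under these assumptions: the logging account is 111111111111, the source accounts are 222222222222 / 333333333333 / 444444444444, the delivery stream is `CentralLogFirehose`, and `CWLtoFirehoseRole` lets CloudWatch Logs write to it. All names and account IDs are placeholders, and older setups put a Kinesis Data Stream in front instead of Firehose.

```bash
# In the central logging account: expose the Firehose via a CloudWatch Logs destination
aws logs put-destination \
  --destination-name CentralLogDestination \
  --target-arn arn:aws:firehose:us-east-1:111111111111:deliverystream/CentralLogFirehose \
  --role-arn arn:aws:iam::111111111111:role/CWLtoFirehoseRole

# Allow the dev/staging/prod accounts to subscribe to that destination
aws logs put-destination-policy \
  --destination-name CentralLogDestination \
  --access-policy '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": { "AWS": ["222222222222", "333333333333", "444444444444"] },
      "Action": "logs:PutSubscriptionFilter",
      "Resource": "arn:aws:logs:us-east-1:111111111111:destination:CentralLogDestination"
    }]
  }'

# In each source account: point a subscription filter at the central destination
aws logs put-subscription-filter \
  --log-group-name /ecs/payment-service \
  --filter-name ToCentralLogging \
  --filter-pattern "" \
  --destination-arn arn:aws:logs:us-east-1:111111111111:destination:CentralLogDestination
```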
🚀 Full CloudWatch Logs Workflow with Console + CLI
1️⃣ Create a Log Group
🖥️ Console:
- Go to CloudWatch > Logs > Log groups
- Click "Create log group"
- Name it: `/custom-service-logs`
💻 CLI:
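A minimal call for this step, using the log group name from the console instructions:

```bash
aws logs create-log-group --log-group-name /custom-service-logs

# Optional: set a retention period so logs don't accumulate forever
aws logs put-retention-policy --log-group-name /custom-service-logs --retention-in-days 30
```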
2️⃣ Create a Log Stream
🖥️ Console:
- Click your log group `/custom-service-logs`
- Click "Create log stream" → name it `stream-01`
💻 CLI:
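Likewise for the stream, reusing the names from the console steps:

```bash
aws logs create-log-stream \
  --log-group-name /custom-service-logs \
  --log-stream-name stream-01
```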
3️⃣ Put Custom Log Events
🖥️ Console:
You can't push logs directly from the console; use the CLI, an SDK, or your application.
💻 CLI:
```bash
# timestamp must be epoch milliseconds; %3N requires GNU date
aws logs put-log-events \
  --log-group-name /custom-service-logs \
  --log-stream-name stream-01 \
  --log-events "[{\"timestamp\": $(date +%s%3N), \"message\": \"Service started successfully\"}]" \
  --sequence-token <nextSequenceToken>
```
📌 You need a --sequence-token from the previous response (recent versions of the API no longer require it), or fetch it with:
```bash
aws logs describe-log-streams \
  --log-group-name /custom-service-logs \
  --log-stream-name-prefix stream-01 \
  --query 'logStreams[0].uploadSequenceToken'
```
4️⃣ View in Logs Insights
🖥️ Console:
- Go to CloudWatch > Logs Insights
- Choose `/custom-service-logs`
- Run this query:
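For example, the same query the CLI step below uses, which lists the 10 most recent events:

```
fields @timestamp, @message
| sort @timestamp desc
| limit 10
```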
💻 CLI:
```bash
aws logs start-query \
  --log-group-name /custom-service-logs \
  --start-time $(date -d '5 minutes ago' +%s) \
  --end-time $(date +%s) \
  --query-string "fields @timestamp, @message | sort @timestamp desc | limit 10"
```
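start-query only returns a queryId; once the query completes, fetch the output separately (replace <queryId> with the returned value):

```bash
aws logs get-query-results --query-id <queryId>
```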
5️⃣ View in Live Tail
🖥️ Console:
- Go to CloudWatch > Log groups
- Click `/custom-service-logs` → Live Tail
💻 CLI: There is no exact Live Tail equivalent, but AWS CLI v2 provides `aws logs tail --follow` for the same purpose; third-party tools like cw also work.
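A sketch with the CLI v2 tail command; the group name matches the one created earlier, and --since limits how far back the initial output goes:

```bash
aws logs tail /custom-service-logs --follow --since 10m
```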
🔁 Log Subscriptions
Used to stream logs in near real time to Lambda, Kinesis Data Streams, or Kinesis Data Firehose (which can deliver them to S3).
6️⃣ Integration: CloudWatch Logs → Lambda
🖥️ Console:
- Go to `/custom-service-logs`
- Click "Subscription filters" > "Create"
- Choose destination: Lambda function
- Filter pattern: `?ERROR` (or `""` to match all logs)
💻 CLI:
```bash
aws logs put-subscription-filter \
  --log-group-name "/custom-service-logs" \
  --filter-name "LambdaSub" \
  --filter-pattern "?ERROR" \
  --destination-arn "arn:aws:lambda:us-east-1:123456789012:function:logErrorNotifier"
```
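One step the filter above does not cover: CloudWatch Logs must be allowed to invoke the function. A sketch of that one-time permission grant, reusing the function name, account ID, and region from the example (older setups use the regional principal logs.<region>.amazonaws.com):

```bash
aws lambda add-permission \
  --function-name logErrorNotifier \
  --statement-id cloudwatch-logs-invoke \
  --principal logs.amazonaws.com \
  --action lambda:InvokeFunction \
  --source-arn "arn:aws:logs:us-east-1:123456789012:log-group:/custom-service-logs:*"
```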
7️⃣ Integration: CloudWatch Logs → Kinesis Stream
🖥️ Console:
- Go to your log group → "Subscription filters" → Create
- Choose Kinesis stream
- Provide ARN and IAM role
💻 CLI:
```bash
aws logs put-subscription-filter \
  --log-group-name "/custom-service-logs" \
  --filter-name "KinesisSub" \
  --filter-pattern "" \
  --destination-arn "arn:aws:kinesis:us-east-1:123456789012:stream/LogStream" \
  --role-arn "arn:aws:iam::123456789012:role/CloudWatchToKinesisRole"
```
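The --role-arn above must point to a role that CloudWatch Logs can assume and that may write to the stream. A minimal sketch of creating such a role (role and policy names are placeholders; older setups use the regional principal logs.<region>.amazonaws.com):

```bash
# Trust policy: let CloudWatch Logs assume the role
aws iam create-role \
  --role-name CloudWatchToKinesisRole \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": { "Service": "logs.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }]
  }'

# Permission policy: allow the role to put records into the target stream
aws iam put-role-policy \
  --role-name CloudWatchToKinesisRole \
  --policy-name AllowPutRecords \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": "kinesis:PutRecord",
      "Resource": "arn:aws:kinesis:us-east-1:123456789012:stream/LogStream"
    }]
  }'
```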
8️⃣ Integration: CloudWatch Logs → S3 via Firehose
🖥️ Console:
- Go to Firehose → Create delivery stream
- Destination: S3
- Go to log group `/custom-service-logs`
- Create Subscription Filter → Destination: Firehose
💻 CLI:
```bash
aws logs put-subscription-filter \
  --log-group-name "/custom-service-logs" \
  --filter-name "S3Sub" \
  --filter-pattern "" \
  --destination-arn "arn:aws:firehose:us-east-1:123456789012:deliverystream/LogToS3" \
  --role-arn "arn:aws:iam::123456789012:role/CloudWatchToFirehoseRole"
```
📦 Architecture Summary
```
App or CLI → Log Group → Log Stream
                 ↓
        +--------+--------+
        |  Logs Insights  |
        |   + Live Tail   |
        +--------+--------+
                 ↓
    +---------+----------+------------+
    |         |          |            |
 Lambda    Kinesis    Firehose     (future)
(AlertFn) (Analytics) → S3 (Athena/Glue)
```