When to use
Your organization might want to reduce costs by shutting down
Another use case is sending specific CloudWatch log entries in real time to another AWS account outside your organization.
Solution
Whatever the case, if real-time streaming of the log events is needed, the solution is Kinesis. If the destination is one of the supported Kinesis Data Firehose destinations, Firehose should be used, as no additional integration work is needed. If the logs need some processing before being stored, Kinesis Data Streams should be used, as it integrates with Lambda (and other services) for stream processing.
In this post we are going to configure our infrastructure with Terraform to support real-time streaming of CloudWatch logs to another AWS account via Kinesis Data Streams.

The application, which can be deployed on ECS, pushes its logs to CloudWatch, which in turn pushes them to Kinesis Data Streams via a CloudWatch subscription filter. The other AWS account can then subscribe to the stream, provided it has the necessary permissions. Any AWS service that can push logs to CloudWatch is a candidate for log streaming: EKS, EC2, ELB, RDS and others can be used for the same purpose.
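As a sketch of the application side (not part of the Terraform module below), an ECS task definition can route its container logs to the CloudWatch log group via the awslogs driver. The image, sizes and the `ecs_execution_role_arn` variable here are assumptions for illustration:

```hcl
#example ECS task definition shipping container logs to CloudWatch
#(a sketch; image, cpu/memory and the execution role variable are assumptions)
resource "aws_ecs_task_definition" "app" {
  family                   = "app"
  requires_compatibilities = ["FARGATE"]
  network_mode             = "awsvpc"
  cpu                      = 256
  memory                   = 512
  execution_role_arn       = var.ecs_execution_role_arn #role that may write to CloudWatch Logs

  container_definitions = jsonencode([
    {
      name      = "app"
      image     = "my-app:latest" #placeholder image
      essential = true
      logConfiguration = {
        logDriver = "awslogs"
        options = {
          "awslogs-group"         = var.cloudwatch_log_group_name
          "awslogs-region"        = var.region
          "awslogs-stream-prefix" = "app"
        }
      }
    }
  ])
}
```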
Prerequisites
In order to implement this solution, the AWS user executing the Terraform script needs the proper permissions: write access to IAM, CloudWatch and Kinesis. For simplicity, I always give the Terraform user an admin role, as it usually needs to create policies and roles.
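If you prefer not to grant admin rights, a narrower policy along these lines would cover this module (a sketch; still broader than strictly required, so tighten the actions further if you need least privilege):

```hcl
#example permissions for the terraform user (a sketch, assumption on scope)
data "aws_iam_policy_document" "terraform_user" {
  statement {
    effect    = "Allow"
    actions   = ["iam:*", "logs:*", "kinesis:*"]
    resources = ["*"]
  }
}
```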
Implementation
Note: The Terraform module covers only the part of the diagram above starting with CloudWatch. The entire code for this module can be found in my GitHub terraform repo. I plan to add more Terraform examples there in the future.
The first step is to create the CloudWatch log group and log stream.
#application log group
resource "aws_cloudwatch_log_group" "app_log_group" {
  name = var.cloudwatch_log_group_name
}

#application log stream
resource "aws_cloudwatch_log_stream" "app_log_stream" {
  name           = var.cloudwatch_log_stream_name
  log_group_name = aws_cloudwatch_log_group.app_log_group.name
}
Next, we need to give CloudWatch permission to put records into Kinesis. We create the policy, the role and the role-policy attachment.
#policy that allows putting records into kinesis. will be used by cloudwatch
resource "aws_iam_policy" "cloudwatch_to_kinesis_policy" {
  name        = "cloudwatch_to_${var.kinesis_stream_name}_policy"
  description = "Policy used for allowing cloudwatch to put records into kinesis stream"
  policy = jsonencode({
    Version : "2012-10-17",
    Statement : [
      {
        Action : [
          "kinesis:PutRecord"
        ],
        Effect : "Allow",
        Resource : [
          aws_kinesis_stream.kinesis_stream.arn
        ]
      }
    ]
  })
}
#allow cloudwatch to assume this role by attaching assume role policy
resource "aws_iam_role" "cloudwatch_to_kinesis_role" {
  name = "cloudwatch_to_${var.kinesis_stream_name}_role"
  assume_role_policy = jsonencode({
    Version : "2012-10-17",
    Statement : {
      Action : "sts:AssumeRole",
      Effect : "Allow",
      Principal : {
        Service : "logs.${var.region}.amazonaws.com"
      },
      Condition : {
        StringLike : {
          "aws:SourceArn" : "arn:aws:logs:${var.region}:${var.caller_identity.account_id}:*"
        }
      }
    }
  })
}
#attach the cloudwatch policy to the cloudwatch role
resource "aws_iam_role_policy_attachment" "cloudwatch_to_kinesis_policy_attachment" {
  role       = aws_iam_role.cloudwatch_to_kinesis_role.name
  policy_arn = aws_iam_policy.cloudwatch_to_kinesis_policy.arn
}
Next, we need to set up the Kinesis stream.
#kinesis stream for cloudwatch logs
resource "aws_kinesis_stream" "kinesis_stream" {
  name             = var.kinesis_stream_name
  shard_count      = var.kinesis_stream_shard_count
  retention_period = var.kinesis_stream_retention_period
  shard_level_metrics = [
    "OutgoingBytes"
  ]
  stream_mode_details {
    stream_mode = "PROVISIONED"
  }
  encryption_type = "NONE"
}
In order to push the logs to Kinesis, we need to create a CloudWatch subscription filter, which forwards only the log entries that match the filter pattern.
#cloudwatch subscription filter for app logs
resource "aws_cloudwatch_log_subscription_filter" "app_log_subscription_filter" {
  name            = var.cloudwatch_subscription_filter_name
  role_arn        = aws_iam_role.cloudwatch_to_kinesis_role.arn
  log_group_name  = aws_cloudwatch_log_group.app_log_group.name
  filter_pattern  = var.cloudwatch_subscription_filter_pattern
  destination_arn = aws_kinesis_stream.kinesis_stream.arn
  distribution    = "ByLogStream"
}
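The filter pattern decides which events reach the stream: an empty pattern forwards everything, while a term such as ERROR forwards only matching entries. For example, in a terraform.tfvars file (example values, not part of the module):

```hcl
#example variable values (assumptions; adjust to your own log format)
cloudwatch_subscription_filter_name    = "app_error_filter"
cloudwatch_subscription_filter_pattern = "ERROR" #forward only events containing "ERROR"; use "" to forward all
```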
The last step is to create the policy, role and policy attachment that the external account will use to subscribe to our Kinesis Data Stream. We will need the other AWS account's root ARN.
#policy for allowing other accounts to subscribe to kinesis stream
resource "aws_iam_policy" "kinesis_to_external_policy" {
  name        = "${var.kinesis_stream_name}_to_external_policy"
  description = "Policy used for allowing other accounts to access the kinesis stream"
  policy = jsonencode({
    Version : "2012-10-17",
    Statement : [
      {
        Action : [
          "kinesis:DescribeStream",
          "kinesis:GetShardIterator",
          "kinesis:GetRecords",
          "kinesis:ListShards",
          "kinesis:DescribeStreamSummary",
          "kinesis:RegisterStreamConsumer"
        ],
        Effect : "Allow",
        Resource : [
          aws_kinesis_stream.kinesis_stream.arn
        ]
      },
      {
        Action : [
          "kinesis:SubscribeToShard",
          "kinesis:DescribeStreamConsumer"
        ],
        Effect : "Allow",
        Resource : [
          "${aws_kinesis_stream.kinesis_stream.arn}/*"
        ]
      }
    ]
  })
}
#allow external account to assume this role by attaching assume role policy
resource "aws_iam_role" "kinesis_to_external_role" {
  name = "${var.kinesis_stream_name}_to_external_role"
  assume_role_policy = jsonencode({
    Version : "2012-10-17",
    Statement : {
      Action : "sts:AssumeRole",
      Effect : "Allow",
      Principal : {
        AWS : var.external_account_root_arn #(arn:aws:iam::account_number:root)
      }
    }
  })
}
#attach the kinesis external policy to the external role
resource "aws_iam_role_policy_attachment" "kinesis_to_external_policy_attachment" {
  role       = aws_iam_role.kinesis_to_external_role.name
  policy_arn = aws_iam_policy.kinesis_to_external_policy.arn
}
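On the consumer side, the external account can assume the role above and read from the stream. A minimal Terraform sketch for the external account might look like this (the account ID, region, role name and stream name are placeholders derived from this module's naming convention):

```hcl
#provider in the external account that assumes the cross-account role
#(a sketch; account id, region and names are assumptions)
provider "aws" {
  alias  = "log_source"
  region = "eu-west-1" #region of the source account's stream

  assume_role {
    role_arn = "arn:aws:iam::111111111111:role/app_logs_to_external_role" #placeholder source account id
  }
}

#look up the shared stream so a consumer application can use its ARN
data "aws_kinesis_stream" "shared_logs" {
  provider = aws.log_source
  name     = "app_logs" #value of kinesis_stream_name in the source account
}
```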
That is it! You are now streaming CloudWatch logs to another AWS account.