
Log groups should have encryption at rest enabled

CloudWatch Log Groups frequently contain application secrets, database connection strings, IP addresses, and authentication tokens emitted by running workloads. While AWS applies service-managed encryption by default, a customer-managed KMS key gives you explicit control over the key lifecycle, rotation schedule, and access policy. Without a CMK, you cannot revoke access to historical log data independently of IAM, nor can you enforce separation of duties between the team that writes logs and the team that reads them.

Using a CMK also improves auditability through CloudTrail records of KMS key usage and management activity, though CloudTrail doesn't capture every internal data-key operation.

Retrofit consideration

Associating a KMS key with an existing log group requires the aws_cloudwatch_log_group resource to be updated in place via the kms_key_id argument. The KMS key policy must grant logs.<region>.amazonaws.com the kms:Encrypt*, kms:Decrypt*, kms:ReEncrypt*, kms:GenerateDataKey*, and kms:Describe* actions before the association succeeds. Only newly ingested log events are encrypted with the newly associated key; previously ingested data stays encrypted under the original key material. A misconfigured key policy can block log ingestion.
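A key policy along these lines (a sketch; the region, key description, and log group pattern are placeholders) grants the CloudWatch Logs service principal the required permissions before the association is attempted:

```hcl
data "aws_caller_identity" "current" {}

resource "aws_kms_key" "logs" {
  description         = "CMK for CloudWatch Log Group encryption"
  enable_key_rotation = true

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        # Retain administrative access for the account
        Sid       = "AllowAccountAdmin"
        Effect    = "Allow"
        Principal = { AWS = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:root" }
        Action    = "kms:*"
        Resource  = "*"
      },
      {
        # Let CloudWatch Logs use the key, scoped to this account's log groups
        Sid       = "AllowCloudWatchLogs"
        Effect    = "Allow"
        Principal = { Service = "logs.us-east-1.amazonaws.com" }
        Action = [
          "kms:Encrypt*",
          "kms:Decrypt*",
          "kms:ReEncrypt*",
          "kms:GenerateDataKey*",
          "kms:Describe*",
        ]
        Resource = "*"
        Condition = {
          ArnLike = {
            "kms:EncryptionContext:aws:logs:arn" = "arn:aws:logs:us-east-1:${data.aws_caller_identity.current.account_id}:log-group:*"
          }
        }
      },
    ]
  })
}
```

The encryption-context condition limits the grant to CloudWatch Logs operations on this account's log groups, so the key cannot be used through that service principal for anything else.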

Implementation

Choose the approach that matches how you manage Terraform.

Use the compliance.tf module to enforce this control by default. See get started with compliance.tf.

module "lambda" {
  source  = "soc2.compliance.tf/terraform-aws-modules/lambda/aws"
  version = ">=8.0.0"

  create_package         = false
  function_name          = "abc123"
  handler                = "index.lambda_handler"
  local_existing_package = "lambda_function.zip"
  runtime                = "python3.12"
}

The same block works for every supported framework; only the source prefix changes. Replace soc2 with pcidss, hipaa, gdpr, nist80053, nistcsf, fedrampmoderate, nist800171, cisacyberessentials, nydfs23, acscessentialeight, eugmpannex11, cfrpart11, rbicybersecurity, fedramplow, hipaasecurity2003, nistcsfv11, nist80053rev4, or pcidssv321 to target that framework's registry.

If you use terraform-aws-modules/lambda/aws, set the right module inputs for this control. You can later migrate to the compliance.tf module with minimal changes because it is compatible by design.

module "lambda" {
  source  = "terraform-aws-modules/lambda/aws"
  version = ">=8.0.0"

  create_package         = false
  function_name          = "abc123"
  handler                = "index.lambda_handler"
  local_existing_package = "lambda_function.zip"
  runtime                = "python3.12"

  # Encrypt the module-managed log group with a customer-managed key
  # (aws_kms_key.logs is assumed to be defined elsewhere with the
  # required CloudWatch Logs key policy).
  cloudwatch_logs_kms_key_id = aws_kms_key.logs.arn
}

Use AWS provider resources directly. See docs for the resources involved: aws_cloudwatch_log_group.

resource "aws_cloudwatch_log_group" "this" {
  kms_key_id        = "arn:aws:kms:us-east-1:123456789012:key/12345678-1234-1234-1234-123456789012"
  name              = "/pofix/abc123"
  retention_in_days = 365
}

What this control checks

The policy checks every aws_cloudwatch_log_group for a non-null kms_key_id set to a valid KMS key ARN. A log group that omits kms_key_id or sets it to an empty string fails. To pass, declare an aws_kms_key resource (or reference an existing key ARN) and set kms_key_id on the log group:

aws_cloudwatch_log_group with kms_key_id = aws_kms_key.logs.arn passes. The referenced aws_kms_key must have a key policy granting the CloudWatch Logs service principal (logs.<region>.amazonaws.com) permission to use the key. Without that policy statement, key association or subsequent log ingestion can fail.
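To make the pass/fail boundary concrete (resource and group names are illustrative, and aws_kms_key.logs is assumed to be defined elsewhere with the required key policy):

```hcl
# Fails the check: no kms_key_id, so the group falls back to
# service-managed encryption
resource "aws_cloudwatch_log_group" "unencrypted" {
  name              = "/pofix/unencrypted"
  retention_in_days = 365
}

# Passes the check: a customer-managed key ARN is referenced
resource "aws_cloudwatch_log_group" "encrypted" {
  name              = "/pofix/encrypted"
  kms_key_id        = aws_kms_key.logs.arn
  retention_in_days = 365
}
```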

Common pitfalls

  • KMS key policy missing CloudWatch Logs service principal

    Without the right key policy, AssociateKmsKey fails, or subsequent PutLogEvents calls fail with KMS access errors. The policy must explicitly allow logs.<region>.amazonaws.com to perform kms:Encrypt*, kms:Decrypt*, kms:ReEncrypt*, kms:GenerateDataKey*, and kms:Describe*.

  • Cross-region key references fail on apply

    A KMS key ARN in us-east-1 cannot encrypt a log group in us-west-2; CloudWatch Logs requires the key to reside in the same region as the log group. Terraform reports the error at apply time rather than at plan, so multi-region modules that pass a single key ARN everywhere often hit it only during deployment.

  • Importing existing log groups without kms_key_id

    After importing a pre-existing aws_cloudwatch_log_group, kms_key_id will be empty if no CMK was previously associated. Terraform won't propose adding it unless you explicitly set it in configuration. Once you do, terraform plan shows the required update.

  • Lambda and ECS auto-created log groups bypass Terraform

    Lambda (/aws/lambda/<fn>) and ECS create log groups automatically on first invocation, and those groups never get a CMK. The fix is to declare aws_cloudwatch_log_group resources before the service launches. Alternatively, enforce an SCP that denies logs:CreateLogGroup to non-admin roles so auto-creation can't happen.
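The last two pitfalls can be handled in one change: adopt the auto-created group into state with a Terraform 1.5+ import block and set kms_key_id in the same configuration (a sketch; the function name and the aws_kms_key.logs reference are assumptions):

```hcl
# Adopt the log group that Lambda created on first invocation
import {
  to = aws_cloudwatch_log_group.lambda
  id = "/aws/lambda/my-fn"
}

resource "aws_cloudwatch_log_group" "lambda" {
  name              = "/aws/lambda/my-fn"
  kms_key_id        = aws_kms_key.logs.arn
  retention_in_days = 365
}
```

On the next plan, Terraform imports the existing group and proposes the in-place update that associates the key.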

Audit evidence

Auditors expect an AWS Config rule evaluation (such as the managed rule cloudwatch-log-group-encrypted) showing all log groups COMPLIANT, or equivalent output from an open-source scanner like Prowler or Steampipe. Console evidence should show the "Encryption" field on each log group populated with a KMS key ARN rather than "Default (AWS-managed)". CloudTrail events for AssociateKmsKey confirm when encryption was applied, and kms:Decrypt events tied to the logs.<region>.amazonaws.com service principal show the CMK is actively in use.

For periodic review, auditors may also request the KMS key policy document to verify that only authorized principals have decrypt access and that key rotation is enabled.

Framework-specific interpretation

SOC 2: CC6.1 and CC6.6 both touch logical access controls and system operations. CMK-based encryption on log groups, backed by CloudTrail evidence of key usage, supports SOC 2 evaluation of confidentiality and security commitments, alongside broader policy and control requirements.

PCI DSS v4.0: Requirement 3.5 mandates strong cryptography for stored account data, and log groups capturing cardholder data environment traffic or application output fall in scope. A CMK with defined access policies and rotation addresses 3.5.1 and 3.6.1 key management expectations, but full PCI DSS compliance requires additional controls beyond encryption alone.

HIPAA Omnibus Rule 2013: 45 CFR 164.312(a)(2)(iv) treats encryption as an addressable safeguard for ePHI at rest. Log groups ingesting application or audit output from healthcare workloads can contain ePHI. Associating a CMK provides the documented encryption mechanism and key management process that auditors ask to see.

GDPR: Article 32 calls for appropriate technical measures to secure personal data processing. Log groups frequently contain IP addresses, user identifiers, and request payloads that qualify as personal data. Encrypting them with a CMK gives you a documented, auditable encryption layer that supports data protection impact assessments.

NIST SP 800-53 Rev 5: SC-28 (Protection of Information at Rest) and SC-12 (Cryptographic Key Establishment and Management) both apply. A CMK on CloudWatch Log Groups puts key material under organizational control rather than delegating it entirely to AWS-managed key operations.

NIST Cybersecurity Framework v2.0: PR.DS calls for protecting data at rest through encryption. The CMK adds key management governance that also feeds into ID.AM by tying encryption to identifiable, auditable key resources rather than opaque service-managed keys.

FedRAMP Moderate Baseline Rev 4: FedRAMP Moderate inherits SC-28 and requires FIPS 140-2 validated encryption for data at rest. AWS KMS uses FIPS-validated HSMs, so associating a CMK with CloudWatch Log Groups directly addresses this control.

Tool mappings

Use these identifiers to cross-reference this control across tools, reports, and evidence.

  • Compliance.tf Control: log_group_encryption_at_rest_enabled

  • AWS Config Managed Rule: CLOUDWATCH_LOG_GROUP_ENCRYPTED

  • Checkov Check: CKV_AWS_158

  • Powerpipe Control: aws_compliance.control.log_group_encryption_at_rest_enabled

  • Prowler Check: cloudwatch_log_group_kms_encryption_enabled

  • KICS Query: 0afbcfe9-d341-4b92-a64c-7e6de0543879

  • Trivy Check: AWS-0017

Last reviewed: 2026-03-09