
Create SOPS file with TF generated secrets #50

Open
ztmr opened this issue Aug 28, 2020 · 11 comments
Labels
enhancement New feature or request

Comments

@ztmr

ztmr commented Aug 28, 2020

Is there any way to produce a SOPS encrypted file from Terraform?

I can certainly do this by using the local_file resource and calling SOPS just after Terraform has dumped the plaintext. It would be much more convenient to do it directly, given we already have this SOPS provider.

@carlpett
Owner

Hey @ztmr,
Not at the moment, but it'd definitely be in scope to add as a resource if there is a good use case.
One snag is that the SOPS API doesn't yet expose the encryption parts, so we might need to work with upstream to make it possible.

@carlpett carlpett added the enhancement New feature or request label Aug 28, 2020
@Fuuzetsu

Fuuzetsu commented Oct 13, 2020

A sample use case is to generate some data via Terraform and write it to the filesystem encrypted. It can then be consumed by something like Helm with the secrets plugin, which decrypts it at Kubernetes deploy time.

@ztmr
Author

ztmr commented Oct 13, 2020

I don't like calling OS commands from Terraform, but this is a simplified version of what we have done as a workaround for now:

resource "local_file" "environment-infra-yaml" {
  sensitive_content = yamlencode({ some = 1, data = 2 })
  filename          = "environment-infra.yaml"
  file_permission   = "0640"
}

resource "null_resource" "sops-encrypt-infra-secrets" {
  depends_on = [local_file.environment-infra-yaml]

  triggers = {
    # This is to handle updates
    environment_infra_yaml_updated = local_file.environment-infra-yaml.id
  }

  provisioner "local-exec" {
    command = "sops -e -i environment-infra.yaml"
  }
}

Definitely better than hooking the app deployment into Terraform directly -- it keeps infra and app provisioning separated and allows for some debugging too.

@schollii

schollii commented Mar 26, 2021

@ztmr the problem with local-exec is that the file gets regenerated and re-encrypted on every run:

- with the above implementation, the file generated by local_file.environment-infra-yaml gets overwritten by local-exec, so the next terraform apply thinks it has to regenerate it, which in turn causes the file to be re-encrypted.
- if you use sops --output to send the encrypted output to another file, that is somewhat dangerous: the unencrypted file remains on the system after terraform apply completes, so it could be pushed to a git repo by mistake. If you add an rm command to local-exec to remove the unencrypted file, you're back to the previous item, i.e. the next terraform apply sees local_file.environment-infra-yaml missing, regenerates it, and again re-encrypts the same file.

If the provider just had a resource type for a SOPS file, with a parameter for the content to be encrypted, it could decrypt the existing file in memory, compare it with the content parameter, and only regenerate when necessary.

I tried triggering on the content of the file, and this works but has its own problems: if you remove the encrypted file but the content has not changed, the file does not get regenerated.
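A hypothetical interface for such a resource might look like the sketch below. To be clear, no `sops_file` resource exists in this provider today; the resource type and every argument name here are made up purely for illustration:

```hcl
# Hypothetical sketch only -- this resource type does not exist in the
# provider; names and arguments are invented for illustration.
resource "sops_file" "infra" {
  filename = "environment-infra.yaml"
  content  = yamlencode({ some = 1, data = 2 })

  # encryption settings the provider would pass through to SOPS
  age = var.age_public_key

  # On refresh, the provider would decrypt the existing file in memory,
  # compare it to `content`, and only re-encrypt when they differ --
  # avoiding both the rewrite-every-apply and the plaintext-left-on-disk
  # problems described above.
}
```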

@ztmr
Author

ztmr commented Mar 26, 2021

@schollii you are absolutely right!

Our use case, however, is to run this in a GitOps-style pipeline, so there is no human interaction and there are no pre-existing files. That's why we chose not to use --output (the risk of the plaintext accidentally being included in the artifacts) and why we're OK with TF creating and encrypting the files in a pseudo-atomic way (in-place, within one single TF operation done from a single pipeline job).

I am still hoping to see a native solution without any intermediate files at some point! 😉


@politician

An issue was opened there

@iverberk

iverberk commented Aug 14, 2022

I'd like to add a TF snippet that we are going to use; hopefully it's helpful to anyone who wants to encrypt files with SOPS via Terraform. As mentioned earlier, the SOPS Go package does not expose encryption functionality, so it cannot be embedded in this provider; calling the sops binary directly is what remains.

locals {
  config = templatefile("template/config.yaml", { non_secure = "foo", secure = "bar" })
}

resource "null_resource" "sops" {

  provisioner "local-exec" {
    command = "echo \"$CONTENT\" | sops --encrypt --encrypted-regex \"$ENCRYPTED_REGEX\" --age \"$AGE_KEY\" --input-type yaml --output-type yaml /dev/stdin > \"$OUTPUT_FILE\""

    environment = {
      CONTENT = local.config
      AGE_KEY = var.age_key
      ENCRYPTED_REGEX = "^secure$"
      OUTPUT_FILE = "secure-config.yaml"
    }
  }

  triggers = {
    content = sha1(local.config)
  }
}

This code fills a template and passes the content via standard input to SOPS. It triggers on the hash of the content to decide whether the file needs to be regenerated; clean and simple.
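The hash-trigger idea can also be sketched outside Terraform. This is an illustrative shell script, not part of the snippet above: the `.sha1` marker file is my own convention, and the actual sops call is stubbed out with a plain file write so the gating logic can be seen on its own:

```shell
#!/bin/sh
# Illustrative sketch of hash-gated regeneration: only redo the
# expensive encrypt step when the plaintext content has changed.
set -eu

CONTENT="some: 1"
OUTPUT_FILE="secure-config.yaml"
HASH_FILE="${OUTPUT_FILE}.sha1"   # marker file; my own convention

new_hash=$(printf '%s' "$CONTENT" | sha1sum | awk '{print $1}')
old_hash=$(cat "$HASH_FILE" 2>/dev/null || true)

if [ "$new_hash" != "$old_hash" ] || [ ! -f "$OUTPUT_FILE" ]; then
  # Real version would be something like:
  #   printf '%s' "$CONTENT" | sops --encrypt ... /dev/stdin > "$OUTPUT_FILE"
  printf '%s\n' "$CONTENT" > "$OUTPUT_FILE"
  printf '%s' "$new_hash" > "$HASH_FILE"
  echo "regenerated"
else
  echo "unchanged"
fi
```

Running it twice in a row regenerates the file only the first time; the second run sees a matching hash and an existing output file, and skips the encrypt step.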

@JDavis10213

https://github.com/lokkersp/terraform-provider-sops, which appears to be a fork of this repo, actually adds writing of the file. Could we maybe merge these two workstreams so we can have the full functionality?

@jodem

jodem commented Sep 20, 2023

@iverberk the problem is that local-exec runs at apply time, not at plan time. Let's say I want to use the resulting encrypted file to put the value on S3 (e.g. if I use Terraform to populate a Flux S3 source bucket). Then local-exec cannot be used as an output, and the file only gets updated after the apply (I would have to apply twice to pick up the change, which is not consistent).

It would be really great if the plugin could generate an in-memory SOPS file at plan time.

@jodem

jodem commented Sep 20, 2023

For posterity, I have a solution that works with an "external" data source instead of a "null_resource", which allows calling SOPS at plan time.

// template all files locally
module "template_infrastructure_files" {
  source = "hashicorp/dir/template"
  version = "1.0.2"

  base_dir = "${path.module}/s3-structure/infrastructure"
  template_vars = {
    # Pass in any values that you wish to use in your templates.
    traefik_user_secret_token = var.traefik_dashboard_secret
  }
}

// filter only the files whose name includes "-secret", to know which ones we need to call SOPS on
locals {
  // files in the infra directory whose name does not contain "-secret"
  infrastructure_file_not_encrypted = { for k, v in module.template_infrastructure_files.files : k => v if length(regexall("-secret", k)) <= 0 }
  // files whose name does contain "-secret" and therefore need encryption
  infrastructure_file_to_encrypt = { for k, v in module.template_infrastructure_files.files : k => v if length(regexall("-secret", k)) > 0 }
}

// execute local bash to call SOPS
data "external" "encrypt_infra_secrets" {
  for_each = local.infrastructure_file_to_encrypt

  program = ["bash", "${path.module}/encrypt_with_sops.sh"]

  query = {
    content         = each.value.content # means each value must come from a template; otherwise we could also pass the source path and update the script accordingly
    encrypted_regex = "^(data|stringData)$"
    bucket_path     = each.key
    kms_arn         = var.sops_kms_arn
  }
}

// Example usage: upload the encrypted value to the S3 bucket

resource "aws_s3_object" "infrastructure_bucket_encrypted" {
  for_each = data.external.encrypt_infra_secrets

  bucket      = var.flux_bucket_name
  key         = "infrastructure/${each.value.result.bucket_path}"
  content     = each.value.result.encrypted_content
  source_hash = each.value.result["md5_hash"] # not mandatory
}

The bash script:

#!/bin/bash

# Requires jq
# Requires md5sum (on macOS: brew install md5sha1sum); not mandatory


# Read inputs from stdin
eval "$(jq -r '@sh "CONTENT=\(.content) ENCRYPTED_REGEX=\(.encrypted_regex) BUCKET_PATH=\(.bucket_path) SOPS_KMS_ARN=\(.kms_arn)"')"

export SOPS_KMS_ARN=$SOPS_KMS_ARN

# Encrypt the content
ENCRYPTED_CONTENT=$(echo "$CONTENT" | sops -e --encrypted-regex "$ENCRYPTED_REGEX" --input-type yaml --output-type yaml /dev/stdin)

MD5_HASH=$(echo -n "$CONTENT" | md5sum | awk '{ print $1 }')

# Return the encrypted content and bucket path as JSON
jq -n --arg md5_hash "$MD5_HASH"  --arg encrypted_content "$ENCRYPTED_CONTENT" --arg bucket_path "$BUCKET_PATH" '{"encrypted_content":$encrypted_content, "bucket_path":$bucket_path, "md5_hash":$md5_hash}'

The only drawback is that since SOPS always generates different output for the same input, subsequent executions produce a different encrypted file, and this triggers a re-upload to S3.
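This is also why the script hashes the plaintext rather than the ciphertext for source_hash: the plaintext hash is stable across runs even though the encrypted output is not. A small sketch of that property, with the nondeterministic encryption simulated by appending a random nonce (sops itself is not invoked here):

```shell
#!/bin/sh
# Sketch: "encrypting" the same content twice yields different output
# (a random nonce stands in for sops' nondeterministic encryption),
# while the md5 of the plaintext stays stable -- so the plaintext hash
# works as a change signal where the ciphertext does not.
set -eu

CONTENT="data: secret"

encrypt() {  # stand-in for: sops -e ... /dev/stdin
  printf '%s nonce=%s\n' "$1" "$(od -An -N4 -tx4 /dev/urandom | tr -d ' ')"
}

enc1=$(encrypt "$CONTENT")
enc2=$(encrypt "$CONTENT")

hash1=$(printf '%s' "$CONTENT" | md5sum | awk '{print $1}')
hash2=$(printf '%s' "$CONTENT" | md5sum | awk '{print $1}')

echo "ciphertexts differ:    $([ "$enc1" != "$enc2" ] && echo yes || echo no)"
echo "plaintext hash stable: $([ "$hash1" = "$hash2" ] && echo yes || echo no)"
```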


8 participants