Solved Managing Public Github project with private files
-
I would like to make one of my GitHub projects public, but I want to hide a few files. I know that is easy enough to do with
.gitignore
. In this case, I would like the configuration files shown on the project to have default values, so anyone can clone it. However, I want my own versions to be customized. Is the solution to create a default file, sync it, and then add the real file to
.gitignore
, or do I need to do something else?
I guess another question: is there a way to have my cake and eat it, too? I would like to be able to sync these config files for myself somehow. Can I maybe make a private branch and somehow push all changes except those files to my public branch?
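The "default file" approach from the question can be sketched like this (config and config.example are made-up filenames, not from the thread): commit a template with safe defaults, and git-ignore the real file.

```shell
# Sketch of the default-config pattern; "config.example" and "config"
# are hypothetical filenames used only for illustration.
cat > config.example <<'EOF'
api_key = "changeme"
EOF
# the real config never gets committed:
printf 'config\n' >> .gitignore
# anyone who clones copies the template and customizes it locally:
cp config.example config
```

The template stays in sync through normal commits, while each clone keeps its own private copy of config.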
-
Too early for me to confirm if this is what you're asking:
https://24ways.org/2013/keeping-parts-of-your-codebase-private-on-github/
Here's how to do it in real time:
http://showterm.io/04130676d3401229e7df6
-
@black3dynamite said in Managing Public Github project with private files:
Too early for me to confirm if this is what you're asking:
https://24ways.org/2013/keeping-parts-of-your-codebase-private-on-github/
Here's how to do it in real time:
http://showterm.io/04130676d3401229e7df6
Yep, that is exactly what I was looking to do.
-
I've been getting around this by passing everything as parameters to the scripts, where the parameters are defined by variables as part of the pipeline and not from anything in Git.
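A minimal sketch of that pattern (the script and variable names are invented for illustration): the pipeline passes secrets in as arguments, so nothing sensitive is stored in Git.

```shell
#!/usr/bin/env bash
# Hypothetical deploy step: credentials arrive as parameters from the
# pipeline's protected variables, not from anything in the repository.
deploy() {
  local user="$1"
  local password="$2"
  if [ -z "$user" ] || [ -z "$password" ]; then
    echo "usage: deploy <user> <password>" >&2
    return 1
  fi
  echo "deploying as $user"
}

# in CI these arguments would come from pipeline variables:
deploy "ci-bot" "s3cret"
```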
-
Lots of other ways as well: a Lambda triggered by webhooks, or even CI/CD as part of an event-watcher reaction, etc. Basically, anything that can store variables you set in advance, or anything that can provide them on demand (like a webhook), depending on what you want to send.
-
@Obsolesce said in Managing Public Github project with private files:
I've been getting around this by passing everything as parameters to the scripts, where the parameters are defined by variables as part of the pipeline and not from anything in Git.
Exactly.
My local config file is the dummy one, because I just run unit tests with mocks locally. The pipeline holds the real values in private variables, creates the config file, and runs the code for integration tests and production.
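That pipeline step can be sketched like this, assuming the secret is exposed as an environment variable (API_KEY and the config filename are invented names):

```shell
# Sketch: the pipeline injects API_KEY (a hypothetical private pipeline
# variable) and this step renders the real config just before the
# integration tests run. Locally, the fallback keeps it a dummy value.
API_KEY="${API_KEY:-dummy-value-for-local-runs}"
cat > config <<EOF
api_key = "$API_KEY"
EOF
```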
-
If you wanted a true 12-factor-native approach, you would use environment variables. That way you can easily change deployments without changing the code base at all.
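In practice that just means the code reads its settings from the environment, optionally with a safe default (DB_HOST here is an invented setting name):

```shell
# 12-factor style: configuration comes from the environment, not the
# repo. DB_HOST is hypothetical; the :- expansion supplies a safe
# default when the variable is unset or empty.
db_host="${DB_HOST:-localhost}"
echo "connecting to $db_host"
```

Changing a deployment is then just a matter of exporting a different value before launch.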
-
So for example, I have a dot file in my home directory I use to set env variables for my Terraform deployments. It sets the particulars for the provider auth and the backend auth. That way I can just do
source ~/.terraform
and I have all of that set. I don't keep the password in there. Instead I type:
 export ENV_VAR="this is the password"
and I include the space in front, so the line isn't recorded in my shell history. Then when I deploy with our CI/CD process, I can inject the environment variables directly into the pipeline.
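One caveat about the leading-space trick: it only works when bash's HISTCONTROL variable is set appropriately, which is a common distro default but not guaranteed. A quick sketch:

```shell
# A leading space keeps a command out of history only when HISTCONTROL
# contains "ignorespace" (or "ignoreboth", which also drops duplicates).
# Setting it explicitly in your shell config makes the trick reliable.
HISTCONTROL=ignoreboth
case "$HISTCONTROL" in
  *ignorespace*|*ignoreboth*) echo "leading-space lines are skipped" ;;
  *) echo "leading-space lines WILL be recorded" ;;
esac
```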
-
@stacksofplates said in Managing Public Github project with private files:
when I deploy with our CI/CD process I can inject the environment variables directly into the pipeline.
Yeah, this is one of the things I'm doing. Other stuff is saved in a credential vault and called from there via the scripts.
-
I am going to need some hand-holding on this one :face_with_tears_of_joy: :face_with_tears_of_joy:
-
@stacksofplates said in Managing Public Github project with private files:
So for example, I have a dot file in my home directory I use to set env variables for my Terraform deployments. It sets the particulars for the provider auth and the backend auth. That way I can just do
source ~/.terraform
and I have all of that set. I don't keep the password in there. Instead I type:
 export ENV_VAR="this is the password"
and I include the space in front, so the line isn't recorded in my shell history. Then when I deploy with our CI/CD process, I can inject the environment variables directly into the pipeline.
Would you mind taking a look at my GitHub project?
-
You can do both too. Here's something I use a good bit for credentials:
user := os.Getenv("THE_USER")
password := os.Getenv("THE_PASSWORD")
if user == "" || password == "" {
    user = config.User
    password = config.Password
}
That checks for both the user and password at the same time and if either is empty it uses the config. You can check both separately also.
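The same env-var-with-fallback check is easy in shell, too; a sketch using the same hypothetical names:

```shell
# Environment variable with a fallback to the committed default config.
# THE_USER matches the Go example above; "default-user" stands in for
# the value that would come from the checked-in config file.
user="${THE_USER:-default-user}"   # env var wins when set and non-empty
echo "user: $user"
```

The :- form falls back when the variable is unset or empty, matching the empty-string check in the Go version.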
-
@IRJ said in Managing Public Github project with private files:
@stacksofplates said in Managing Public Github project with private files:
So for example, I have a dot file in my home directory I use to set env variables for my Terraform deployments. It sets the particulars for the provider auth and the backend auth. That way I can just do
source ~/.terraform
and I have all of that set. I don't keep the password in there. Instead I type:
 export ENV_VAR="this is the password"
and I include the space in front, so the line isn't recorded in my shell history. Then when I deploy with our CI/CD process, I can inject the environment variables directly into the pipeline.
Would you mind taking a look at my GitHub project?
yeah I can. I'll look when I get back from lunch.
-
@Obsolesce said in Managing Public Github project with private files:
@stacksofplates said in Managing Public Github project with private files:
when I deploy with our CI/CD process I can inject the environment variables directly into the pipeline.
Yeah, this is one of the things I'm doing. Other stuff is saved in a credential vault and called from there via the scripts.
Yeah, I use Jenkins credentials a good bit also. It depends on the project how I deploy that.
-
@stacksofplates said in Managing Public Github project with private files:
If you wanted a true 12-factor-native approach, you would use environment variables. That way you can easily change deployments without changing the code base at all.
Good point.
I'm used to having to create scripts to inject environment variables into config files for my Docker containers, because I'm often working with technologies you have to rig up to work well with containers (looking at you, IIS).
-
So there are a few ways to do what you want. But for Terraform specifically, the best thing to do is put your repeatable, public code in a module. This means you'd have two repositories: one that is the skeleton for your infrastructure, and one that holds all of the values you need. Let's say you have a module stored on GitHub at
github.com/test/module
. When you write the main.tf for your private repo, you would call it like this:
provider "aws" {
  region = var.region
}

module "infra-is-awesome" {
  source = "github.com/test/module"
  var1   = "10.0.25.0/24"
  var2   = "Server01"
}
Then when you do
terraform init
it will pull in your module and map the variables for you.
Now, what I would personally recommend is using environment variables for your credentials and anything else you don't want to expose. So for AWS, Terraform accepts
AWS_ACCESS_KEY_ID
and
AWS_SECRET_ACCESS_KEY
. I'd recommend putting these in a dot file in your home directory (or somewhere), like ~/.terraform:
#!/usr/bin/env bash
export AWS_ACCESS_KEY_ID="my access key id"
export AWS_SECRET_ACCESS_KEY="my super secret key"
Then when you want to run Terraform, all you have to do first is
source ~/.terraform
. This will last for as long as you have that shell open. If you close the terminal and open it again, you just need to re-run that command. You could add it to your
~/.bash_profile
or whatever, but you may not want it exported all of the time.
Terraform also lets you use environment variables for your regular variables. I don't usually do this, but you can do something like
export TF_VAR_region=us-east-1
. That would map to
var.region
instead of needing to type it in.
My advice is to leverage modules as much as possible, keep your private data in a separate repo, and just pass that data in as variables to your module(s).
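To tie the environment-variable pieces together, the TF_VAR_ convention can be exercised like this (region is the variable name from the example above; no terraform binary is actually run here, this only shows the environment side):

```shell
# Terraform reads TF_VAR_<name> from the environment for any declared
# input variable, so exporting TF_VAR_region makes var.region available
# without an interactive prompt or a .tfvars file.
export TF_VAR_region="us-east-1"
echo "$TF_VAR_region"
# a subsequent `terraform plan` in this shell would see var.region set
```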