There are a few ways to do what you want, but for Terraform specifically the best approach is to put your repeatable, shareable code in a module. This means you'd have two repositories: one that is the skeleton for your infrastructure, and one that holds all of the values you need. Let's say you have a module stored on GitHub at github.com/test/module. When you write the main.tf for your private repo, you would call it like this:
provider "aws" {
region = var.region
}
module "infra-is-awesome" {
source = "github.com/test/module"
var1 = "10.0.25.0/24"
var2 = "Server01"
}
Then when you run terraform init, Terraform will pull in the module, and the arguments you set will be mapped to the module's variables.
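For that mapping to work, the module repo has to declare those variables. Here's a minimal sketch of what github.com/test/module might contain; var1 and var2 are just the placeholder names from the example above, and the descriptions are guesses based on the example values, so adjust them to whatever your module actually does:

# variables.tf inside the module repo
variable "var1" {
  description = "CIDR block passed in from the calling repo (example value: 10.0.25.0/24)"
  type        = string
}

variable "var2" {
  description = "Name passed in from the calling repo (example value: Server01)"
  type        = string
}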
Now, what I would personally recommend is using environment variables for your credentials and anything else you want to pass in from outside your code. For AWS, Terraform accepts AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. I'd recommend putting these in a dotfile in your home directory (or wherever you like), for example ~/.terraform:
#!/usr/bin/env bash
export AWS_ACCESS_KEY_ID="my access key id"
export AWS_SECRET_ACCESS_KEY="my super secret key"
Then, before you run Terraform, all you have to do is source ~/.terraform. The variables will last for as long as you have that shell open; if you close the terminal and open it again, you just need to re-run that command. You could add it to your ~/.bash_profile or equivalent, but you may not want those credentials exported all of the time.
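So a typical session might look like this, assuming the dotfile above and a working directory containing your main.tf:

source ~/.terraform
terraform init
terraform plan
terraform apply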
Terraform also lets you set your regular variables through environment variables. I don't usually do this, but you can do something like export TF_VAR_region=us-east-1, and that value maps to var.region so you don't need to type it in.
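For that to work, var.region has to be declared in your root module, since the provider block above references it. A minimal sketch, e.g. in a variables.tf next to your main.tf (the default is just an assumption to match the example):

variable "region" {
  description = "AWS region to deploy into"
  type        = string
  default     = "us-east-1"
}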
My advice is to leverage modules as much as possible, keep your private data in a separate repo, and pass that data in as variables to your module(s).