How to Fix Terraform Helm and Kubernetes Providers after implementing AAD RBAC in AKS

Today I implemented Azure Active Directory (AAD) integration for Role-Based Access Control (RBAC) in an existing Azure Kubernetes Service (AKS) cluster. I use Terraform to manage the cluster as well as all of the Helm installs and custom resource definitions that I apply via kubectl.

The Problem

Before implementing AAD, I was just using the built-in RBAC of Kubernetes, and everything was working great. After I implemented AAD, Terraform could no longer run terraform plan. The error messages I received were Error: Unauthorized and Error: Kubernetes cluster unreachable: the server has asked for the client to provide credentials. The cause was the Helm and Kubernetes providers: I was passing them credentials that no longer had the privileges those two providers needed to do their jobs. The AKS resource documentation from Terraform says this:

"It's possible to use these credentials with the Kubernetes Provider like so:"

provider "kubernetes" {
  load_config_file       = "false"
  host                   = azurerm_kubernetes_cluster.main.kube_config.0.host
  username               = azurerm_kubernetes_cluster.main.kube_config.0.username
  password               = azurerm_kubernetes_cluster.main.kube_config.0.password
  client_certificate     = "${base64decode(azurerm_kubernetes_cluster.main.kube_config.0.client_certificate)}"
  client_key             = "${base64decode(azurerm_kubernetes_cluster.main.kube_config.0.client_key)}"
  cluster_ca_certificate = "${base64decode(azurerm_kubernetes_cluster.main.kube_config.0.cluster_ca_certificate)}"
}

This can be translated for the Helm provider as well.
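For example, here's a sketch of the equivalent Helm provider configuration, assuming the same azurerm_kubernetes_cluster.main resource and a Helm provider version that accepts these connection arguments inside a nested kubernetes block:

provider "helm" {
  # Same cluster credentials as the Kubernetes provider,
  # but nested under a kubernetes block.
  kubernetes {
    host                   = azurerm_kubernetes_cluster.main.kube_config.0.host
    username               = azurerm_kubernetes_cluster.main.kube_config.0.username
    password               = azurerm_kubernetes_cluster.main.kube_config.0.password
    client_certificate     = base64decode(azurerm_kubernetes_cluster.main.kube_config.0.client_certificate)
    client_key             = base64decode(azurerm_kubernetes_cluster.main.kube_config.0.client_key)
    cluster_ca_certificate = base64decode(azurerm_kubernetes_cluster.main.kube_config.0.cluster_ca_certificate)
  }
}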

Once you implement AAD RBAC, those kube_config credentials stop working and it all goes 🔥. But there's a very simple fix.

The Fix

The Terraform documentation doesn't explain or highlight this at all, but the fix is incredibly simple: every place in the provider where you reference kube_config, use kube_admin_config instead. So for the username in the block above, you would use azurerm_kubernetes_cluster.main.kube_admin_config.0.username.
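For reference, here's a sketch of the Kubernetes provider block from above after the change, with kube_admin_config swapped in everywhere (same assumptions as before):

provider "kubernetes" {
  load_config_file       = "false"
  # kube_admin_config returns the cluster-admin credentials, which are not
  # subject to AAD, so Terraform can keep talking to the cluster after the
  # AAD integration is enabled.
  host                   = azurerm_kubernetes_cluster.main.kube_admin_config.0.host
  username               = azurerm_kubernetes_cluster.main.kube_admin_config.0.username
  password               = azurerm_kubernetes_cluster.main.kube_admin_config.0.password
  client_certificate     = base64decode(azurerm_kubernetes_cluster.main.kube_admin_config.0.client_certificate)
  client_key             = base64decode(azurerm_kubernetes_cluster.main.kube_admin_config.0.client_key)
  cluster_ca_certificate = base64decode(azurerm_kubernetes_cluster.main.kube_admin_config.0.cluster_ca_certificate)
}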

Replace all instances of kube_config with kube_admin_config and you're done. 🐿

Happy Kuberneting!
