Compare commits

...

17 Commits

Author SHA1 Message Date
k8s-infra-cherrypick-robot
fabd4b774d Make etcd node removal idempotent (#12960)
Co-authored-by: Max Gautier <mg@max.gautier.name>
2026-02-05 14:54:31 +05:30
k8s-infra-cherrypick-robot
fee4a0b425 Remove nifcloud terraform provider support (it is no longer available) (#12964)
The nifcloud terraform provider has been deleted, so remove support and
CI.

Co-authored-by: Max Gautier <mg@max.gautier.name>
2026-02-05 12:24:30 +05:30
k8s-infra-cherrypick-robot
683ee4233f wait for control plane node to become ready after joining (#12924)
When joining a control plane node and "upgrading" the cluster setup (for
example, to update etcd addresses after adding a new etcd) in the same
playbook run, the node can take a bit of time to become ready after
joining.
This trips the ControlPlaneNodesReady kubeadm preflight check during
kubeadm upgrade, which runs directly after the join tasks.

Add a configurable wait for the control plane node to become Ready to
fix this race condition.

Co-authored-by: Max Gautier <mg@max.gautier.name>
2026-01-29 14:47:50 +05:30
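As a minimal sketch of how that wait can be tuned from the inventory (the variable name and its default come from the diff further down; the group_vars placement is only illustrative):

```yml
# Illustrative placement: inventory group_vars for the k8s_cluster group.
# Number of retries (at 5-second intervals) to wait for newly joined
# control plane nodes to report the Ready condition; 24 is roughly 2 minutes.
control_plane_node_become_ready_tries: 24
```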
k8s-infra-cherrypick-robot
4ff716dddd etcd-certs: only change necessary permissions (#12914)
We currently **recursively** set the permissions of /etc/ssl/etcd/ssl
(the default path) to 700. But this removes group permissions from the files
under it, and certain components (like calico with the etcd datastore) rely
on them; thus, a cluster upgrade can fail because the
calico-kube-controller can't access the certs, and therefore etcd.

This works in other cases because, as far as I can tell, the apiserver,
which does access etcd, runs as root (the owner of the files, not just
the "group owner").

We also, for some reason, do this twice.

Only create the etcd cert directory with the correct permissions once,
not recursively.

Co-authored-by: Max Gautier <mg@max.gautier.name>
2026-01-27 20:29:51 +05:30
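A minimal sketch of the intended end state (directory-only permissions, no recursion); the variable names mirror the diff further down, so treat this as an illustration rather than the literal task:

```yml
- name: Gen_certs | create etcd cert dir (permissions on the directory only)
  file:
    path: "{{ etcd_cert_dir }}"        # /etc/ssl/etcd/ssl by default
    state: directory
    owner: "{{ etcd_owner }}"
    group: "{{ etcd_cert_group }}"
    mode: "0700"
    # no `recurse: true` anymore, so group permissions on the cert files
    # (needed e.g. by calico-kube-controllers with the etcd datastore) survive
```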
k8s-infra-cherrypick-robot
73fcc6075d Docs: cilium_kube_proxy_replacement change boolean (#12911)
Signed-off-by: ChengHao Yang <17496418+tico88612@users.noreply.github.com>
Co-authored-by: ChengHao Yang <17496418+tico88612@users.noreply.github.com>
2026-01-27 17:03:49 +05:30
Max Gautier
a4e1a2aaaf Patch versions updates (#12895)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2026-01-24 09:41:27 +05:30
Kubernetes Prow Robot
2228e15860 Merge pull request #12882 from VannTen/fix/defaut_lb_address_backport
[release-2.29] Use loadbalancer IP as default apiserver endpoint if no LB hostname is used
2026-01-20 20:42:51 +05:30
k8s-infra-cherrypick-robot
f6d6351fdd cri-o: fix duplicate top-level "auths" keys in registry config template (#12886)
The config.json.j2 template was generating invalid JSON when multiple
crio_registry_auth entries were defined, resulting in multiple top-level
"auths" objects being rendered, e.g.:

{
  "auths": { "registry1": { "auth": "xxxx" } },
  "auths": { "registry2": { "auth": "yyyy" } }
}

This change moves the loop inside the "auths" object so that all registries
are rendered as siblings under a single "auths" key, producing valid JSON:

{
  "auths": {
    "registry1": { "auth": "xxxx" },
    "registry2": { "auth": "yyyy" }
  }
}

Co-authored-by: Martin Cahill <martin.cahill@gmail.com>
2026-01-20 20:16:49 +05:30
Max Gautier
051d03ead7 Fix defaults for apiserver_loadbalancer_domain_name
Since we're no longer injecting a pseudo DNS entry into /etc/hosts,
'lb-apiserver.kubernetes.local' (the previous default) won't resolve to
anything.

Instead, default to the loadbalancer IP if defined, or to the node-local
loadbalancer if it's in use.

Make the necessary adjustments at the use sites to deal with IP addresses as
well as hostnames.
2026-01-20 14:27:16 +01:00
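A minimal sketch of the configuration this default targets (the address is a placeholder; only loadbalancer_apiserver is user-supplied, and the domain name now falls back to it as shown in the diff further down):

```yml
# Illustrative external load balancer definition (placeholder IP).
loadbalancer_apiserver:
  address: 192.0.2.10
  port: 6443
# apiserver_loadbalancer_domain_name no longer defaults to
# 'lb-apiserver.kubernetes.local'; left unset, it now resolves to
# loadbalancer_apiserver.address, or to 'localhost' when the node-local
# loadbalancer is in use.
```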
Max Gautier
afe7d927c9 Do not use apiserver LB in etcd certificates
etcd does not use the apiserver load balancer, so there is no reason to
include its DNS name in the etcd certificates.
2026-01-20 14:23:07 +01:00
k8s-infra-cherrypick-robot
0b199325c8 k8s-certs-renew: fix broken script (#12881)
Improper quoting of a variable assignment makes the shell interpret it as
a command; since the variable is unused anyway, just delete it.

Co-authored-by: Max Gautier <mg@max.gautier.name>
2026-01-20 08:48:48 +05:30
Max Gautier
7303abacb3 Patch versions updates (#12855)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2026-01-13 21:13:41 +05:30
k8s-infra-cherrypick-robot
485031dfe4 Fix ansible-lint config error (#12865)
Co-authored-by: Max Gautier <mg@max.gautier.name>
2026-01-13 20:33:40 +05:30
k8s-infra-cherrypick-robot
5fb85dc8a5 Add rbac for calico kube-controllers to access services (#12831)
Co-authored-by: Lawik974 <loic97429@gmail.com>
2026-01-02 21:00:35 +05:30
Max Gautier
84d8746b41 Patch versions updates (#12800)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2025-12-20 00:26:31 -08:00
k8s-infra-cherrypick-robot
8181d8c688 Upgrade cilium from 1.18.4 to 1.18.5 (#12804)
Signed-off-by: Ali Afsharzadeh <afsharzadeh8@gmail.com>
Co-authored-by: Ali Afsharzadeh <afsharzadeh8@gmail.com>
2025-12-19 07:44:34 -08:00
ChengHao Yang
c4c3205a71 Releng: galaxy version to 2.29.2 (#12786)
Signed-off-by: ChengHao Yang <17496418+tico88612@users.noreply.github.com>
2025-12-11 19:47:30 -08:00
34 changed files with 109 additions and 878 deletions

View File

@@ -1,5 +1,4 @@
---
parseable: true
skip_list:
# see https://docs.ansible.com/ansible-lint/rules/default_rules.html for a list of all default rules

View File

@@ -37,7 +37,6 @@ terraform_validate:
- hetzner
- vsphere
- upcloud
- nifcloud
.terraform_apply:
extends: .terraform_install

View File

@@ -112,14 +112,14 @@ Note:
- Core
- [kubernetes](https://github.com/kubernetes/kubernetes) 1.33.7
- [etcd](https://github.com/etcd-io/etcd) 3.5.25
- [etcd](https://github.com/etcd-io/etcd) 3.5.26
- [docker](https://www.docker.com/) 28.3
- [containerd](https://containerd.io/) 2.1.5
- [cri-o](http://cri-o.io/) 1.33.7 (experimental: see [CRI-O Note](docs/CRI/cri-o.md). Only on fedora, ubuntu and centos based OS)
- [containerd](https://containerd.io/) 2.1.6
- [cri-o](http://cri-o.io/) 1.33.8 (experimental: see [CRI-O Note](docs/CRI/cri-o.md). Only on fedora, ubuntu and centos based OS)
- Network Plugin
- [cni-plugins](https://github.com/containernetworking/plugins) 1.8.0
- [calico](https://github.com/projectcalico/calico) 3.30.5
- [cilium](https://github.com/cilium/cilium) 1.18.4
- [calico](https://github.com/projectcalico/calico) 3.30.6
- [cilium](https://github.com/cilium/cilium) 1.18.5
- [flannel](https://github.com/flannel-io/flannel) 0.27.3
- [kube-ovn](https://github.com/alauda/kube-ovn) 1.12.21
- [kube-router](https://github.com/cloudnativelabs/kube-router) 2.1.1

View File

@@ -1,5 +0,0 @@
*.tfstate*
.terraform.lock.hcl
.terraform
sample-inventory/inventory.ini

View File

@@ -1,138 +0,0 @@
# Kubernetes on NIFCLOUD with Terraform
Provision a Kubernetes cluster on [NIFCLOUD](https://pfs.nifcloud.com/) using Terraform and Kubespray
## Overview
The setup looks like following
```text
Kubernetes cluster
+----------------------------+
+---------------+ | +--------------------+ |
| | | | +--------------------+ |
| API server LB +---------> | | | |
| | | | | Control Plane/etcd | |
+---------------+ | | | node(s) | |
| +-+ | |
| +--------------------+ |
| ^ |
| | |
| v |
| +--------------------+ |
| | +--------------------+ |
| | | | |
| | | Worker | |
| | | node(s) | |
| +-+ | |
| +--------------------+ |
+----------------------------+
```
## Requirements
* Terraform 1.3.7
## Quickstart
### Export Variables
* Your NIFCLOUD credentials:
```bash
export NIFCLOUD_ACCESS_KEY_ID=<YOUR ACCESS KEY>
export NIFCLOUD_SECRET_ACCESS_KEY=<YOUR SECRET ACCESS KEY>
```
* The SSH KEY used to connect to the instance:
* FYI: [Cloud Help(SSH Key)](https://pfs.nifcloud.com/help/ssh.htm)
```bash
export TF_VAR_SSHKEY_NAME=<YOUR SSHKEY NAME>
```
* The IP address to connect to bastion server:
```bash
export TF_VAR_working_instance_ip=$(curl ifconfig.me)
```
### Create The Infrastructure
* Run terraform:
```bash
terraform init
terraform apply -var-file ./sample-inventory/cluster.tfvars
```
### Setup The Kubernetes
* Generate cluster configuration file:
```bash
./generate-inventory.sh > sample-inventory/inventory.ini
```
* Export Variables:
```bash
BASTION_IP=$(terraform output -json | jq -r '.kubernetes_cluster.value.bastion_info | to_entries[].value.public_ip')
API_LB_IP=$(terraform output -json | jq -r '.kubernetes_cluster.value.control_plane_lb')
CP01_IP=$(terraform output -json | jq -r '.kubernetes_cluster.value.control_plane_info | to_entries[0].value.private_ip')
export ANSIBLE_SSH_ARGS="-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o ProxyCommand=\"ssh root@${BASTION_IP} -W %h:%p\""
```
* Set ssh-agent"
```bash
eval `ssh-agent`
ssh-add <THE PATH TO YOUR SSH KEY>
```
* Run cluster.yml playbook:
```bash
cd ./../../../
ansible-playbook -i contrib/terraform/nifcloud/inventory/inventory.ini cluster.yml
```
### Connecting to Kubernetes
* [Install kubectl](https://kubernetes.io/docs/tasks/tools/) on the localhost
* Fetching kubeconfig file:
```bash
mkdir -p ~/.kube
scp -o ProxyCommand="ssh root@${BASTION_IP} -W %h:%p" root@${CP01_IP}:/etc/kubernetes/admin.conf ~/.kube/config
```
* Rewrite /etc/hosts
```bash
sudo echo "${API_LB_IP} lb-apiserver.kubernetes.local" >> /etc/hosts
```
* Run kubectl
```bash
kubectl get node
```
## Variables
* `region`: Region where to run the cluster
* `az`: Availability zone where to run the cluster
* `private_ip_bn`: Private ip address of bastion server
* `private_network_cidr`: Subnet of private network
* `instances_cp`: Machine to provision as Control Plane. Key of this object will be used as part of the machine' name
* `private_ip`: private ip address of machine
* `instances_wk`: Machine to provision as Worker Node. Key of this object will be used as part of the machine' name
* `private_ip`: private ip address of machine
* `instance_key_name`: The key name of the Key Pair to use for the instance
* `instance_type_bn`: The instance type of bastion server
* `instance_type_wk`: The instance type of worker node
* `instance_type_cp`: The instance type of control plane
* `image_name`: OS image used for the instance
* `working_instance_ip`: The IP address to connect to bastion server
* `accounting_type`: Accounting type. (1: monthly, 2: pay per use)

View File

@@ -1,64 +0,0 @@
#!/bin/bash
#
# Generates a inventory file based on the terraform output.
# After provisioning a cluster, simply run this command and supply the terraform state file
# Default state file is terraform.tfstate
#
set -e
TF_OUT=$(terraform output -json)
CONTROL_PLANES=$(jq -r '.kubernetes_cluster.value.control_plane_info | to_entries[]' <(echo "${TF_OUT}"))
WORKERS=$(jq -r '.kubernetes_cluster.value.worker_info | to_entries[]' <(echo "${TF_OUT}"))
mapfile -t CONTROL_PLANE_NAMES < <(jq -r '.key' <(echo "${CONTROL_PLANES}"))
mapfile -t WORKER_NAMES < <(jq -r '.key' <(echo "${WORKERS}"))
API_LB=$(jq -r '.kubernetes_cluster.value.control_plane_lb' <(echo "${TF_OUT}"))
echo "[all]"
# Generate control plane hosts
i=1
for name in "${CONTROL_PLANE_NAMES[@]}"; do
private_ip=$(jq -r '. | select( .key=='"\"${name}\""' ) | .value.private_ip' <(echo "${CONTROL_PLANES}"))
echo "${name} ansible_user=root ansible_host=${private_ip} access_ip=${private_ip} ip=${private_ip} etcd_member_name=etcd${i}"
i=$(( i + 1 ))
done
# Generate worker hosts
for name in "${WORKER_NAMES[@]}"; do
private_ip=$(jq -r '. | select( .key=='"\"${name}\""' ) | .value.private_ip' <(echo "${WORKERS}"))
echo "${name} ansible_user=root ansible_host=${private_ip} access_ip=${private_ip} ip=${private_ip}"
done
API_LB=$(jq -r '.kubernetes_cluster.value.control_plane_lb' <(echo "${TF_OUT}"))
echo ""
echo "[all:vars]"
echo "upstream_dns_servers=['8.8.8.8','8.8.4.4']"
echo "loadbalancer_apiserver={'address':'${API_LB}','port':'6443'}"
echo ""
echo "[kube_control_plane]"
for name in "${CONTROL_PLANE_NAMES[@]}"; do
echo "${name}"
done
echo ""
echo "[etcd]"
for name in "${CONTROL_PLANE_NAMES[@]}"; do
echo "${name}"
done
echo ""
echo "[kube_node]"
for name in "${WORKER_NAMES[@]}"; do
echo "${name}"
done
echo ""
echo "[k8s_cluster:children]"
echo "kube_control_plane"
echo "kube_node"

View File

@@ -1,36 +0,0 @@
provider "nifcloud" {
region = var.region
}
module "kubernetes_cluster" {
source = "./modules/kubernetes-cluster"
availability_zone = var.az
prefix = "dev"
private_network_cidr = var.private_network_cidr
instance_key_name = var.instance_key_name
instances_cp = var.instances_cp
instances_wk = var.instances_wk
image_name = var.image_name
instance_type_bn = var.instance_type_bn
instance_type_cp = var.instance_type_cp
instance_type_wk = var.instance_type_wk
private_ip_bn = var.private_ip_bn
additional_lb_filter = [var.working_instance_ip]
}
resource "nifcloud_security_group_rule" "ssh_from_bastion" {
security_group_names = [
module.kubernetes_cluster.security_group_name.bastion
]
type = "IN"
from_port = 22
to_port = 22
protocol = "TCP"
cidr_ip = var.working_instance_ip
}

View File

@@ -1,301 +0,0 @@
#################################################
##
## Local variables
##
locals {
# e.g. east-11 is 11
az_num = reverse(split("-", var.availability_zone))[0]
# e.g. east-11 is e11
az_short_name = "${substr(reverse(split("-", var.availability_zone))[1], 0, 1)}${local.az_num}"
# Port used by the protocol
port_ssh = 22
port_kubectl = 6443
port_kubelet = 10250
# calico: https://docs.tigera.io/calico/latest/getting-started/kubernetes/requirements#network-requirements
port_bgp = 179
port_vxlan = 4789
port_etcd = 2379
}
#################################################
##
## General
##
# data
data "nifcloud_image" "this" {
image_name = var.image_name
}
# private lan
resource "nifcloud_private_lan" "this" {
private_lan_name = "${var.prefix}lan"
availability_zone = var.availability_zone
cidr_block = var.private_network_cidr
accounting_type = var.accounting_type
}
#################################################
##
## Bastion
##
resource "nifcloud_security_group" "bn" {
group_name = "${var.prefix}bn"
description = "${var.prefix} bastion"
availability_zone = var.availability_zone
}
resource "nifcloud_instance" "bn" {
instance_id = "${local.az_short_name}${var.prefix}bn01"
security_group = nifcloud_security_group.bn.group_name
instance_type = var.instance_type_bn
user_data = templatefile("${path.module}/templates/userdata.tftpl", {
private_ip_address = var.private_ip_bn
ssh_port = local.port_ssh
hostname = "${local.az_short_name}${var.prefix}bn01"
})
availability_zone = var.availability_zone
accounting_type = var.accounting_type
image_id = data.nifcloud_image.this.image_id
key_name = var.instance_key_name
network_interface {
network_id = "net-COMMON_GLOBAL"
}
network_interface {
network_id = nifcloud_private_lan.this.network_id
ip_address = "static"
}
# The image_id changes when the OS image type is demoted from standard to public.
lifecycle {
ignore_changes = [
image_id,
user_data,
]
}
}
#################################################
##
## Control Plane
##
resource "nifcloud_security_group" "cp" {
group_name = "${var.prefix}cp"
description = "${var.prefix} control plane"
availability_zone = var.availability_zone
}
resource "nifcloud_instance" "cp" {
for_each = var.instances_cp
instance_id = "${local.az_short_name}${var.prefix}${each.key}"
security_group = nifcloud_security_group.cp.group_name
instance_type = var.instance_type_cp
user_data = templatefile("${path.module}/templates/userdata.tftpl", {
private_ip_address = each.value.private_ip
ssh_port = local.port_ssh
hostname = "${local.az_short_name}${var.prefix}${each.key}"
})
availability_zone = var.availability_zone
accounting_type = var.accounting_type
image_id = data.nifcloud_image.this.image_id
key_name = var.instance_key_name
network_interface {
network_id = "net-COMMON_GLOBAL"
}
network_interface {
network_id = nifcloud_private_lan.this.network_id
ip_address = "static"
}
# The image_id changes when the OS image type is demoted from standard to public.
lifecycle {
ignore_changes = [
image_id,
user_data,
]
}
}
resource "nifcloud_load_balancer" "this" {
load_balancer_name = "${local.az_short_name}${var.prefix}cp"
accounting_type = var.accounting_type
balancing_type = 1 // Round-Robin
load_balancer_port = local.port_kubectl
instance_port = local.port_kubectl
instances = [for v in nifcloud_instance.cp : v.instance_id]
filter = concat(
[for k, v in nifcloud_instance.cp : v.public_ip],
[for k, v in nifcloud_instance.wk : v.public_ip],
var.additional_lb_filter,
)
filter_type = 1 // Allow
}
#################################################
##
## Worker
##
resource "nifcloud_security_group" "wk" {
group_name = "${var.prefix}wk"
description = "${var.prefix} worker"
availability_zone = var.availability_zone
}
resource "nifcloud_instance" "wk" {
for_each = var.instances_wk
instance_id = "${local.az_short_name}${var.prefix}${each.key}"
security_group = nifcloud_security_group.wk.group_name
instance_type = var.instance_type_wk
user_data = templatefile("${path.module}/templates/userdata.tftpl", {
private_ip_address = each.value.private_ip
ssh_port = local.port_ssh
hostname = "${local.az_short_name}${var.prefix}${each.key}"
})
availability_zone = var.availability_zone
accounting_type = var.accounting_type
image_id = data.nifcloud_image.this.image_id
key_name = var.instance_key_name
network_interface {
network_id = "net-COMMON_GLOBAL"
}
network_interface {
network_id = nifcloud_private_lan.this.network_id
ip_address = "static"
}
# The image_id changes when the OS image type is demoted from standard to public.
lifecycle {
ignore_changes = [
image_id,
user_data,
]
}
}
#################################################
##
## Security Group Rule: Kubernetes
##
# ssh
resource "nifcloud_security_group_rule" "ssh_from_bastion" {
security_group_names = [
nifcloud_security_group.wk.group_name,
nifcloud_security_group.cp.group_name,
]
type = "IN"
from_port = local.port_ssh
to_port = local.port_ssh
protocol = "TCP"
source_security_group_name = nifcloud_security_group.bn.group_name
}
# kubectl
resource "nifcloud_security_group_rule" "kubectl_from_worker" {
security_group_names = [
nifcloud_security_group.cp.group_name,
]
type = "IN"
from_port = local.port_kubectl
to_port = local.port_kubectl
protocol = "TCP"
source_security_group_name = nifcloud_security_group.wk.group_name
}
# kubelet
resource "nifcloud_security_group_rule" "kubelet_from_worker" {
security_group_names = [
nifcloud_security_group.cp.group_name,
]
type = "IN"
from_port = local.port_kubelet
to_port = local.port_kubelet
protocol = "TCP"
source_security_group_name = nifcloud_security_group.wk.group_name
}
resource "nifcloud_security_group_rule" "kubelet_from_control_plane" {
security_group_names = [
nifcloud_security_group.wk.group_name,
]
type = "IN"
from_port = local.port_kubelet
to_port = local.port_kubelet
protocol = "TCP"
source_security_group_name = nifcloud_security_group.cp.group_name
}
#################################################
##
## Security Group Rule: calico
##
# vslan
resource "nifcloud_security_group_rule" "vxlan_from_control_plane" {
security_group_names = [
nifcloud_security_group.wk.group_name,
]
type = "IN"
from_port = local.port_vxlan
to_port = local.port_vxlan
protocol = "UDP"
source_security_group_name = nifcloud_security_group.cp.group_name
}
resource "nifcloud_security_group_rule" "vxlan_from_worker" {
security_group_names = [
nifcloud_security_group.cp.group_name,
]
type = "IN"
from_port = local.port_vxlan
to_port = local.port_vxlan
protocol = "UDP"
source_security_group_name = nifcloud_security_group.wk.group_name
}
# bgp
resource "nifcloud_security_group_rule" "bgp_from_control_plane" {
security_group_names = [
nifcloud_security_group.wk.group_name,
]
type = "IN"
from_port = local.port_bgp
to_port = local.port_bgp
protocol = "TCP"
source_security_group_name = nifcloud_security_group.cp.group_name
}
resource "nifcloud_security_group_rule" "bgp_from_worker" {
security_group_names = [
nifcloud_security_group.cp.group_name,
]
type = "IN"
from_port = local.port_bgp
to_port = local.port_bgp
protocol = "TCP"
source_security_group_name = nifcloud_security_group.wk.group_name
}
# etcd
resource "nifcloud_security_group_rule" "etcd_from_worker" {
security_group_names = [
nifcloud_security_group.cp.group_name,
]
type = "IN"
from_port = local.port_etcd
to_port = local.port_etcd
protocol = "TCP"
source_security_group_name = nifcloud_security_group.wk.group_name
}

View File

@@ -1,48 +0,0 @@
output "control_plane_lb" {
description = "The DNS name of LB for control plane"
value = nifcloud_load_balancer.this.dns_name
}
output "security_group_name" {
description = "The security group used in the cluster"
value = {
bastion = nifcloud_security_group.bn.group_name,
control_plane = nifcloud_security_group.cp.group_name,
worker = nifcloud_security_group.wk.group_name,
}
}
output "private_network_id" {
description = "The private network used in the cluster"
value = nifcloud_private_lan.this.id
}
output "bastion_info" {
description = "The basion information in cluster"
value = { (nifcloud_instance.bn.instance_id) : {
instance_id = nifcloud_instance.bn.instance_id,
unique_id = nifcloud_instance.bn.unique_id,
private_ip = nifcloud_instance.bn.private_ip,
public_ip = nifcloud_instance.bn.public_ip,
} }
}
output "worker_info" {
description = "The worker information in cluster"
value = { for v in nifcloud_instance.wk : v.instance_id => {
instance_id = v.instance_id,
unique_id = v.unique_id,
private_ip = v.private_ip,
public_ip = v.public_ip,
} }
}
output "control_plane_info" {
description = "The control plane information in cluster"
value = { for v in nifcloud_instance.cp : v.instance_id => {
instance_id = v.instance_id,
unique_id = v.unique_id,
private_ip = v.private_ip,
public_ip = v.public_ip,
} }
}

View File

@@ -1,45 +0,0 @@
#!/bin/bash
#################################################
##
## IP Address
##
configure_private_ip_address () {
cat << EOS > /etc/netplan/01-netcfg.yaml
network:
version: 2
renderer: networkd
ethernets:
ens192:
dhcp4: yes
dhcp6: yes
dhcp-identifier: mac
ens224:
dhcp4: no
dhcp6: no
addresses: [${private_ip_address}]
EOS
netplan apply
}
configure_private_ip_address
#################################################
##
## SSH
##
configure_ssh_port () {
sed -i 's/^#*Port [0-9]*/Port ${ssh_port}/' /etc/ssh/sshd_config
}
configure_ssh_port
#################################################
##
## Hostname
##
hostnamectl set-hostname ${hostname}
#################################################
##
## Disable swap files genereated by systemd-gpt-auto-generator
##
systemctl mask "dev-sda3.swap"

View File

@@ -1,9 +0,0 @@
terraform {
required_version = ">=1.3.7"
required_providers {
nifcloud = {
source = "nifcloud/nifcloud"
version = ">= 1.8.0, < 2.0.0"
}
}
}

View File

@@ -1,81 +0,0 @@
variable "availability_zone" {
description = "The availability zone"
type = string
}
variable "prefix" {
description = "The prefix for the entire cluster"
type = string
validation {
condition = length(var.prefix) <= 5
error_message = "Must be a less than 5 character long."
}
}
variable "private_network_cidr" {
description = "The subnet of private network"
type = string
validation {
condition = can(cidrnetmask(var.private_network_cidr))
error_message = "Must be a valid IPv4 CIDR block address."
}
}
variable "private_ip_bn" {
description = "Private IP of bastion server"
type = string
}
variable "instances_cp" {
type = map(object({
private_ip = string
}))
}
variable "instances_wk" {
type = map(object({
private_ip = string
}))
}
variable "instance_key_name" {
description = "The key name of the Key Pair to use for the instance"
type = string
}
variable "instance_type_bn" {
description = "The instance type of bastion server"
type = string
}
variable "instance_type_wk" {
description = "The instance type of worker"
type = string
}
variable "instance_type_cp" {
description = "The instance type of control plane"
type = string
}
variable "image_name" {
description = "The name of image"
type = string
}
variable "additional_lb_filter" {
description = "Additional LB filter"
type = list(string)
}
variable "accounting_type" {
type = string
default = "1"
validation {
condition = anytrue([
var.accounting_type == "1", // Monthly
var.accounting_type == "2", // Pay per use
])
error_message = "Must be a 1 or 2."
}
}

View File

@@ -1,3 +0,0 @@
output "kubernetes_cluster" {
value = module.kubernetes_cluster
}

View File

@@ -1,22 +0,0 @@
region = "jp-west-1"
az = "west-11"
instance_key_name = "deployerkey"
instance_type_bn = "e-medium"
instance_type_cp = "e-medium"
instance_type_wk = "e-medium"
private_network_cidr = "192.168.30.0/24"
instances_cp = {
"cp01" : { private_ip : "192.168.30.11/24" }
"cp02" : { private_ip : "192.168.30.12/24" }
"cp03" : { private_ip : "192.168.30.13/24" }
}
instances_wk = {
"wk01" : { private_ip : "192.168.30.21/24" }
"wk02" : { private_ip : "192.168.30.22/24" }
}
private_ip_bn = "192.168.30.10/24"
image_name = "Ubuntu Server 22.04 LTS"

View File

@@ -1 +0,0 @@
../../../../inventory/sample/group_vars

View File

@@ -1,9 +0,0 @@
terraform {
required_version = ">=1.3.7"
required_providers {
nifcloud = {
source = "nifcloud/nifcloud"
version = "1.8.0"
}
}
}

View File

@@ -1,77 +0,0 @@
variable "region" {
description = "The region"
type = string
}
variable "az" {
description = "The availability zone"
type = string
}
variable "private_ip_bn" {
description = "Private IP of bastion server"
type = string
}
variable "private_network_cidr" {
description = "The subnet of private network"
type = string
validation {
condition = can(cidrnetmask(var.private_network_cidr))
error_message = "Must be a valid IPv4 CIDR block address."
}
}
variable "instances_cp" {
type = map(object({
private_ip = string
}))
}
variable "instances_wk" {
type = map(object({
private_ip = string
}))
}
variable "instance_key_name" {
description = "The key name of the Key Pair to use for the instance"
type = string
}
variable "instance_type_bn" {
description = "The instance type of bastion server"
type = string
}
variable "instance_type_wk" {
description = "The instance type of worker"
type = string
}
variable "instance_type_cp" {
description = "The instance type of control plane"
type = string
}
variable "image_name" {
description = "The name of image"
type = string
}
variable "working_instance_ip" {
description = "The IP address to connect to bastion server."
type = string
}
variable "accounting_type" {
type = string
default = "2"
validation {
condition = anytrue([
var.accounting_type == "1", // Monthly
var.accounting_type == "2", // Pay per use
])
error_message = "Must be a 1 or 2."
}
}

View File

@@ -237,7 +237,7 @@ cilium_operator_extra_volume_mounts:
## Choose Cilium version
```yml
cilium_version: "1.18.4"
cilium_version: "1.18.5"
```
## Add variable to config

View File

@@ -2,7 +2,7 @@
namespace: kubernetes_sigs
description: Deploy a production ready Kubernetes cluster
name: kubespray
version: 2.29.1
version: 2.29.2
readme: README.md
authors:
- The Kubespray maintainers (https://kubernetes.slack.com/channels/kubespray)

View File

@@ -56,8 +56,8 @@ cilium_l2announcements: false
#
# Only effective when monitor aggregation is set to "medium" or higher.
# cilium_monitor_aggregation_flags: "all"
# Kube Proxy Replacement mode (strict/partial)
# cilium_kube_proxy_replacement: partial
# Kube Proxy Replacement mode (true/false)
# cilium_kube_proxy_replacement: false
# If upgrading from Cilium < 1.5, you may want to override some of these options
# to prevent service disruptions. See also:

View File

@@ -1,16 +1,16 @@
{% if crio_registry_auth is defined and crio_registry_auth|length %}
{
{% for reg in crio_registry_auth %}
"auths": {
{% for reg in crio_registry_auth %}
"{{ reg.registry }}": {
"auth": "{{ (reg.username + ':' + reg.password) | string | b64encode }}"
}
{% if not loop.last %}
},
},
{% else %}
}
}
{% endif %}
{% endfor %}
}
}
{% else %}
{}

View File

@@ -5,8 +5,7 @@
group: "{{ etcd_cert_group }}"
state: directory
owner: "{{ etcd_owner }}"
mode: "{{ etcd_cert_dir_mode }}"
recurse: true
mode: "0700"
- name: "Gen_certs | create etcd script dir (on {{ groups['etcd'][0] }})"
file:
@@ -145,15 +144,6 @@
- ('k8s_cluster' in group_names) and
sync_certs | default(false) and inventory_hostname not in groups['etcd']
- name: Gen_certs | check certificate permissions
file:
path: "{{ etcd_cert_dir }}"
group: "{{ etcd_cert_group }}"
state: directory
owner: "{{ etcd_owner }}"
mode: "{{ etcd_cert_dir_mode }}"
recurse: true
# This is a hack around the fact kubeadm expect the same certs path on all kube_control_plane
# TODO: fix certs generation to have the same file everywhere
# OR work with kubeadm on node-specific config

View File

@@ -32,9 +32,6 @@ DNS.{{ counter["dns"] }} = {{ hostvars[host]['etcd_access_address'] }}{{ increme
{# This will always expand to inventory_hostname, which can be a completely arbitrary name, that etcd will not know or care about, hence this line is (probably) redundant. #}
DNS.{{ counter["dns"] }} = {{ host }}{{ increment(counter, 'dns') }}
{% endfor %}
{% if apiserver_loadbalancer_domain_name is defined %}
DNS.{{ counter["dns"] }} = {{ apiserver_loadbalancer_domain_name }}{{ increment(counter, 'dns') }}
{% endif %}
{% for etcd_alt_name in etcd_cert_alt_names %}
DNS.{{ counter["dns"] }} = {{ etcd_alt_name }}{{ increment(counter, 'dns') }}
{% endfor %}

View File

@@ -18,7 +18,6 @@ etcd_backup_retention_count: -1
force_etcd_cert_refresh: true
etcd_config_dir: /etc/ssl/etcd
etcd_cert_dir: "{{ etcd_config_dir }}/ssl"
etcd_cert_dir_mode: "0700"
etcd_cert_group: root
# Note: This does not set up DNS entries. It simply adds the following DNS
# entries to the certificate

View File

@@ -26,6 +26,16 @@ rules:
verbs:
- watch
- list
# Services are monitored for service LoadBalancer IP allocation
- apiGroups: [""]
resources:
- services
- services/status
verbs:
- get
- list
- update
- watch
{% elif calico_datastore == "kdd" %}
# Nodes are watched to monitor for deletions.
- apiGroups: [""]

View File

@@ -2,6 +2,9 @@
# disable upgrade cluster
upgrade_cluster_setup: false
# Number of retries (with 5 seconds interval) to check that new control plane nodes
# are in Ready condition after joining
control_plane_node_become_ready_tries: 24
# By default the external API listens on all interfaces, this can be changed to
# listen on a specific address/interface.
# NOTE: If you specific address/interface and use loadbalancer_apiserver_localhost

View File

@@ -99,3 +99,18 @@
when:
- inventory_hostname != first_kube_control_plane
- kubeadm_already_run is not defined or not kubeadm_already_run.stat.exists
- name: Wait for new control plane nodes to be Ready
when: kubeadm_already_run.stat.exists
run_once: true
command: >
{{ kubectl }} get nodes --selector node-role.kubernetes.io/control-plane
-o jsonpath-as-json="{.items[*].status.conditions[?(@.type == 'Ready')]}"
register: control_plane_node_ready_conditions
retries: "{{ control_plane_node_become_ready_tries }}"
delay: 5
delegate_to: "{{ groups['kube_control_plane'][0] }}"
until: >
control_plane_node_ready_conditions.stdout
| from_json | selectattr('status', '==', 'True')
| length == (groups['kube_control_plane'] | length)

View File

@@ -90,7 +90,7 @@
# Nginx LB(default), If kubeadm_config_api_fqdn is defined, use other LB by kubeadm controlPlaneEndpoint.
- name: Set kubeadm_config_api_fqdn define
set_fact:
kubeadm_config_api_fqdn: "{{ apiserver_loadbalancer_domain_name | default('lb-apiserver.kubernetes.local') }}"
kubeadm_config_api_fqdn: "{{ apiserver_loadbalancer_domain_name }}"
when: loadbalancer_apiserver is defined
- name: Kubeadm | Create kubeadm config

View File

@@ -5,7 +5,6 @@ echo "## Check Expiration before renewal ##"
{{ bin_dir }}/kubeadm certs check-expiration
days_buffer=7 # set a time margin, because we should not renew at the last moment
calendar={{ auto_renew_certificates_systemd_calendar }}
next_time=$(systemctl show k8s-certs-renew.timer -p NextElapseUSecRealtime --value)
if [ "${next_time}" == "" ]; then

View File

@@ -116,7 +116,7 @@ flannel_version: 0.27.3
flannel_cni_version: 1.7.1-flannel1
cni_version: "{{ (cni_binary_checksums['amd64'] | dict2items)[0].key }}"
cilium_version: "1.18.4"
cilium_version: "1.18.5"
cilium_cli_version: "{{ (ciliumcli_binary_checksums['amd64'] | dict2items)[0].key }}"
cilium_enable_hubble: false

View File

@@ -640,10 +640,10 @@ first_kube_control_plane_address: "{{ hostvars[groups['kube_control_plane'][0]][
loadbalancer_apiserver_localhost: "{{ loadbalancer_apiserver is not defined }}"
loadbalancer_apiserver_type: "nginx"
# applied if only external loadbalancer_apiserver is defined, otherwise ignored
apiserver_loadbalancer_domain_name: "lb-apiserver.kubernetes.local"
apiserver_loadbalancer_domain_name: "{{ 'localhost' if loadbalancer_apiserver_localhost else (loadbalancer_apiserver.address | d(undef())) }}"
kube_apiserver_global_endpoint: |-
{% if loadbalancer_apiserver is defined -%}
https://{{ apiserver_loadbalancer_domain_name }}:{{ loadbalancer_apiserver.port | default(kube_apiserver_port) }}
https://{{ apiserver_loadbalancer_domain_name | ansible.utils.ipwrap }}:{{ loadbalancer_apiserver.port | default(kube_apiserver_port) }}
{%- elif loadbalancer_apiserver_localhost and (loadbalancer_apiserver_port is not defined or loadbalancer_apiserver_port == kube_apiserver_port) -%}
https://localhost:{{ kube_apiserver_port }}
{%- else -%}
@@ -651,7 +651,7 @@ kube_apiserver_global_endpoint: |-
{%- endif %}
kube_apiserver_endpoint: |-
{% if loadbalancer_apiserver is defined -%}
https://{{ apiserver_loadbalancer_domain_name }}:{{ loadbalancer_apiserver.port | default(kube_apiserver_port) }}
https://{{ apiserver_loadbalancer_domain_name | ansible.utils.ipwrap }}:{{ loadbalancer_apiserver.port | default(kube_apiserver_port) }}
{%- elif ('kube_control_plane' not in group_names) and loadbalancer_apiserver_localhost -%}
https://localhost:{{ loadbalancer_apiserver_port | default(kube_apiserver_port) }}
{%- elif 'kube_control_plane' in group_names -%}

View File

@@ -17,6 +17,7 @@ crictl_checksums:
1.31.0: sha256:ed545379a61deff415172ea3ca6b847166c5d116c7a1271866286cd0242c09a2
crio_archive_checksums:
arm64:
1.33.8: sha256:59c91726535dcadd0372df0c6aa8595e4d59590994b598b2d97ea2510b216359
1.33.7: sha256:af3ea22d3d6944c9a907c6c13d77e9fc4dbcf3972ffbde18dd6f37f1c2ffbd0d
1.33.6: sha256:6ee49e746d1a5be1a664a6f801c68b169cb181a9aaf12218eed121e2b151bfdb
1.33.5: sha256:ef1b5e2162b0f55722e0966db0cfe387f3ba7cb91d6a803f627121733132792d
@@ -25,6 +26,7 @@ crio_archive_checksums:
1.33.2: sha256:0a161cb1437a50fbdb04bf5ca11dbec8bfc567871d0597a5676737278a945a36
1.33.1: sha256:6bf135db438937f0ab7a533af64564a0fb1d2079a43723ce9255ecbf9556ae05
1.33.0: sha256:8a0dbee2879495d5b33e6fdeac32e5d86c356897bdcf3a94cd602851620ce8b5
1.32.12: sha256:26a5138f4e4f15d370630c3bb8bf04fe28b24c57ce2bb11717a2c9a2e1c54404
1.32.11: sha256:25c6ccfe9b70bf12222577b4cbf286ade9e2d112ab10c7d4507ba12cbcfad5ba
1.32.10: sha256:4e8ceb6f2c936e31a9b892a076deecc52be9feac4acf8af242fb6db817fda9b1
1.32.9: sha256:f854848dc5ae54ea03e48f2bc6d6ffbea2173de45c3d7a2abbc3af3abcb779f9
@@ -52,6 +54,7 @@ crio_archive_checksums:
1.31.1: sha256:760d00cecaf1b6bf5a3bfae39daa5e46a74408f7a6869cbb41716a5610a7a18f
1.31.0: sha256:d54afe0140afde0bed09136bd923d8fb415c9016189e7f1b719565ec84edf737
amd64:
1.33.8: sha256:537adda39074377893f1f650a71b576ba487b3c4d2ee55e9b22f4e95fc188594
1.33.7: sha256:e2999436a272c77370241a4f962c80737698dd8c2400fe75e5c7cf2142c96001
1.33.6: sha256:4d0d446f73d9db6d5bf2c03ecdc39d9d702836886f4715886c15dc2f461cc810
1.33.5: sha256:b8883e51837ee7fd45c88c762f37ca4b96d80ec6a7b46ec989381089e762aa7f
@@ -60,6 +63,7 @@ crio_archive_checksums:
1.33.2: sha256:6e82739bbbeae12d571a277a88d85e8a0e23dbc87529414a91ee5f2e23792dcf
1.33.1: sha256:036063194028d24c75b9ce080e475ad97bacc955de796b7c895845294db8edbf
1.33.0: sha256:dad0cec9e09368b37b35ce824b0ef517a1b33365c4bb164fe82310c73c886f7e
1.32.12: sha256:13cb9676686c0ccd6bd7ffef9125f6370f803f08a559cf31f017193619891960
1.32.11: sha256:98424dbe3eb1377b314bb35b30842987ccc800faa2f8145d52eb2a9c1efa17be
1.32.10: sha256:b8e66bd33c885baf65535e671a120de4d7675833a75489403a9406e5fd2faa5e
1.32.9: sha256:59b861b9c8913328c9bc97b3bcb007951b0c3bf6c9f40fbad236be4b31534503
@@ -87,6 +91,7 @@ crio_archive_checksums:
1.31.1: sha256:ea51b7db06ca97ecf7a76d0341ca168dca102a21fb14f97b1fc139c8e7fb1d47
1.31.0: sha256:3cc88ce3c19b2f9bbdfaa1bd42eea64bd7d5ffac6e714a83abbdea40df9ef8c2
ppc64le:
1.33.8: sha256:1d69c01512e8ebdd51fc70fc64473a31d492e8db095c0ee5d3ee58722048150c
1.33.7: sha256:076e7519bfff72a43fb1121ce836eee3cc1fec5bb5a59a11747c514e9d162d26
1.33.6: sha256:3643eefe295604288f5b652fb9c672a60f96dc803e63edaf9ee64ed4047a50dd
1.33.5: sha256:cf85062f39d755418da0ee4f869c7a4817bf95daee6e35df53010ad29be37c88
@@ -95,6 +100,7 @@ crio_archive_checksums:
1.33.2: sha256:8ed65404a57262a9f8eb75b61afa37fcec134472eb1a6d81f1889a74ff32c651
1.33.1: sha256:12646aca33f65fe335c27d3af582c599584d3f51185f01044e7ddd0668bb2b4c
1.33.0: sha256:b4fa46b25538d8145197f8bf2e935486392c0ca2a9fa609aedd02b9f106d37a6
1.32.12: sha256:9ba4f2c3be48c0f1f3228ef6322aeb3738f3ef461fd483a0cb4c2e5b067f080c
1.32.11: sha256:6c2036f2ed7134c596b5a453a06fbb7e646db9586bff0d993f5223dccf167420
1.32.10: sha256:ae4740c6bb6f346338f94508c74d5b1ec94f2691cb12f9a9add437fee5391f8d
1.32.9: sha256:604bd6f866be327951942656931847c3623cd1e138197f153dd4d5537dd19f11
@@ -131,6 +137,7 @@ kubelet_checksums:
1.33.2: sha256:0fa15aca9b90fe7aef1ed3aad31edd1d9944a8c7aae34162963a6aaaf726e065
1.33.1: sha256:10540261c311ae005b9af514d83c02694e12614406a8524fd2d0bad75296f70d
1.33.0: sha256:ae5a4fc6d733fc28ff198e2d80334e21fcb5c34e76b411c50fff9cb25accf05a
1.32.11: sha256:7d1c3aaae0dffa8d5c90bbaed49f25d32f98332801bde55cfea6efaead639491
1.32.10: sha256:21cc3d98550d3a23052d649e77956f2557e7f6119ff1e27dc82b852d006136cd
1.32.9: sha256:29037381c79152409adacee83448a2bdb67e113f003613663c7589286200ded8
1.32.8: sha256:d5527714fac08eac4c1ddcbd8a3c6db35f3acd335d43360219d733273b672cce
@@ -166,6 +173,7 @@ kubelet_checksums:
1.33.2: sha256:77fa5d29995653fe7e2855759a909caf6869c88092e2f147f0b84cbdba98c8f3
1.33.1: sha256:f7224648451dd4f9f2c4f79416f9874223c286ce41727788965fd0341ddb59c4
1.33.0: sha256:dd416d94850c342226d3dcdce838518b040ccea16548bfeaf2595934af88ef60
1.32.11: sha256:02b25e87a3fe14e9ea74c10d3b1e204d12af30b8ce7ed11af2a985b49ddb0b83
1.32.10: sha256:bfff8f244992162c0491f8f42d807165ed5c685aecfb3e8000412535ad18a873
1.32.9: sha256:fd7711d1f0c1e263e9332004858fc4a6c39462e3e2ee485706eea5297966ed9c
1.32.8: sha256:7dfca4da9cdf592c0f70800e09fb42553765bc0951cade3d6e0c571daf3f23ee
@@ -201,6 +209,7 @@ kubelet_checksums:
1.33.2: sha256:be8412cb9bf30125e3a88ecb9bfca4df1ff5d4e650947c46222683071f1a17d7
1.33.1: sha256:c1bc01115a513eaec76d56dc52a52aeb05f866a6d07c55335c1fff56c868543d
1.33.0: sha256:6fa5abbc14d65b943b00fcfc8a6ac7eb39fd7e924271738c6f17e0b7e74c665b
1.32.11: sha256:17baef329a468f958658f3e4c3f04689dd2506077214e36d4495b8d0c6776da9
1.32.10: sha256:277e68bcf192ea91f3426b8fb540c4951e2e3bffc659a7b39b98c749e828acc7
1.32.9: sha256:81ba713e8b51644336d428dfa5654cc4e2e4a4ea742976b56ddf965a347330e5
1.32.8: sha256:ec5a2e045dc49b7e1d34a0c78fbc645ce568b2275e807b6313da46e584f56f68
@@ -237,6 +246,7 @@ kubectl_checksums:
1.33.2: sha256:f3992382aa0ea21f71a976b6fd6a213781c9b58be60c42013950110cf2184f2a
1.33.1: sha256:6b1cd6e2bf05c6adaa76b952f9c4ea775f5255913974ccdb12145175d4809e93
1.33.0: sha256:bbb4b4906d483f62b0fc3a0aea3ddac942820984679ad11635b81ee881d69ab3
1.32.11: sha256:358dafd910cec676f05e04fbed44ea26ec393cd60b5b885bc60c27e1aaf383c9
1.32.10: sha256:b42bc77586238b43b8c5cdd06086f1ab00190245dd8b66b28822785b177fbde4
1.32.9: sha256:84629d460b60693ca954e148ce522defd34d18bc5c934836cfaf0268930713dd
1.32.8: sha256:ed54b52631fdf5ecc4ddb12c47df481f84b5890683beaeaa55dc84e43d2cd023
@@ -272,6 +282,7 @@ kubectl_checksums:
1.33.2: sha256:54dc02c8365596eaa2b576fae4e3ac521db9130e26912385e1e431d156f8344d
1.33.1: sha256:d595d1a26b7444e0beb122e25750ee4524e74414bbde070b672b423139295ce6
1.33.0: sha256:48541d119455ac5bcc5043275ccda792371e0b112483aa0b29378439cf6322b9
1.32.11: sha256:b1c91c106ec20e61c5dff869e9a39e6af4fb96572bddaac9cce307dfa3ed2348
1.32.10: sha256:1f4229526e16bf9f5b854fbf3bdb9c7040404a29c1d1e4193258b8a73de06e92
1.32.9: sha256:d5f6b45ad81b7d199187a28589e65f83406e0610b036491a9abaa49bfd04a708
1.32.8: sha256:8a7371e54187249389a9aa222b150d61a4a745c121ab24dbcbb56d1ac2d0b912
@@ -307,6 +318,7 @@ kubectl_checksums:
1.33.2: sha256:33d0cdec6967817468f0a4a90f537dfef394dcf815d91966ca651cc118393eea
1.33.1: sha256:5de4e9f2266738fd112b721265a0c1cd7f4e5208b670f811861f699474a100a3
1.33.0: sha256:9efe8d3facb23e1618cba36fb1c4e15ac9dc3ed5a2c2e18109e4a66b2bac12dc
1.32.11: sha256:48581d0e808bd8b7d3c3fc014e86b170e25a987df04c8a879b982b28a5180815
1.32.10: sha256:6e14ef4e509e9f3d1dfc2815643f832f853d2d9f6622d4a0f83f77c7e4014b57
1.32.9: sha256:509ae171bac7ad3b98cc49f5594d6bc84900cf6860f155968d1059fde3be5286
1.32.8: sha256:0fc709a8262be523293a18965771fedfba7466eda7ab4337feaa5c028aa46b1b
@@ -342,6 +354,7 @@ kubectl_checksums:
1.33.2: sha256:d1cdf13cb786c1ee6d5bf6d85034f496aa2fee97b287028043eb14c5dc74993f
1.33.1: sha256:f922dd8f558dc616ebaa34908ceb7964ebb8caadd7c48699d0b791ffff2be1aa
1.33.0: sha256:580d076c891711ec37afaf5994f72a8aad9d45c25413e6e94648e988a5a9933a
1.32.11: sha256:4310edfc10fbc64cc69a25d27a1a8c4e134ad6642f8c83a8b0b612768ac63e84
1.32.10: sha256:544722455bc0a3f57b68e9aafe8bffa0af25d4f0f383848f03ba7aff2cab7e10
1.32.9: sha256:bdc8af9c1aed9737d58442f59034ad0125efe3a2dfad9f6ec14f1264e7020cc3
1.32.8: sha256:52cc07556a8f0076d4e48003aa416b486c729e9679dbe2ea92bbd88e5be5cc93
@@ -378,6 +391,7 @@ kubeadm_checksums:
1.33.2: sha256:21efc1ba54a1cf25ac68208b7dde2e67f6d0331259f432947d83e70b975ad4cc
1.33.1: sha256:5b3e3a1e18d43522fdee0e15be13a42cee316e07ddcf47ef718104836edebb3e
1.33.0: sha256:746c0ee45f4d32ec5046fb10d4354f145ba1ff0c997f9712d46036650ad26340
1.32.11: sha256:0190c49b61b065409b1e99c70e5ec3c52576bf8902432fb2c97bf1d0d2777b69
1.32.10: sha256:a201f246be3d2c35ffa7fc51a1d2596797628f9b1455da52a246b42ce8e1f779
1.32.9: sha256:377349141e865849355140c78063fa2b87443bf1aecb06319be4de4df8dbd918
1.32.8: sha256:8dbd3fa2d94335d763b983caaf2798caae2d4183f6a95ebff28289f2e86edf68
@@ -413,6 +427,7 @@ kubeadm_checksums:
1.33.2: sha256:5c623ec9a9b8584beba510da5c2b775c41cf51c0accdfb43af093bc084563845
1.33.1: sha256:9a481b0a5f1cee1e071bc9a0867ca0aad5524408c2580596c00767ba1a7df0bd
1.33.0: sha256:5a65cfec0648cabec124c41be8c61040baf2ba27a99f047db9ca08cac9344987
1.32.11: sha256:5e191b7329897a16ea87aed75b66f561e7243691620d6b792f34d488285484ce
1.32.10: sha256:1c5033ee113d9072a53ee1ef3a3b18e566721bb3879b49c6813c67066687afbc
1.32.9: sha256:183b3b12e39b3ed2dc2db25cbc17769610cdd5f02e9d1325ba747d54978d8f5f
1.32.8: sha256:da4cc996800db14f82fce8813caa55be318e52ef69d82e50e728ef4cfa18b69f
@@ -448,6 +463,7 @@ kubeadm_checksums:
1.33.2: sha256:1b818900ac7af72a14f50300d6c6ad600eecdc578c37b75fa488cc654ca08c25
1.33.1: sha256:a772834ba22478c9119f03ecca2a27a70234623d74ff1d7671ee85675a4e830b
1.33.0: sha256:26cb7ac57d522a59c84c4784b176097d23c7b4e61874fab84ae719d0e43ac0bc
1.32.11: sha256:c7bb0bbac734290666f6deaba731f4eae46045c94ae53501153e4167dad51d34
1.32.10: sha256:5cfda89b98b6308f4d28e77eabc0111c3eb3c7b64baccf644ecdbcac90b258d0
1.32.9: sha256:fcc5aa3401d130156e0b73dab192631108b77e778f3d87838419993aea1ef8d5
1.32.8: sha256:b5e4f0da030de98f1179a148f6563d69fbfb4c35c2dd1de1d30f000805d12412
@@ -476,6 +492,7 @@ kubeadm_checksums:
1.31.0: sha256:002307ea116a5aa5f78d3d9fb00e9981593711fb79fdfc9be0a9857c370bdcf3
etcd_binary_checksums:
arm64:
3.5.26: sha256:93ac1667df0e178ea6d152476ce4088df4075604fe4bc7f85f4719e863cd030b
3.5.25: sha256:419dce0b679df31cc45201ef2449b7a6a48e9d241af01741957c9ac86a35badc
3.5.24: sha256:efc01f6b3fbef0f000cb53bcad4845c116d7fdd8769ca39d9c40d2fe4d2e509f
3.5.23: sha256:d95118595a9556a29775f99c1ef2ee16f6be113df76b7178643a2bfd6a6f37c3
@@ -497,6 +514,7 @@ etcd_binary_checksums:
3.5.7: sha256:1a35314900da7db006b198dd917e923459b462128101736c63a3cda57ecdbf51
3.5.6: sha256:888e25c9c94702ac1254c7655709b44bb3711ebaabd3cb05439f3dd1f2b51a87
amd64:
3.5.26: sha256:0a682a91201dc8351d507210bc30b021a11e254eab806f03224b51e8fad29abb
3.5.25: sha256:168af82b59772e1811a9af7b358d42f5c6df44e0d9767afb006ecf12c4bbd607
3.5.24: sha256:042497e2ddcee06f22e5d486d81f58affa26b53ee423e2a6aaca3d3ea98c8191
3.5.23: sha256:8b50d62d38cb2de005b42227dd14b33f6e01758970f248f9257789ecfaf634c9
@@ -518,6 +536,7 @@ etcd_binary_checksums:
3.5.7: sha256:a43119af79c592a874e8f59c4f23832297849d0c479338f9df36e196b86bc396
3.5.6: sha256:4db32e3bc06dd0999e2171f76a87c1cffed8369475ec7aa7abee9023635670fb
ppc64le:
3.5.26: sha256:9678ddaced9fcd4878b76b0b76c9c2a3638a70bdc362c9f4cb25ecc48de2c6d3
3.5.25: sha256:0dee64e99a43a06dd9541a40a18b52c7309eb1682a2a32740d4bdf358296c007
3.5.24: sha256:4b252266a59a00c0f608f481c836fb469d2cd0f60ecbc119c4f1fe0611910ab1
3.5.23: sha256:9b9caa29715f387633c6f3639efd3b85284665f7e8632551c904c9859edbb7a8
@@ -581,6 +600,7 @@ cni_binary_checksums:
1.3.0: sha256:8ceff026f4eccf33c261b4153af6911e10784ac169d08c1d86cf6887b9f4e99b
calicoctl_binary_checksums:
arm64:
3.30.6: sha256:47ecc00bdd797f82e4bac0ff3904c3a5143ba2d61e8ae1cbbce286ca76d3790a
3.30.5: sha256:7611343e7a56e770b95e2bb882dda787efbbd4331b1dd6316ff8ea189238dfaa
3.30.4: sha256:b21fbbc55b6f5d50c1c0faae714242cae3e013185cb8e26ce56981bd10da260d
3.30.3: sha256:2ae0474b88a6042e5489d7410d2669a9d443c9d5c51e2bdc8ebe4d6dd98f2475
@@ -602,6 +622,7 @@ calicoctl_binary_checksums:
3.28.1: sha256:c062d13534498a427c793a4a9190be4df3cf796a3feb29e4a501e1d6f48daa7c
3.28.0: sha256:c4ca8563d2a920729116a3a30171c481580c8c447938ce974ce14d7ce25a31bf
amd64:
3.30.6: sha256:2017e19727dca689d8bb73a9d8dff3c6a8ba7d8c75049f99ee207272161b5749
3.30.5: sha256:6cdfb17b0276f648f4fdb051a5d75617a50b3c328d4cccfc40d087b96c361d80
3.30.4: sha256:7e2e5e75b25c55683b68eabeb9b00390b1d359e72bf57f7ec2b76bb006fd175f
3.30.3: sha256:a7d017d1abf6ef5d6e03267187c0dd68c32f5e937b64decd29d003be44fa6b94
@@ -623,6 +644,7 @@ calicoctl_binary_checksums:
3.28.1: sha256:22ec5727c38dbe19001792b4ca64ac760a6e2985d5c1a231d919dbebe5bca171
3.28.0: sha256:4ea270699e67ca29e5533ddb0a68d370cb0005475796c7e841f83047da6297b6
ppc64le:
3.30.6: sha256:9a9c368499b1e3d08418dfbb566379483e15c50d08dd1bcaf6148c115d82ed36
3.30.5: sha256:5b6de49da1af2633549bff5e8f4d8a573a175b65c47c29d327ef6a0760d39a93
3.30.4: sha256:8fc8ef492d463e184e714bc6d31b05f9066c8af3445928efef233850f036bb92
3.30.3: sha256:ccd13ced62baf633fb4347fbe6c9fdc0d3b1b7deb1794c83c015507a0cb8238e
@@ -738,6 +760,7 @@ ciliumcli_binary_checksums:
0.15.15: sha256:492279c1f960c79747290a5d1e1b21084a04a93f9e13ab4ae7df4c76fe808aff
calico_crds_archive_checksums:
no_arch:
3.30.6: sha256:d61aa5bcddfc78b0094acd54e0358009fa79e1cbe6d8c23bdacb34ff7a2c6c82
3.30.5: sha256:3a38f91596c204b43c70f642a3e686d8c3fbfdfa5caa7824b716aa2f4a4e568b
3.30.4: sha256:a9398f6de6cce8f683e0ad649a21f3d3b8bb5fe4cd26e7b26b33b9a8c740274f
3.30.3: sha256:36c50905b9b62a78638bcfb9d1c4faf1efa08e2013265dcd694ec4e370b78dd7
@@ -817,6 +840,8 @@ helm_archive_checksums:
3.16.0: sha256:d13a4b87b31a5b50c8d93dd9988dfb312a61e56504102f466a4004e5a3ab8e9e
cri_dockerd_archive_checksums:
arm64:
0.3.23: sha256:a78037d2d2e9c52c48372a5cbba7b94b1c57be5759449beef29cfe03cbe6f14b
0.3.22: sha256:3260b214c9b12dbf0cbf4d60410c45aacfc31ba52aa7b74164135968e8950cb6
0.3.21: sha256:35de6b1e8eba11d8ba6d71fa7499cb3d610a1e7b866c9d43b7f87029e3a769cd
0.3.20: sha256:e6b4661c51c832ee1cbbb75d1c8b086fa803acc153d400454c3b8cf324547d89
0.3.18: sha256:d16204a4f01685ba67319adb3acc6a6f3e62d8bcfd87bc67f5e08f7332515a9d
@@ -834,6 +859,8 @@ cri_dockerd_archive_checksums:
0.3.6: sha256:793b8f57cecf734c47bface10387a8e90994c570b516cb755900f21ebd0a663b
0.3.5: sha256:c20014dc5a71e6991a3bd7e1667c744e3807b5675b1724b26bb7c70093582cfe
amd64:
0.3.23: sha256:c7fe5db7f9396186193b58ded0e62a31eca7b3c58ad8691d57017986f96482ee
0.3.22: sha256:6621a96a885c82844d12318de00f510eae3459871cf1ad47317f38dd242f9a03
0.3.21: sha256:6c35838bc4b1aef74f9113670e114ca729a5f295f9457b226791e18e86e91698
0.3.20: sha256:2ce46d6bbd7f6a7e06e211836c201fdc2311111913eccc63a03f6ef4fe1958fc
0.3.18: sha256:937578ddcdb28c71afded3fda25d555e0c9e6d396668977ff98228d55886dc79
@@ -1000,6 +1027,10 @@ kata_containers_binary_checksums:
3.2.0: sha256:40627b7ac677ce0f5ffc73b32c1a8bc553e75b746b6cdf8f14642ac27dac3148
gvisor_runsc_binary_checksums:
arm64:
'20260112.0': sha512:3b7925d26d71fdcb8cb552950c88bcfed658c06ad6b1211906bfe86d13bc56d8005ac90a4d9ab4c8b6a48eb62ec51ebcdfd45a64067ac5190274e710961e51ea
'20260105.0': sha512:cc98ad73e8d181f4738c97883180bc76cf8b2eb773c11f3a44f1636d0b0e00f2ee9228e4eecd414f94d6410f4877e6c93260b8070130fba767583026115d1038
'20251215.0': sha512:5e7d6206bce4164c9109d37dfb0b169d1c59cc256910de42799a868c3f9ba5560ef5c05c0de3fad4f0856f906463588ff25c9bce3b25e0d3f20874521dffe767
'20251208.0': sha512:db07dc2def9b1e0b13e17bec5f98e9cd794159955ac999432fad16d1ec747924a05cd5e854b4d45f11147c090208c0ce7d915a0734cf2960047bd4daaba0465b
'20251201.0': sha512:fb527cea4d165478f297a918734f10acabf5230a4a0d29b19709cb6a69a389d32c2a0da328146f72ef0d8776aca35d97647db82ff46be60e85ad02305f631896
'20251118.0': sha512:80a2970cb966d69d59313ec64583174189293db24605f8309a9e8b230e3be6f0e7e387bc11dd5db1896a8c308dd81da8778c0f0418d7ebfdda6b26f03c8d499f
'20251110.0': sha512:297d42a46463d5b68c4786bcd448fe0914d7af91cd62f3c51b494b94e1c91d7eda68c6f21cfb1a66ac4d45aadcf20f7410291c0f3f17b090799f9cfa676cb563
@@ -1061,6 +1092,10 @@ gvisor_runsc_binary_checksums:
'20240109.0': sha256:51a1b299997834b902192806def688b1e23ff6b14f28a9ed3397f3f6572a189a
'20231218.0': sha256:86262a78946deacc309c0f08883659ee3298c288048dc30955945e71993c81a8
amd64:
'20260112.0': sha512:b36de90cdad4cfe0b9b66318407da79c035dd6dcf4c1374250011f34e511c0a29e335fe04eabb0d3fe7140131925f619f724a4702b37c49557bdeb25924b4dc8
'20260105.0': sha512:15c8adabc9f1006d469177b0ec3962d4993e01c85be17d381a4979029eacc7db37ef354e3eafd279573135a1adf81baffc5c19f2bbfac932c79386f6ac74e52f
'20251215.0': sha512:ea82bb66ce61a80adb6edaa61e2f2b1cd6339c504a55dd6663555010ed7f96c6234ac787bd9ecdb29ed4058e806e829fa45f14093466913dafc44d56055a5acb
'20251208.0': sha512:4b9a29a6f887aedbc10de5f5f0900eb64026c3472b5522ee21a6d2b3d30ac3ebc084a78b97e371d3bf830dcba4f61a5809922ea768650d52ef120221b4a9b19f
'20251201.0': sha512:8534bc833d9b1e286b8876abb17dd6fe202c40a75a36dd62b0ce892bf9dceb1773e71447848e7acab120ce99283c22d2f4e4a6171008c9c5f3d5fe6ad6f1cc75
'20251118.0': sha512:cc95eec3e22a574ac533278ee8c72672542edf0ab467a89c13f02abea6404ffe20ea4a538a3482b072a8e45222a13cefbf9e7f44bada35b436769e04b12ab970
'20251110.0': sha512:40d9ec839850cb1994321f0716026b6149adb712bab576b157be2c31b832e68e11475647b2776499fca4c52e96fc7489877fff2fb1985d5f1e128d14f776bd6b
@@ -1123,6 +1158,10 @@ gvisor_runsc_binary_checksums:
'20231218.0': sha256:c353d36a134dfc2fab8509f72a34abf6a761603975eb00a39e4077c41aeaf31b
gvisor_containerd_shim_binary_checksums:
arm64:
'20260112.0': sha512:3215952718bd1636173649c4742e3d8e1978c410abd71bb8252c8ad6d28130cb6d66684aa089f61a0eda0b8786553620a08a9f1b5ab824bb27b1b0cf47bfb25b
'20260105.0': sha512:cfe8a07c304dca21171e5a76614ac3605f5b1ec8f9ed2eeac014a44bc00821864f219db0e25fcc1c56cedbe335bbf34a7fa6bc57335888dcd04278bc0263f5cc
'20251215.0': sha512:2b3a00ec2d646a1c26c1944781b5caf039ce7035dd72281ccff8e244af55606e01667de311febee1a0a03ebd2633af6ebb0ad72d27b8a966743ffe31563b3a5a
'20251208.0': sha512:f3a6d9ff32dae45c62ae831580e5dfbd28fed38f1ca9daf09e6a9960a5373da7e29fbf61e0846676102f053ed38f23a0ad41349f5326fb3a2991b296d33c853a
'20251201.0': sha512:9546236a7ddad9a2ccd51c41f2f309b7f4016fdf489581f77b1b803ed73ca72501af2de3e3d0b58daa633384baa0d46ecd515760165ed51bfb6c0900649c6306
'20251118.0': sha512:0179f0b049c882703758d5cba387e1e4fd0300aef20197e35e2886f480f0668fecb8deb3aa84341d0b874127d88b337fbcf609f563c3310b47520e8144e9d55a
'20251110.0': sha512:8f2b16ad59e9ffdafd1218851cba9d007d4ffb15c5ec2003e0c691eb048935a82f9e8b578c051b05738e3b4e1f141ed893c73415313ec639f348fe989659b893
@@ -1184,6 +1223,10 @@ gvisor_containerd_shim_binary_checksums:
'20240109.0': sha256:40eb0a4f5f0013afb221e228fd6e71887127c4b09c7f2eb36705a0cd5c746d57
'20231218.0': sha256:5f66938de981221359a64f05a5c770b228090db3a2697d91ad622c18dd19f4b2
amd64:
'20260112.0': sha512:89f55750488559796fe51d2c10c289a8b0617fb9f6498714c026825268eeed449941d23e8cd5b285b69c1b032005ddeec278345198301c50d89ff6d3f66871a5
'20260105.0': sha512:7f3f5a864fda5f4e2de9db20dd5edad60b6aa467cc7c22d13f40cdce811783d66018f2c28fb74b907c6d6ac0e39f6d0e1047f1f33447b8a8682f1fbaa25edeb4
'20251215.0': sha512:538a04d88a39de1679afd9868806bd5fdc63737a4871955fc8a8c8e183942c6cc3dbd6b34b2f5589f5f474b4826427f149d5c6abec4ca8d09db363ff5f149b4f
'20251208.0': sha512:8f1e41374785bdfdf69c5798cfbdec53a833ff6724d36dd644a387b2f6e151c513389b2e5b3c0d5347c5c2ff214910db3c8c164b4d5bb2fe963c5a5eb70ca1a4
'20251201.0': sha512:216a937437cb1747d5e84edd9ae7274c5a2c4f712f4601e7e0ca06e0a688bebfac267707028b78845276302023d305ec9a93f5b200c9c3c3cdf86a2f41817703
'20251118.0': sha512:b0f0fa1ee431c63cfbb9007a62c49a374bcbfbbbf5997e63c827d1673f6933d65044ca4f06608bb494f870ced97295dc065810f5e905dfd4a632fe4d61faff7f
'20251110.0': sha512:56a27dab74191db97f888c936b53861248851a2579d838073f528db7cb9353da5a919a27a38a48447b0a81bd42ab92873c480be769a9818a464ba9cf27872581
@@ -1331,6 +1374,7 @@ nerdctl_archive_checksums:
1.7.0: sha256:e421ae655ff68461bad04b4a1a0ffe40c6f0fcfb0847d5730d66cd95a7fd10cd
containerd_archive_checksums:
arm64:
2.1.6: sha256:88d6e32348c36628c8500a630c6dd4b3cb8c680b1d18dc8d1d19041f67757c6e
2.1.5: sha256:fe81122c0cc8222470fa3be51f42fa918ac29ffd956ccd2fc408c1997babd2ca
2.1.4: sha256:846d13bc2bf1c01ae2f20d13beb9b3a1e50b52c86e955b4ac7d658f5847f2b0e
2.1.3: sha256:7e423abc7bf52ff6cb724f44995cca335b40331efa727415a5efc99ca34ac8d5
@@ -1345,6 +1389,7 @@ containerd_archive_checksums:
2.0.2: sha256:14a2a9f7f75f73e5bcfb8b183d0b84830c54b98ef8c5f6ed70e51f1a230c673e
2.0.1: sha256:b07120ae227b52edfdb54131d44b13b987b39e8c1f740b0c969b7701e0fad4fa
2.0.0: sha256:2a00b1553f38aa9e716d61316b661961c2fbfbb7aad7bd73b377be5725ecc0f1
1.7.30: sha256:a09c3b01b3b6935e839c8a9588b5528c57ebfca4747d816654a7d1e7575c0a63
1.7.29: sha256:176d523a6d6dc5520e0c35b8eb0de6e54bd4d00486d5fbbeb4163f30e2962f17
1.7.28: sha256:97457594ff8549cb82d664306593cafd3d2c781c706f9fffed885a46d8919bec
1.7.27: sha256:3f03ea60c7dacddf890be3ab18f7ef859d9d104b19627f52038d7984361912bc
@@ -1401,6 +1446,7 @@ containerd_archive_checksums:
1.6.15: sha256:d63e4d27c51e33cd10f8b5621c559f09ece8a65fec66d80551b36cac9e61a07d
1.6.14: sha256:3ccb61218e60cbba0e1bbe1e5e2bf809ac1ead8eafbbff36c3195d3edd0e4809
amd64:
2.1.6: sha256:4793dc5c1f34ebf8402990d0050f3c294aa3c794cd5a4baa403c1cf10602326d
2.1.5: sha256:403af72d9f956ed8a5ad5b0ac0f1e8e371a1488f2b9edf9b4ba13db0653936ea
2.1.4: sha256:316d510a0428276d931023f72c09fdff1a6ba81d6cc36f31805fea6a3c88f515
2.1.3: sha256:436cc160c33b37ec25b89fb5c72fc879ab2b3416df5d7af240c3e9c2f4065d3c
@@ -1415,6 +1461,7 @@ containerd_archive_checksums:
2.0.2: sha256:9bd5b6a1bdf505d520d9a329c520258ed0a17faa9fe3db12712ee858ad59aae3
2.0.1: sha256:85061a5ce1b306292d5a64f85d5cd3aff93d0982737a1069d370dd6cb7bbfd09
2.0.0: sha256:6f8da716941f7e89315cefaa6e5a8f1ff10b323ff46611313c455df7ab1ebee1
1.7.30: sha256:ca0f27e34411504acd1cd24fcbaa71b9c47d31ce9408c47c54a2dc1810ceb1df
1.7.29: sha256:71d9f6e4ea4a9e108e2172b0e7f6fa137e086808db3e6874dbdb91c01102d3d4
1.7.28: sha256:7a8c262deb63becc877e82d23749e4f99f4a17e8e660f9b8c257ca87a5c056b6
1.7.27: sha256:5b038fb22ab5dbb1ce57dd3d8f102460cd8619ff2afc78870837b06e8c4e840a
@@ -1471,6 +1518,7 @@ containerd_archive_checksums:
1.6.15: sha256:191bb4f6e4afc237efc5c85b5866b6fdfed731bde12cceaa6017a9c7f8aeda02
1.6.14: sha256:7da626d46c4edcae1eefe6d48dc6521db3e594a402715afcddc6ac9e67e1bfcd
ppc64le:
2.1.6: sha256:aef2b639a14ae79f2bbe43356b25e84ecfb2c7f269c87f41e41585e724073e54
2.1.5: sha256:dc95edc01958d18f8475ab4d415e8c92cb3bad580167db8b0054374fd9004f78
2.1.4: sha256:d519e40e266f39cdd68f2c31e2e4e9b70eda09b96f3c3de343a7a3e11d49ad4c
2.1.3: sha256:e517a6d936ffb6d2292e9c6560aa363382b1457eba34cad8289f6f3f76201588
@@ -1485,6 +1533,7 @@ containerd_archive_checksums:
2.0.2: sha256:1b19d31bb8a7f9d26d9b50675e78f397d0b01fa635c33cca456f91c412fa6df1
2.0.1: sha256:09a25357343c7336fe519e5fd1a9dd0f22da869e9deda50c2bc61b6e8c9384be
2.0.0: sha256:2e7f4b15ac85c22c1ced102bbb424124078248f0af3183425ff335a998079809
1.7.30: sha256:2edba94f31ed0c32cde0cca2e79e4426224eece4ef4fa1f194b2df4c811417fe
1.7.29: sha256:8c08edb22b53a44cf7b5a5e31ca8fb86d1d6df378724048ae0d9890a3b66f081
1.7.28: sha256:e8f64abf81503aeee0db0d5682197e9ce377ffeb858313c5bc9fc3d7faa4b85f
1.7.27: sha256:ccdfa16e4bba3a993d74fac794d22ddadc1013d351cd099ea933827050ef05a0
@@ -1542,6 +1591,7 @@ containerd_archive_checksums:
1.6.14: sha256:73025da0666079fc3bbd48cf185da320955d323c7dc42d8a4ade0e7926d62bb0
containerd_static_archive_checksums:
arm64:
2.1.6: sha256:9da292010d36d80afa3bb48cbd15f65d3bf38177217060272a1c3fd65351cfa4
2.1.5: sha256:d1a1e64c4334e17d6f9f40093d5ff9f810b95ac34c7dcba55e7d2226d2a8ab79
2.1.4: sha256:c5f0957064e6ed5a67905ea3f8e451dadd16530334b86baaad678dd357205c30
2.1.3: sha256:74703e628223c6f19ab2df8497a061d08dc1b81c03c720cbc3d66fedaddf9ca5
@@ -1556,6 +1606,7 @@ containerd_static_archive_checksums:
2.0.2: sha256:da5631d38702257674da542969c285adfdbdccffc12c5ea39de4db6113de51bc
2.0.1: sha256:6022bd160b6d83f13fccd87b1c3854b0294a940335fdf014c10c80d71b279115
2.0.0: sha256:428cd0b08eab57003db8a98742d8404b4b69dcd335c5f0f66ceec5fe3b9b31b6
1.7.30: sha256:1458b3c4a20eb1305bb1d4680c660d0726fa7709acdd95ec7c0fd6bdd632c36a
1.7.29: sha256:d428ad3fb7036c07c141f6666de575d8c7fd862e658a28da5ae20b737fdf7d75
1.7.28: sha256:360307a347f1114d6f67fe261674033661c56e6deb3c2529705fc834aba98ff1
1.7.27: sha256:ce2b308e81dc1e633362a45b64fbefc58a7c521e7e060cf95a7709bc1704b402
@@ -1587,6 +1638,7 @@ containerd_static_archive_checksums:
1.7.1: sha256:f0435e7cda3c3abc40d3f27d403a8e24bd0b927a8a893a7e4dfaec5996fa9731
1.7.0: sha256:6e648cd832f026e23eb6998191e618da7c1ec0c0373263d503ff464e0ae3977a
amd64:
2.1.6: sha256:577900a5a8684c27e344aeeb1fc64e355745f58cba7f83c53649235ba25abbbf
2.1.5: sha256:9389a4bff0112258bd953c66382709c2a4a11e25e376c8c4c7075b39de156882
2.1.4: sha256:50e53500800f4f74d0d8b2e57e939297eab68b0fe11a0957b771d5faa61fae5b
2.1.3: sha256:1bb0c910e8fdf623fac2305ec66e72c4afbf612de282577dfdbebf08360937d5
@@ -1601,6 +1653,7 @@ containerd_static_archive_checksums:
2.0.2: sha256:7cd4de9c8ad37f3248a45b9d6347b7628e4d77d1c8e35c1f80343450fa47dc00
2.0.1: sha256:22b2a7df86fe3e53e219af22a2b5e81d1b67e67d55ec3a18f89990b161fb2157
2.0.0: sha256:e72cc69db9984a8d46a34495c302d2b50188ee2dd5c7000a7b471d0350e14ad1
1.7.30: sha256:290eea517a7aa919aa319563f8270d284ab5197736362285da09b198b76a34b5
1.7.29: sha256:ba4a2e919d2882dba707d8c9c436ed3496e3bb99f5cfb13a31b29b497cf7ec2e
1.7.28: sha256:659eceb8a5831a704089825d980f63abfb686650646751e13f9f9fac780ea08e
1.7.27: sha256:e3ea27eb0e7b8dd92ba7a5ecdd363372ad3c30f9cdecbec75e60ae4a43ca93b9
@@ -1632,6 +1685,7 @@ containerd_static_archive_checksums:
1.7.1: sha256:8b4e8ed8a650ea435aa71e115fa1a70701ab98bc1836b3ed33341af35bf85a3a
1.7.0: sha256:64ad6428cc4aca486db3a6148682052955d1e3134b69f079edf686c21d123fcd
ppc64le:
2.1.6: sha256:c64312b87181d900452b5c3360a90578acd39ec7664d0c2e060183b24a708766
2.1.5: sha256:4404b918b9e101274baa072188054766a1af16be8d22f02a51a5f6ee4e5d159f
2.1.4: sha256:9bc1ac45ba197873a4d47045313e0cc55910802937739bf57aded125abe55c8c
2.1.3: sha256:0084e26bcf5a2653278766662d5adc27cb00a17a21413cc3fbe1e99d9dacf174
@@ -1646,6 +1700,7 @@ containerd_static_archive_checksums:
2.0.2: sha256:4085c490ad5afb40b34782d427130396513a7eb35a49da8a6b7ade946dd309ce
2.0.1: sha256:6e8608e1993099f4ad44d77765be76fd67147108a6247f6886af33be08287a84
2.0.0: sha256:c389b68c9ca7774efdbc9e479d9a3be14b71d60a2a8c23f1d8764ec9598f3d55
1.7.30: sha256:95f12a656977f516b257f6447c197a30bd75b8b9d494a741a7f1f486a069de8c
1.7.29: sha256:5f4a351d4ecf0aedda743b2ea4b7304a72cf534f4f83c976de3a0d5adf006445
1.7.28: sha256:94dcce54f82bcd34a171af082cd035163cbb281923afc09454e0539eeef73974
1.7.27: sha256:6a140413dd38954078dddb1b1405e403a92173fe15d094fe0bcbac93a1ff7039

View File

@@ -4,7 +4,7 @@
# noqa: jinja[spacing]
no_proxy_prepare: >-
{%- if loadbalancer_apiserver is defined -%}
{{ apiserver_loadbalancer_domain_name | default('') }},
{{ apiserver_loadbalancer_domain_name }},
{{ loadbalancer_apiserver.address | default('') }},
{%- endif -%}
{%- if no_proxy_exclude_workers | default(false) -%}
@@ -32,7 +32,7 @@
- name: Populates no_proxy to all hosts
set_fact:
no_proxy: "{{ hostvars.localhost.no_proxy_prepare }}"
no_proxy: "{{ hostvars.localhost.no_proxy_prepare | select }}"
# noqa: jinja[spacing]
proxy_env: "{{ proxy_env | combine({
'no_proxy': hostvars.localhost.no_proxy_prepare,

View File

@@ -21,6 +21,10 @@
- "{{ bin_dir }}/etcdctl"
- member
- remove
- "{{ '%x' | format(((etcd_members.stdout | from_json).members | selectattr('peerURLs.0', '==', etcd_peer_url))[0].ID) }}"
- "{{ '%x' | format(etcd_removed_nodes[0].ID) }}"
vars:
etcd_removed_nodes: "{{ (etcd_members.stdout | from_json).members | selectattr('peerURLs.0', '==', etcd_peer_url) }}"
# This should always have at most one member, since the etcd_peer_url should be unique in the etcd cluster
when: etcd_removed_nodes != []
register: etcd_removal_output
changed_when: "'Removed member' in etcd_removal_output.stdout"